Microcap & Penny Stocks : International Automated Systems

To: Lyle Abramowitz who wrote (325), 9/6/1996 1:16:00 PM
From: Larry Holmes   of 7618
 
Lyle, thanks for letting me use your Shannon explanation. I look forward to seeing your refined version!

It also occurred to me that perhaps a "qualitative" example would be helpful, to show why signal to noise ratio is so important. I don't claim to know all about these things, though I use them all the time. I just think it is helpful for even the non-technical investor to know enough about a subject to formulate an opinion of his/her own.

Claude Shannon developed a precise mathematical formula to calculate the maximum amount of "information" which may be sent over a particular communications channel WITHOUT ERROR (assuming one can build a sophisticated enough modulator). His formula allows one to calculate what has become known as "The Shannon Limit" for any communications channel. A modem which tries to exceed this limit in practice may get some data through, but it will suffer a high error rate which must be corrected by the addition of extra data. That extra data tends to reduce the performance of the modem, often to less than the actual throughput of a modem operating below the Shannon Limit without the extra error correction.
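
Shannon's formula is now usually written C = B * log2(1 + S/N), and it can be sketched in a few lines of Python. The bandwidth and signal-to-noise figures below are illustrative assumptions for a POTS-style line, not measurements:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)       # convert dB to a plain power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative POTS-style figures (assumptions, not measurements):
# about 3,400 Hz of usable bandwidth and a 35 dB signal-to-noise ratio.
capacity = shannon_capacity_bps(3400, 35)
print(f"Capacity: {capacity:,.0f} bits per second")   # roughly 39,500 bps
```

With those assumed numbers the limit comes out in the tens of thousands of bits per second, which is why modems running in the low 30,000s do not violate it.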

It is interesting to note that a fellow named Hartley, who has a number of other formulas and designs named after him, discussed what we now call the "Shannon Limit" extensively in the late twenties and early thirties, while Shannon's work was not published until 1948. There is a trend in science these days to give credit to all who contributed to the formation of theories and formulas, so the "Shannon Limit" and "Shannon Channel Capacity Theorem" may also be referred to as "Shannon-Hartley", but they are the same thing.

The formula was not motivated by the need to set an UPPER limit when Shannon originally developed it. Engineers at the time were debating whether or not one could send more than a few dozen bits per second over a severely band-limited and power-limited channel like POTS. (POTS = Plain Old Telephone System). Shannon developed his Channel Capacity Theorem to convince everyone that there was a much greater UPPER limit to POTS and similar systems than was commonly believed, and this contributed to the continual improvement of modem technology. Engineers and Scientists were able to justify the massive R&D required to develop today's complex technologies because they knew that their work was within the "Limit" of the channel, and did not have to risk spending years of work to discover that they were doomed from the start. Such products need lots of funding, and funding is impossible to obtain unless one can show a reasonable chance of success. Shannon was not trying to convince everyone to limit their work, but rather, to encourage them to try harder! And they did.

Today, because we have the sophisticated electronic devices to build modulators which approach theoretical limits, we see the Shannon Limit in a different light, but we still use it to determine whether a particular method is feasible IN ITS FINAL FORM, without the tedious and expensive process of fully developing a technology in order to discover whether it will work or not. Obviously, one must accept Shannon's work in order to make such a determination. Many have tried to disprove Shannon-Hartley; to date, there are no "disproofs" which can stand up to scrutiny.

One common error in the application of Shannon is to assume that all calculations are done in "binary bits" per second. In reality, the "bit" which Shannon refers to is not necessarily a binary bit, but rather, an "information bit", or a symbol, as we now call it. Modern methods can place several bits in one symbol. Thus, even though the strict application of Shannon's formula results in a channel capacity for POTS of only a few thousand "bits" per second, these are not actually data bits, but groups of data bits, or symbols. Bits and symbols become the same only when "binary signalling" (using only ones and zeros) is used as the modulation method, and this is terribly inefficient for narrowband channels like POTS.
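
As a rough sketch of the bits-versus-symbols distinction, the loop below shows how a fixed symbol rate yields very different bit rates depending on how many binary bits each symbol carries. The 3,000 symbols/second figure and the constellation sizes are illustrative assumptions, not tied to any one modem standard:

```python
import math

# With M-point signalling, each symbol carries log2(M) binary bits, so a
# modest symbol rate can still move many bits per second.
symbol_rate = 3000            # symbols per second (an assumed POTS-like figure)
for points in (2, 4, 16, 64, 256):
    bits_per_symbol = math.log2(points)
    print(f"{points:3d}-point signalling: "
          f"{bits_per_symbol:.0f} bits/symbol -> "
          f"{symbol_rate * bits_per_symbol:,.0f} bps")
```

The first line of output is the "binary signalling" case the paragraph mentions, where bits and symbols coincide; the later lines show why it is so inefficient by comparison on a narrowband channel.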

By using the proper units for the values in Shannon's formula, one may directly calculate channel capacity in binary bits per second. Just as with any other engineering calculation, one should always carry the units for every value through the equation; if you have everything right, the units will work out to be the ones you are after. For example, if one calculates the channel capacity for POTS using incompatible units, one comes up with a channel capacity of about 3,000 or 6,000 bps (depending on how the error is made). This may lead one to believe that present V.34 modems, operating at up to 33,600 bps, "beat" the Shannon Limit for POTS, but they don't. The lower numbers are not bits per second, but symbols per second. V.34 modems use several bits per symbol, at different symbol rates, which are chosen for optimum performance when the modem tests (or "probes") the phone line when it first connects. The upper limit for the number of symbols per second for V.34 modems is about 3000. I have often seen Shannon's formula used to argue that since the "limit" for POTS comes out to be about 3000 "bits" per second, and V.34 modems already exceed that, another technology ought to be able to do even better. The error, again, is that the numbers used in the calculation are in units which yield "symbols per second" and not "bits per second".
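
One way to see the unit mix-up is to put the two figures side by side: the classical Nyquist figure of 2B bounds symbols per second over a bandwidth of B, while Shannon-Hartley bounds bits per second. The bandwidth and S/N values below are assumed round numbers, not measured line parameters:

```python
import math

# Sketch of the mix-up: 2B is a SYMBOL-rate bound, Shannon's C is a BIT-rate
# bound.  Assumed figures: 3,000 Hz of bandwidth, 35 dB signal-to-noise ratio.
bandwidth_hz = 3000
snr_db = 35

nyquist_symbols = 2 * bandwidth_hz                            # symbols/second
shannon_bits = bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))  # bits/second

print(f"Symbol-rate bound : {nyquist_symbols:,} symbols/s")
print(f"Shannon bit limit : {shannon_bits:,.0f} bits/s")
# A 33,600 bps modem can stay under the bit limit even though 33,600
# is far above the 6,000 symbols/s figure.
```

Mistaking the 6,000 symbols/s number for "6,000 bits per second" is exactly the error described above.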

Most engineering books have charts or tables which allow you to quickly evaluate a channel by applying known parameters. These charts have been calculated in ways which will directly yield bps, for convenience. The ones I use are actually in "bits/second/Hz", and also assume that you are mindful of power limitations (which show up as S/N ratio). Thus, if you have a system which is capable of "x" bits/second/Hz, and you know you have "H" of useable bandwidth, the bits per second rate of this system becomes simply "Hx" (H times x). Even if the S/N ratio degrades for some reason, you will still send Hx bits per second, but you will experience a high rate of error.
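
As a sketch of reading such a chart, suppose it gives you "x" bits/second/Hz for your S/N ratio; the particular values of H and x below are made up for illustration:

```python
# Chart-style calculation: a spectral efficiency x (bits/second/Hz), read off
# for a given S/N ratio, times H hertz of usable bandwidth, gives bits/second.
# Both numbers here are illustrative assumptions.
x_bits_per_sec_per_hz = 10.0   # "x" from the chart, set by the S/N ratio
usable_bandwidth_hz = 3000     # "H"

bit_rate = usable_bandwidth_hz * x_bits_per_sec_per_hz
print(f"{bit_rate:,.0f} bits per second")   # 30,000 bps
```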

It is possible for a system to send data at rates much higher than the Shannon Limit if you ignore errors. This is sometimes done during initial development of a new design, to refine the coarser steps without considering the finer details, so one needs to know the theoretical limit of ANY design, to know whether one is going to be able to later refine the process to eliminate the errors. If not, even though the process may ideally send data at, say, 64000 bps, the data is so full of errors that it becomes useless.

There are rather simple explanations for the limits which Shannon quantified, which I will state in another message, so this one doesn't become impossibly long.

It has been a learning experience to try to put this into words. Funny how I can use something for years with an "intuitive" understanding of it, then, when I try to explain the how and why of it, I am surprised at how difficult that can be!

Larry