Technology Stocks : Intel Corporation (INTC)


To: Barry Grossman who wrote (73109) 2/7/1999 4:36:00 PM
From: Tony Viola
 
Barry, Re: "Since I'm sure that most here are not engineers, and I know that I don't know what you are really talking about here, would you help us out with a further explanation of what you are saying here?"

OK, will try to keep the "techie factor" reasonable. Microprocessors, their attendant chipsets, and motherboards work in the digital world, vs. analog. That means that all bits of information (megabits, megabytes, gigabytes, buses, control signals, etc.) are represented by ones and zeroes. Generally, up (higher voltage) is a one, and down (lower voltage) is a zero. The ones and zeroes tell what is happening in the computer at any instant. That is, your OS, applications, data, everything, is represented by these bits. There are hundreds of millions, or billions, of these bits in your computer doing this the whole time you are using it.

This works fine, textbook-style, at slow speeds. However, getting up into the hundreds of megahertz, the digital starts turning into analog, because of the physics of electronics (nuf said on that). Suffice it to say that, instead of going straight up or down as they change from ones to zeroes, or vice versa, and looking like the "square waves" they are supposed to be, the signals start to degrade. They get jagged and spiky looking, and may not even make it all the way up or down to the one or zero level before they have to start changing again. They simply have to switch too fast. They look more "analog" than digital.
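To put a rough number on the "edges stop settling" point, here's a little Python sketch (my own toy model, not anything from Intel or anyone else) that drives an ideal square wave through a first-order RC low-pass standing in for the parasitics of a real trace. The component values are made up for illustration; the point is just that the same trace that passes clean ones and zeroes at 10 MHz can't swing anywhere near the rails at a few hundred MHz.

import numpy as np

# Toy model: an ideal 0/1 square wave driven through a first-order RC low-pass,
# standing in for the parasitic resistance/capacitance of a real signal trace.
# tau_s (the RC time constant) is a made-up but plausible-looking value.
def filtered_square_wave(clock_hz, tau_s=2e-9, cycles=4, steps_per_cycle=2000):
    period = 1.0 / clock_hz
    dt = period / steps_per_cycle
    t = np.arange(cycles * steps_per_cycle) * dt
    drive = ((t % period) < period / 2).astype(float)    # the ideal square wave
    out = np.zeros_like(drive)
    for i in range(1, len(t)):                            # discrete RC response
        out[i] = out[i - 1] + (drive[i] - out[i - 1]) * (dt / tau_s)
    return out[-steps_per_cycle:]                         # keep the last, settled cycle

for clock in (10e6, 250e6):                               # 10 MHz vs. 250 MHz
    out = filtered_square_wave(clock)
    swing = out.max() - out.min()
    print(f"{clock / 1e6:6.0f} MHz clock: output swings {swing:.2f} of the rail")

With a 2 ns time constant the 10 MHz signal swings essentially rail to rail, while the 250 MHz signal only makes it about halfway before it has to turn around again, which is exactly the "never quite gets to the one or zero level" problem.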

When digital turns into analog, the information, the ones and zeroes, becomes difficult or even impossible to recognize as ones and zeroes by the parts of the computer that need to do the recognizing. And that's everywhere in the computer: for every signal generated (a one or a zero) there is a "receiver" that has to recognize it as a one or a zero.
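For what it's worth, the "recognizing" part is basically a pair of voltage thresholds at every input. A quick sketch, using the classic 5 V TTL input thresholds of 0.8 V and 2.0 V purely as an example (real chipsets have their own numbers):

# Every receiving pin decides 0 or 1 by comparing the voltage it samples
# against two thresholds; anything in between is undefined, and that's where
# a degraded edge gets misread. 0.8 V / 2.0 V are the classic 5 V TTL input
# thresholds, used here only as an example.
V_IL = 0.8   # at or below this: a valid logic 0
V_IH = 2.0   # at or above this: a valid logic 1

def receive(voltage):
    if voltage <= V_IL:
        return 0
    if voltage >= V_IH:
        return 1
    return None  # forbidden zone: the receiver may latch either value

# A solid low, a solid high, and an edge that never made it all the way up:
for v in (0.2, 4.5, 1.4):
    print(f"{v:4.1f} V reads as {receive(v)}")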

So, designing at these very high frequencies (VHF or UHF) while keeping things looking digital requires a science, or design methodology, known as transmission line theory (plus other EE design knowledge). Transmission line theory comes from radio, TV, radar and other, older applications that had to deal with these kinds of frequencies before computers did. This design methodology is relatively unknown in the computer biz, because computers were supposed to be purely digital. IBM and a few other computer companies understand and use it. I have seen papers from Intel that give me confidence that they do also. The old DEC, with Alpha, I'm sure uses transmission line theory. Other microprocessor houses I wouldn't bet on.
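For the curious, the two numbers transmission line theory starts from are the characteristic impedance of the trace (for a lossless line, Z0 = sqrt(L/C)) and the reflection coefficient at its far end, (Zload - Z0)/(Zload + Z0). Here's a sketch with made-up but plausible per-length values for a PCB trace; when the load doesn't match Z0, part of every edge bounces back down the trace, and that's a lot of the ringing and spikes I described above.

import math

def characteristic_impedance(L_per_m, C_per_m):
    # Lossless-line approximation: Z0 = sqrt(L / C)
    return math.sqrt(L_per_m / C_per_m)

def reflection_coefficient(Z_load, Z0):
    # Fraction of an incident wave that reflects back from the load
    return (Z_load - Z0) / (Z_load + Z0)

# Made-up but plausible per-meter inductance/capacitance for a PCB trace,
# chosen to land near the common 50-ohm target.
Z0 = characteristic_impedance(L_per_m=3.5e-7, C_per_m=1.4e-10)

# Matched termination, a nearly open CMOS input, and a too-heavy load:
for Z_load in (50.0, 1e6, 10.0):
    g = reflection_coefficient(Z_load, Z0)
    print(f"Z0 = {Z0:.0f} ohms, load = {Z_load:>9.1f} ohms -> {g:+.2f} of each edge reflects")

Matching the termination to Z0 kills the reflection; anything else sends some fraction of the edge back toward the driver, where it piles on top of the next transition.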

A picture is worth a thousand words and I wish we had a GUI interface here (not the first time).

Tony