To: Zeev Hed who wrote (9652) 11/28/1997 1:34:00 AM From: Bilow
Hi Zeev; As long as we are talking about interatomic spacing and Moore's law, I'll throw in my 2 cents; I do, after all, have a couple of degrees in the subject and design chips for a living.

First of all, references to circuits smaller than an atomic distance are silly, because the energy required to confine anything that small is way, way too large. At those scales you typically end up with stuff that decays away almost immediately, or that just hasn't been observed. Electrons and protons are stable, and that's about all. Free neutrons, for instance, decay into a proton, an electron, and an electron-type antineutrino in about 15 minutes. I took a class once where one of our homework assignments was to calculate that lifetime from quantum field theory. It was a brutal assignment that, if my memory is correct, involved deriving a definite multiple integral of some nasty tensor equations with about 12 integrations. Essentially what made Feynman famous was a way of quickly writing down the definite integral that solves a particular scattering cross-section problem. But even with his technique, you still ended up doing freshman calculus for about 2 days per problem.

I think Moore's law, as applied to line widths, will top out well within my professional lifetime; I intend to work for another 30 years or so, and the back-of-the-envelope numbers at the end of this post show why. But anybody who designs high gate-count ICs will agree with me that, almost always, only a very small part of the die is in use. The vast majority of the chip isn't doing anything at any given moment. As an example, a typical memory chip with 16 million storage locations can only read out something like 4 to 8 of them at a time (I put that arithmetic at the end of the post, too). These inefficiencies are why a computer can just barely beat a human at chess despite the fact that silicon can add or multiply at an almost unbelievably faster rate. In short, our use of silicon is still almost unbelievably inefficient. As we correct those inefficiencies, you should see computer chip performance increases for the next 40 to 50 years, IMO. Then it stabilizes.

By that time, you will have computers whose intelligence will be undeniable, and they will have huge effects on our society, far beyond anything that has happened before now. Should be interesting. Maybe I should eat fewer potato chips so I can see it. :) Of course, all this is far in the future, and no one now can say which stocks will profit from it, any more than someone in 1970 could have predicted that MSFT would grow as much as it has.

-- Carl
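
P.S. Here's the back-of-the-envelope sketch behind the "tops out" claim. It's a minimal extrapolation, and every number in it is a ballpark assumption rather than a measured fact: a 0.35 micron process today (1997), line widths halving roughly every 2.5 years, and the Si-Si bond length of about 0.235 nm as a hard floor.

# Back-of-the-envelope: how long until line widths hit atomic spacing?
# All numbers are ballpark assumptions, not gospel:
#   - 0.35 micron (350 nm) process in 1997
#   - line width halving roughly every 2.5 years
#   - Si-Si bond length, about 0.235 nm, as the hard floor

width_nm = 350.0
floor_nm = 0.235
years_per_halving = 2.5

year = 1997.0
while width_nm > floor_nm:
    width_nm /= 2.0
    year += years_per_halving

print("line width hits atomic spacing around %d" % int(year))
# prints 2024 with these assumptions -- i.e., well inside a 30-year career

Shift the halving period to 2 or 3 years and the answer moves by a few years either way, but it stays inside that 30-year window.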
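
P.P.S. And the memory-chip arithmetic, just to put a number on "the vast majority isn't doing anything." Again a sketch: the 8-locations-per-access figure is the rough number from above, not a datasheet value for any particular part.

# Fraction of a 16M-location memory chip doing useful work on one access.
# The 8-locations-per-access figure is a rough assumption, not a spec.

locations = 16 * 1024 * 1024
active_per_access = 8

fraction = float(active_per_access) / locations
print("active fraction: %.7f%%" % (fraction * 100.0))
# prints about 0.0000477% -- the other 99.99995% of the die sits idle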