So, it still does not explain the published 17 A spec.
Please point me to the specification. I believe it must be a worst-case spec. I would guess Intel measured some standby currents on functional chips down at what they call L nominal = Lmin + CD control = 60 nm. Now, remember: I think this L nominal is actually the minimum mean poly length at which Intel expects to be able to ship a part, assuming it meets some across-chip linewidth variation of, say, +/- 80 A at 3 sigma. Intel claims such a 60 nm device will have an off current of 100 nA/um for the low Vt case. The high Vt devices have off currents 10x lower and so do not contribute much, especially if the low Vt devices comprise a significant percentage of the total devices on the chip. I'm now guessing that Intel uses low Vt for perhaps everything that runs at the advertised frequency, so that may well be 50% of the total devices. If you assume 25 million low Vt devices (mostly logic) and an average logic device width of 4 um, you get 10 A of standby current if they all sit at that 100 nA/um level.

But herein lies the problem, and it is a big one. At 100 nA/um off current, the NFET still has a threshold of around 130 mV, and its Ion/Ioff ratio is still around 10,000. If all the devices were like that, there would be no problem. However, these 25 million low Vt devices with a mean of 60 nm actually have some distribution around that mean, set by whatever Intel is getting for their across-chip linewidth variation. That is a function of mask quality, litho quality, and RIE quality, among other things. Despite what they may claim, I sincerely doubt it is better than +/- 80 A at 3 sigma around that 60 nm mean.

And it is not necessarily the off current from the few devices down below the spec that is the problem. Even at 10,000 nA/um, the Ion/Ioff ratio is 100:1. The problem is the threshold voltage of those few devices. I refer you again to the chart of Vt vs. L poly for the low Vt NFET at high drain bias in Intel's 2001 IEDM paper. At 60 nm, Vt is around 130 mV; at 50 nm, it is around 40 mV. So we have millions of devices in critical paths with Vts varying from above 130 mV down to at least 50 mV. Extremely dangerous, I would think. I can't believe this stuff will ever get shipped.

As I said previously, none of this is an issue until you get down to near that minimum mean of 60 nm. From that paper, I don't see Intel reaching 3 GHz with the process as it is now; the schmoo plot even shows it. And I doubt they would be so foolish as to ship parts with 17 A of standby current even if they passed all stress tests. A standby current that high guarantees that a significant number of devices are on the very steep roll-off part of the Vt vs. L poly curve, and it is extremely dangerous to operate there. I believe they will have to shrink the process further (perhaps 10-15% in area) and perhaps raise the voltage slightly to get a significant number of 3 GHz parts. The IEDM paper specifically mentions a 5% linear shrink, and I wouldn't doubt Intel would raise the voltage again to 1.55 V or 1.6 V if pressed and if additional stress readouts look promising. Makes sense???
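If anyone wants to sanity-check those numbers, here is a minimal back-of-envelope sketch in Python. It just multiplies out the assumptions above (25 million low Vt devices, 4 um average width, 100 nA/um off current) to get the ~10 A figure, and then estimates how many devices fall below 50 nm, where the Vt roll-off gets steep. The Gaussian shape for the poly CD distribution and the 50 nm cutoff are my assumptions, not anything Intel has published.

import math

# Assumptions from the argument above, not published Intel figures
n_low_vt       = 25e6    # low Vt logic devices assumed on the chip
avg_width_um   = 4.0     # assumed average logic device width, um
ioff_nA_per_um = 100.0   # assumed low Vt off current at L = 60 nm, nA/um

# Standby current if every low Vt device sits at 100 nA/um
standby_A = n_low_vt * avg_width_um * ioff_nA_per_um * 1e-9
print(f"standby current estimate: {standby_A:.1f} A")            # ~10 A

# Fraction of devices landing below 50 nm, assuming poly CD is Gaussian
# with mean 60 nm and +/- 8 nm (80 A) at 3 sigma
mean_nm  = 60.0
sigma_nm = 8.0 / 3.0
l_cut_nm = 50.0
z = (l_cut_nm - mean_nm) / sigma_nm
frac_below = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"fraction below {l_cut_nm:.0f} nm: {frac_below:.1e}")      # ~9e-5
print(f"devices below {l_cut_nm:.0f} nm: {frac_below * n_low_vt:,.0f}")

With those assumptions it works out to only a couple of thousand devices below 50 nm, which is exactly why I say it is the Vt spread on the roll-off, not the raw off current of those few outliers, that is the worry.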
THE WATSONYOUTH
P.S. I remember a time when standby currents were on the order of several hundred milliamps.