

To: Robert Salasidis who wrote (148907) 11/20/2001 12:33:42 AM
From: Elmer
 
Precisely the point. I used a discrete MOSFET to compare the difference in leakage between 25 and 120 deg C, and the difference was 25x (IRF7401). Using a discrete Schottky diode, the difference between 50 and 100 deg C is 20x (IRF30BQ015).
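
For anyone who wants to sanity-check those ratios, here's a minimal sketch in Python. The part names, temperatures, and measured ratios are the ones quoted above; the assumption that leakage grows roughly exponentially with temperature (i.e., doubles every fixed number of degrees) is mine, not something measured here.

    import math

    # Measured leakage ratios quoted above; the exponential-growth model is an assumption.
    measurements = [
        ("IRF7401 MOSFET",      25.0, 120.0, 25.0),   # (part, T_low C, T_high C, ratio)
        ("IRF30BQ015 Schottky", 50.0, 100.0, 20.0),
    ]

    for part, t_low, t_high, ratio in measurements:
        delta_t = t_high - t_low
        # If leakage doubles every D degrees, then ratio = 2 ** (delta_t / D).
        doubling_interval = delta_t / math.log2(ratio)
        print(f"{part}: leakage roughly doubles every {doubling_interval:.1f} deg C")

Under that assumption, the MOSFET measurement implies leakage doubling about every 20 deg C and the Schottky about every 12 deg C, which is why the ratio blows up so fast once you get past the knee of the curve.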

In the world of Test we used to think we could detect manufacturing defects causing microamps of leakage. It was called Iddq testing, and it was the darling of Test a few years back. Like so many other things that come out of academia, it sounded great on paper. The problem was that at final test, which is run at high temperature, the normal leakage currents increased 100x or more, so how could we screen for a few extra microamps?
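
To put rough numbers on that screening problem, here's a small sketch. The currents are hypothetical, chosen only to illustrate the point; the "100x or more" multiplier is the one quoted above.

    # Hypothetical numbers to illustrate the Iddq screening problem described above.
    normal_leakage_room_temp_uA = 1.0     # healthy device at room temperature (assumed)
    defect_current_uA = 3.0               # extra current from a manufacturing defect (assumed)
    hot_test_multiplier = 100.0           # "100x or more" increase at final-test temperature

    good_hot = normal_leakage_room_temp_uA * hot_test_multiplier
    bad_hot = good_hot + defect_current_uA

    print(f"Good device at final-test temp: {good_hot:.0f} uA")
    print(f"Defective device at final-test temp: {bad_hot:.0f} uA")
    print(f"Defect is only {100 * defect_current_uA / good_hot:.0f}% above normal leakage")
    # A few percent difference is easily lost in normal part-to-part variation,
    # which is why hot Iddq screening for microamp-level defects breaks down.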

Comparing AMD at 50 deg C (below the knee) vs. Intel at 100 deg C (past the knee) is nonsense.

EP