Politics : Formerly About Advanced Micro Devices

To: StockMan who wrote (31332) | 4/7/1998 9:53 PM
From: Katherine Derbyshire
 
>>A fully depreciated Fab is more cost effective than one with new equipment. Part of the overall cost of the product includes the cost of the Fab.

Thus .35u products can be cheaper than newer .25u Fab products.<<

Well, not exactly. If that were strictly true, there would be no reason to go to smaller feature sizes.

A fully depreciated 0.35 micron fab has a lower cost per *wafer* than a new 0.25 micron fab. However, the 0.25 micron fab gets significantly more die per wafer, because die area shrinks roughly with the square of the linear feature size, so the cost per *die* is lower. (Assuming lots of things, particularly comparable die yields in both fabs.)
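To make that concrete, here's a rough back-of-the-envelope sketch in Python. Every number in it (wafer costs, die size, yield) is invented for illustration, and the die-per-wafer formula is just a common approximation, not an exact layout count:

import math

def gross_die_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Common approximation: wafer area divided by die area, minus a
    # correction for the partial die lost around the wafer edge.
    r = wafer_diameter_mm / 2.0
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# A 0.35u -> 0.25u linear shrink cuts die area by (0.25/0.35)^2,
# i.e. roughly in half, so the same design gives about 2x the die.
die_area_035 = 100.0                              # mm^2, assumed
die_area_025 = die_area_035 * (0.25 / 0.35) ** 2  # ~51 mm^2

wafer_cost_035 = 1000.0  # assumed: depreciated fab, cheap wafers
wafer_cost_025 = 1500.0  # assumed: new fab, more expensive wafers
die_yield = 0.8          # assumed comparable die yield in both fabs

for label, wafer_cost, die_area in (("0.35u", wafer_cost_035, die_area_035),
                                    ("0.25u", wafer_cost_025, die_area_025)):
    good_die = gross_die_per_wafer(200.0, die_area) * die_yield
    print("%s: ~%.0f good die/wafer, $%.2f per die"
          % (label, good_die, wafer_cost / good_die))

Even with the new fab's wafers costing 50% more in this made-up example, the shrunk die come out cheaper per unit, which is the whole point of the shrink.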

Moore's Law, which is the foundation on which the entire industry's business model rests, relies on this reduction in cost per die. In its popular form, it says that the number of transistors on a chip doubles roughly every 18 months at little or no additional cost to the customer.
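Just to show the compounding that phrasing implies (the starting count below is made up, not any real part):

transistors = 7.5e6              # assumed starting count, purely illustrative
for months in range(0, 73, 18):  # six years in 18-month steps
    print("month %2d: ~%.0fM transistors at the same cost"
          % (months, transistors * 2 ** (months / 18.0) / 1e6))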

Katherine

PS However, in the case of Intel vs AMD specifically, you also have to figure in things like Intel having already fully amortized the development and production ramp investments in older parts, and AMD still being in the very early stages of the 0.25 micron learning curve. Far too many variables for a simplistic 0.35 vs. 0.25 micron argument.