Politics : Formerly About Advanced Micro Devices


To: Petz who wrote (45499), 1/11/1999 9:36:00 PM
From: Elmer
 
Re: "Yousef, re: $2,000 cost/wafer including depreciation. This is not true using any reasonable definition of depreciation. Furthermore, add R&D and MG&A, and the Celeron is losing money at $70 a chip. Remember, Intel's $780 million in depreciation is at least 3/4 related to CPU production, and they only made 26M CPU's. And I don't really care if they are paying depreciation on useless assets -- that just means they should be charging MUCH MORE than $780M in depreciation per quarter! Thats 3/4 of $30 a CPU anyway you slice it and results in a cost/wafer a heck of a lot higher than $2,000 per wafer (about 120 CPU's)."

John, Intel makes far more chipset devices than they do processors. Add to that controllers, communication and network products and even Flash memory. So depreciation is spread over far more than just processors.
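A quick back-of-envelope check of the arithmetic in the quoted post, a minimal sketch using only the figures quoted there ($780M quarterly depreciation, 26M CPUs per quarter, roughly 120 CPUs per wafer) and treating the CPU share of depreciation as an adjustable assumption rather than the fixed 3/4 Petz uses:

```python
# Back-of-envelope check of the quoted depreciation argument.
# All dollar and unit figures come from the quoted post; the CPU share
# of depreciation is an assumption (Petz assumes 3/4; Elmer's point is
# that chipsets, flash, and network parts also consume depreciation).

quarterly_depreciation = 780e6   # dollars per quarter (quoted)
cpus_shipped = 26e6              # CPUs shipped per quarter (quoted)
cpus_per_wafer = 120             # rough good die per wafer (quoted)

for cpu_share in (0.75, 0.50, 0.25):
    per_cpu = quarterly_depreciation * cpu_share / cpus_shipped
    per_wafer = per_cpu * cpus_per_wafer
    print(f"CPU share {cpu_share:.0%}: "
          f"${per_cpu:,.2f} depreciation per CPU, "
          f"${per_wafer:,.0f} depreciation per wafer")
```

At a 3/4 share this gives about $22.50 per CPU, or roughly $2,700 of depreciation per wafer, which is Petz's point that depreciation alone would exceed the disputed $2,000/wafer figure. At a 1/4 or 1/2 share, reflecting depreciation spread over the non-CPU product lines, the per-wafer depreciation falls to roughly $900 to $1,800, which is the thrust of the reply above.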

Re: "The $2,000 cost per wafer is total BS."

Not "Total BS" but Yes, that number is too high.

EP



To: Petz who wrote (45499), 1/11/1999 10:35:00 PM
From: Yousef
 
John,

Re: "$2,000 cost/wafer including depreciation. This is not true using any
reasonable definition of depreciation."

Please stop, Petz ... You are making a "fool" of yourself. How many different people does it take to tell you that you are WRONG?

Make It So,
Yousef