Elmer, take your medicine; your amnesia is getting worse: <I have no idea what's you're talking about>
You had said, <Depreciation is included in the wafer cost.> at Message 20973451,
and at Message 20971482 you had just estimated wafer cost as $5.60 per Prescott or $15.38 per Smithfield, based on a $2,400 fully-processed cost per wafer. You also claimed an average cost of $10 at Message 20704068: <Assuming good yields, I put Intel's costs pretty close to $10.>
So, let's review the FACTS: Intel's depreciation last quarter was $1,140M. They sell an average of 43M CPUs a quarter.
At least 70% of Intel's depreciation is attributable to CPUs. That is a very conservative estimate, because many of those loss-leader comm products you like to brag about use old equipment in old fabs, which carries very little depreciation. Or things like "reference designs," with no depreciation at all.
Do the math: $1,140M x 70% / 43M is about $19 per CPU that Intel makes. And per your own yield estimates, a Smithfield die costs 2.75x as much as a Prescott ($15.38 / $5.60), so if Prescott is an "average" Intel CPU, depreciation per Smithfield is about 2.75x that figure, or roughly $51. (That also assumes Intel has the yields you *think* they do, claim to know, yet claim amnesia about.)
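If you want to check my arithmetic, here is the whole calculation in a few lines of Python. The figures are the ones stated above; the 70% CPU attribution and the 2.75x Smithfield multiplier are this post's assumptions, not Intel's numbers:

```python
# Back-of-the-envelope depreciation per CPU, using the figures in this post.
quarterly_depreciation_m = 1140  # Intel depreciation last quarter, in $M
cpus_per_quarter_m = 43          # average CPUs sold per quarter, in millions
cpu_share = 0.70                 # share of depreciation attributed to CPUs (my conservative assumption)

dep_per_cpu = quarterly_depreciation_m * cpu_share / cpus_per_quarter_m
print(f"Depreciation per average CPU: ${dep_per_cpu:.2f}")  # ~$18.56, call it $19

# Smithfield vs. Prescott cost ratio, from your own wafer-cost estimates:
smithfield_ratio = 15.38 / 5.60  # ~2.75
print(f"Smithfield multiplier: {smithfield_ratio:.2f}x")
print(f"Depreciation per Smithfield: ${dep_per_cpu * smithfield_ratio:.2f}")  # ~$51
```

Note that $51 of depreciation alone already dwarfs your claimed $15.38 wafer cost per Smithfield.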
So, do you agree now that a cost of $2,400 per wafer is unreasonably low, and largely irrelevant anyway, because the depreciation cost is much more significant?
Petz