Technology Stocks : Advanced Micro Devices - Moderated (AMD)


To: kpf who wrote (219935) 12/10/2006 6:34:03 PM
From: economaniack
kpf, OK, I will be equally clear: cut the condescending BS and pony up some numbers. AMD's R&D spending is currently $1.1 billion per year; Intel spends nearly $6 billion. Even removing a chunk for chipset development is going to leave Intel's R&D per CPU unit sold at rough parity with AMD's. Can you offer any reason to believe that Intel spends much less per unit on R&D? Enough less to be economically significant?
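To put numbers on it, here is a quick back-of-envelope sketch. The R&D budgets are the figures above; the unit volumes and the chipset share are my illustrative assumptions, not reported figures:

amd_rd = 1.1e9                     # AMD annual R&D (figure above), dollars
intel_rd = 6.0e9                   # Intel annual R&D (figure above), dollars
intel_cpu_rd = intel_rd * 0.7      # assume ~30% of Intel R&D goes to chipsets etc.
amd_units = 50e6                   # assumed AMD CPU shipments per year
intel_units = 3 * amd_units        # "something like 3 times as many chips"
print(amd_rd / amd_units)          # ~$22 of R&D per AMD CPU
print(intel_cpu_rd / intel_units)  # ~$28 per Intel CPU -- rough parity

Pick different assumptions if you like; the ratio doesn't move enough to show a big Intel advantage.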

The "AMD needs one development fab, Intel needs one development fab" is nonsense. Intel has a dedicated Fab for development at each node, and other Fabs that do basic research. AMD pays IBM $200 million/year to share their NY development fab and splits the cost of process development. They then took the process to fab 36 which is already running about half capacity on 90nm production. Moreover, Intel pushes the envelope hard to maintain their 1 year process advantage, while AMD is content to implement the new node after Intel, IBM and even some of the contract fabs.

As for categorizing costs: if you claim that all costs up until a specific die reaches mature yields count as process R&D, then I suppose you can attribute nearly all R&D to process development, but that doesn't seem to be the way AMD categorizes its research effort or the related expenses. Not that the semantics matter. The question that prompted this increasingly empty series of posts was whether process development costs give Intel a substantial scale advantage. The bottom line is that Intel spends five times as much as AMD on R&D while selling something like three times as many chips, when large economies of scale should show up as lower R&D spending per chip or per dollar of revenue. So why don't they?
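The scale argument reduces to the two ratios just stated; if Intel's scale advantage were real, the result below would come out under 1:

rd_ratio = 5.0       # Intel R&D / AMD R&D (ratio above)
unit_ratio = 3.0     # Intel units / AMD units (ratio above)
print(rd_ratio / unit_ratio)   # ~1.67: Intel spends MORE R&D per chip, not less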

e

P.S. In my department, making cryptic aspersions about someone's argument without answering its specifics would "earn you a kick for such hogwash," but I guess every department is different.



To: kpf who wrote (219935) 12/10/2006 7:38:55 PM
From: pgerassi
Dear Kpf:

AMD needs one development fab per node. You just can't do it with a fraction of a fab. It's really as easy as that.

BUZZ! WRONG!

AMD has been doing development without a full fab for the last few years; APM (AMD's Automated Precision Manufacturing) helps too. You could also argue that by sharing process R&D costs with IBM and others, you don't need a full fab. Part of the initial development happens at IBM's East Fishkill fab and part at Fab 36. At no time was a full fab needed for development, just part of one. That is AMD's way of doing R&D: efficient use of limited resources.

Intel, OTOH, throws a full fab at it, running lots of wafers to get to usable yields fastest. They can throw gobs of money at the problems. That is Intel's way of doing R&D: quickest results at any cost. The problem is that Intel now has fewer resources and its problems cost more. Changing your R&D target from speed at any cost to efficient use of resources is nearly impossible without big shake-ups, usually at the top. Without a Max the Axe, inertia just rolls over any opposition.

Pete