To: kpf who wrote (219935) 12/10/2006 6:34:03 PM
From: economaniack

kpf, OK, I will be equally clear: cut the condescending bs and pony up some numbers. AMD's R&D spending is currently $1.1 billion per year; Intel spends nearly $6 billion. Even removing a bunch of that for chipset development is going to leave Intel's R&D per CPU unit sold at rough parity with AMD's. Can you offer any reason to believe that Intel spends much less per unit on R&D? Enough less to be economically significant?

The "AMD needs one development fab, Intel needs one development fab" argument is nonsense. Intel has a dedicated fab for development at each node, plus other fabs that do basic research. AMD pays IBM $200 million a year to share their NY development fab and splits the cost of process development, then takes the process to Fab 36, which is already running at about half capacity on 90nm production. Moreover, Intel pushes the envelope hard to maintain its one-year process advantage, while AMD is content to implement each new node after Intel, IBM, and even some of the contract fabs.

As for categorizing costs: if you claim that all costs up until a specific die reaches mature yields count as process R&D, then I guess you can attribute nearly all R&D to process development, but that doesn't seem to be the way AMD categorizes its research effort or the related expenses. Not that the semantics matter. The question that prompted this increasingly empty series of posts was whether process development costs give Intel a substantial scale advantage. The bottom line is that Intel spends roughly 5 times as much as AMD on R&D and sells something like 3 times as many chips, when large economies of scale would be reflected in lower R&D spending per chip or per dollar of revenue. So why isn't it? (See the quick calculation at the bottom of this post.)

e

ps: In my department, casting cryptic aspersions on someone's argument without answering its specifics would "earn you a kick for such hogwash," but I guess every department is different.
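Since the numbers are already on the table, here is the arithmetic spelled out. This is a minimal back-of-envelope sketch in Python using the figures cited above; the $1.1B/$6B spend figures and the ~3x unit ratio are the rough 2006 estimates from this thread, not audited data.

# Back-of-envelope R&D-per-unit comparison using the figures cited above.
# Spend numbers and the ~3x unit ratio are rough 2006 estimates, not audited data.
amd_rnd = 1.1e9      # AMD annual R&D spend, USD
intel_rnd = 6.0e9    # Intel annual R&D spend, USD
unit_ratio = 3.0     # Intel ships roughly 3x as many CPUs as AMD

# Normalize AMD's unit volume to 1; Intel's volume is then unit_ratio.
amd_per_unit = amd_rnd / 1.0
intel_per_unit = intel_rnd / unit_ratio

print(f"Intel R&D per unit = {intel_per_unit / amd_per_unit:.1f}x AMD's")
# Prints roughly 1.8x: on these estimates Intel's per-unit R&D is higher,
# not lower, and only falls to parity if you strip a big slice of its
# budget out as chipset, flash, and other non-CPU development.

In other words, on these numbers the scale advantage runs the wrong way for the claim being made, which is exactly the point.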