Technology Stocks : Advanced Micro Devices - Moderated (AMD)


To: economaniack who wrote (219802), 12/9/2006 10:59:22 AM
From: eracer
 
Re: Graphics processors eat huge amounts of silicon

Seems like a great fit for a company that leads the industry in the move to smaller manufacturing processes. With ideal scaling, the 484 mm^2 8800GTX GPU (a 90 nm part) would shrink to 121 mm^2 at 45 nm. Realistically it would end up in the 150-200 mm^2 range, which is still smaller than the 206 mm^2 Smithfields that are selling for under $100.
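The ideal-scaling figure above follows from die area shrinking with the square of the linear feature-size ratio. A minimal sketch of the arithmetic (the function name is my own):

```python
# Ideal die-area scaling between process nodes: area shrinks with the
# square of the linear feature-size ratio, so a 90 nm -> 45 nm shrink
# gives (45/90)^2 = 0.25 of the original area.

def ideal_shrink(area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Return the ideally scaled die area at the new process node."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

print(ideal_shrink(484, 90, 45))  # 121.0, matching the figure in the post
```

Real shrinks land well short of this ideal (hence the 150-200 mm^2 estimate), since I/O pads, analog blocks and design-rule overheads don't scale linearly.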



To: economaniack who wrote (219802), 12/9/2006 11:16:40 AM
From: Tenchusatsu
 
Economaniack,

> IE has 0 (zero) marginal cost, and Microsoft has a monopoly on the PC operating system, so they could bundle them together, charge whatever they wanted and "give" away IE while making whatever profit they wanted.

a) What's the marginal cost of a Windows CD-ROM? Heck, what's the marginal cost of an application that you pay to download?

b) Why is the Microsoft monopoly mentioned like it is a fact of life, while the alleged Intel "monopoly" is discussed on this thread like the worst thing that could ever happen?

c) What's stopping Intel from just "giving" away the chipset and graphics as a bundle with the CPU? (Not saying that Intel would do it, but they could and so could AMD.)

Like you said, the basic architecture of the PC itself has become a battleground. Why wouldn't the pricing models themselves change just as fast as the PC architecture does?

Tenchusatsu



To: economaniack who wrote (219802), 12/9/2006 1:30:56 PM
From: Sarmad Y. Hermiz
 
E,

I see your post has 7 recommendations, and the hour is still early. Congratulations. A definite candidate for the Hall of Fame of Recs.

I understand you have been following AMD for a long time, and I wonder if you have insight into what has gone wrong in AMD's 65 nm process. Seven months ago (early April) you said AMD was already making 65 nm chips and that volume would be ramping within a couple of months. What is your interpretation of events (or the lack thereof) since then?

Sarmad

---------
Message 22331245
-------

To: paarl99 who wrote (192388), 4/6/2006 9:40:27 AM
From: economaniack

Beyond beyond.

AMD is already producing 65nm chips; Fab 36 was equipped for 65nm. They have only just made SRAMs on 45nm, and the fact that they aren't sure whether they will use immersion lithography means 45nm is years away from production. 65nm will start ramping production in a couple of months. AMD will need the capacity gains from going to 65nm next year.

E
------------
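The "capacity gains" in the quoted message come from smaller dies yielding more chips per wafer. A rough sanity check, using the 206 mm^2 Smithfield die mentioned upthread and a standard first-order dies-per-wafer approximation (this ignores yield, scribe lines and edge exclusion, and the helper names are my own):

```python
import math

# First-order dies-per-wafer estimate: gross dies from wafer area,
# minus a correction for partial dies lost around the wafer edge.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

area_90nm = 206                           # Smithfield die at 90 nm (mm^2)
area_65nm = area_90nm * (65 / 90) ** 2    # ideal optical shrink to 65 nm

print(dies_per_wafer(area_90nm), dies_per_wafer(area_65nm))  # 296 593
```

So even under this crude model, an ideal 90 nm to 65 nm shrink roughly doubles the dies per 300 mm wafer, which is the capacity argument being made.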



To: economaniack who wrote (219802), 12/10/2006 12:01:57 AM
From: pgerassi
 
Dear Economaniack:

PC architecture isn't the battleground you seem to think it is. If nVidia wanted to make the northbridge the central point, they would have integrated GPUs into their NBs over the last few years. Instead they got out of that business for a while; they use their NBs to help them sell their discrete GPUs. They found it hard to compete with Intel, which subsidizes its GPUs from its CPUs. With AMD, they and ATI could compete with each other, and neither could subsidize its NBs with GPU profits.

ATI went for the embedded GPU market while nVidia went for chipsets. Once each saw that the other was making money in its niche, they built me-too products to get in. The thing is that the chipset market is a quicker entry than the embedded market, so only ATI is visible in the embedded market while both fight with Intel, VIA and SIS in chipsets.

So there is no battlefield in PC architecture; it's settled as far as it goes. The real battlefield is which accelerator should be added to the CPU, and in what form. We had a similar battleground in FPUs: there were vector FPUs, loosely coupled FPUs and tightly coupled FPUs. Tightly coupled FPUs won that debate and were inserted into CPUs.

Now we have vector FPUs, loosely coupled PUs, programmable FPGAs, multithreaded PUs and, of course, GPUs, both loosely and tightly coupled. Intel wants to go with an internally developed, tightly coupled GPU; everything else has to sit loosely coupled, multiple hops away from the CPU.

AMD OTOH has a very good plan to see how much coupling is required and which accelerator should go where. They opened their architecture to allow anything from a loosely coupled, independent, fully general-purpose CPU to a dependent, tightly coupled accelerator getting commands straight from the AMD64 core's scheduler. Anything that proves popular will get a seat right beside the core on die and be added into the AMD64 ISA, becoming globally available to programmers everywhere. Torrenza becomes the R&D pipeline for future CPU die enhancements.

Everyone who participates wins, and so do quite a few who don't. Torrenza suppliers have prototypes already tested, so they get a 6-12 month lead on the competition. CPU makers learn what works. AMD gets a 1-2 year head start on other CPU makers by already having a working design that needs at most a process switch to be added onto the CPU die. OEMs learn which configurations are good for which purposes and how popular they are. ASIC makers find out which configurations are popular enough for specialized versions to be profitable, yet not popular enough to draw AMD's or another CPU maker's attention.

And other CPU makers can win here. There is nothing to stop someone like IBM from making a socket F Power6 and building IBM Power servers on a socket F server motherboard without any AMD64 CPUs plugged in. Ditto for Cray, Sun, Fujitsu, nVidia, VIA, SIS, Motorola, TI, Transmeta, and a host of others. Even Intel, if they could ever swallow their pride. AMD would still make out like a bandit thanks to the huge mountain of AMD64 and x86 software: they would likely supply the bulk of the socket F fillers and take a percentage of the revenue from the rest. And anyone who makes socket F versions could also make AM2 and S1 versions for the desktop and mobile markets.

Pete