Technology Stocks : Advanced Micro Devices - Moderated (AMD)


To: jspeed who wrote (213661), 10/16/2006 8:43:15 AM
From: j3pflynn
 
jspeed - I don't think that's going to happen, unless Intel goes down the same road; after all, the majority of systems would still need discrete cards for high performance, and it'd be foolish to cede that market to nVidia, unless ATI doesn't think it can make money there. AMD/ATI may be able to move discretes to a CPU socket, but I think it'll be quite a while before they can bring a high-performance GPU on-die. There just isn't enough space, not to mention GPU-style high-performance memory access. My 2c.



To: jspeed who wrote (213661), 10/16/2006 8:56:13 AM
From: pgerassi
 
Dear Jspeed:

JO is full of it. Intel has lousy integrated graphics. The performance just isn't there to play any kind of game -- OK, maybe turn-based strategy. To displace discrete graphics in the mainstream, integrated needs to reach 50-75% of average high-end performance. That will likely require an on-die memory cache equal to about 4 frames at the highest resolution: one holding the frame being sent to the monitor, one being drawn, one holding the depth buffer, and one holding texel lists, vertex lists, textures, bump maps, lighting data, etc.

Given a high-end display of 2Kx1.5K at 32 bits per pixel with a 32-bit depth buffer, that's 12MB per frame, or 48MB of on-die RAM. HDTV at 1920x1080 at 32 bits could reduce that to 32MB. 32MB on die is not likely until 32nm, although Z-RAM might allow it on early 45nm. Given that 32nm isn't due until 2009, discrete graphics likely won't shrink significantly before then.
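[A quick sketch of Pete's arithmetic above -- my own back-of-the-envelope check, not something from the post. One "frame" here is width x height x 4 bytes (32 bits per pixel), and the budget is 4 frames (scan-out, draw, depth, working data):]

```python
def frame_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one frame buffer in megabytes (1 MB = 2**20 bytes)."""
    return width * height * bytes_per_pixel / 2**20

# High-end 2K x 1.5K display: 12 MB per frame, 48 MB for the 4-frame budget
per_frame = frame_mb(2048, 1536)       # 12.0 MB
budget = 4 * per_frame                 # 48.0 MB

# HDTV 1920x1080: ~31.6 MB, which rounds to the 32MB quoted
hd_budget = 4 * frame_mb(1920, 1080)

print(per_frame, budget, hd_budget)
```

[The numbers come out to 12 MB, 48 MB, and just under 32 MB, matching the post.]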

Now that does not mean a GPU couldn't migrate onto the CPU die before then, but that is likely to be for value-line CPUs, where integrated graphics are fine and resolutions are 1024x768x24 or 1280x800x32. Here frame rates can be in the 24-30 fps range at medium quality. HDTV video will need 60fps, but that is less demanding. Two channels of DDR2 or DDR3 will be more than enough for these value-line tasks. This will be for things like value laptops, set-top boxes, A/V equipment, auto navigation, etc.
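[A rough sanity check of the dual-channel claim above -- my own assumptions, not from the post. The `traffic_factor` of 10x lumps overdraw, depth, and texture reads into one assumed multiple of the raw frame size:]

```python
# Dual-channel DDR2-800: 2 channels x 8 bytes wide x 800 MT/s = 12.8 GB/s
DDR2_800_DUAL_GBPS = 2 * 8 * 800e6 / 1e9

def gfx_traffic_gbps(w: int, h: int, fps: int, traffic_factor: int = 10) -> float:
    """Estimated graphics memory traffic in GB/s at 32 bits per pixel."""
    return w * h * 4 * fps * traffic_factor / 1e9

# Value segment: 1280x800 at 30 fps uses roughly 1.2 GB/s,
# a small slice of the 12.8 GB/s shared with the CPU
value = gfx_traffic_gbps(1280, 800, 30)
print(value, DDR2_800_DUAL_GBPS)
```

[Even with a generous traffic multiplier, value-segment graphics would consume only about a tenth of the dual-channel bandwidth, which is consistent with Pete's "more than enough" call.]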

That is how it will likely go. Only as on-die integrated graphics becomes faster and more capable will discrete begin to die off. We are very far from having enough power for the ultimate virtual-reality simulation, where it's hard to tell you are not looking through the eyes of a real person in a real environment. Until we can do that with integrated graphics, discrete will always be there.

Pete



To: jspeed who wrote (213661), 10/16/2006 9:34:15 AM
From: eracer
 
Re: Joe Osha's prediction that AMD will let Discrete Graphics "Die on the Vine"

Let's hope not. ATI has a strong presence in the mainstream/performance PC and gaming-console markets. AMD could eventually become one of the most despised names in video gaming history if it intentionally or unintentionally kills the only serious competition that exists in those markets.



To: jspeed who wrote (213661), 10/16/2006 9:34:42 AM
From: fastpathguru
 
Joe Osha's prediction that AMD will let Discrete Graphics "Die on the Vine"

No way, no how. On-die GPUs won't be able to compete against discrete chips until it is cost-effective to put a bleeding edge GPU on-die.

Of course, the GPU logic may be partitioned and re-allocated among system components so that they don't look anything like today's GPUs... The video/codec section may go onto the traditional HTT southbridge, enhancing its role in providing all IO, while the massive compute resources are put in a specialized "stream processor." (In that sense, the "GPU" will go away, only by giving way to its successor.)

Just wait. R600 will shock not just with its graphics performance; it will also drive into HPC thanks to specialized platform/toolchain support.

fpg



To: jspeed who wrote (213661), 10/16/2006 10:09:05 AM
From: Rink
 
Joe Osha has little imagination.

I think AMD has stated it won't abandon discrete graphics. I've also read that it made commitments to Canada to expand engineering there.

Besides that, I'm very confident AMD will continue high-end discrete graphics, for logical reasons. High-end discrete graphics will never make it into the CPU: it needs 10x the memory bandwidth, at high cost, and current discrete die sizes are already quite a bit larger than those of CPUs. Instead, AMD will use high-end discrete technology to seed the CPU -- the CPU will always contain a subset of the high end, either feature-wise (generation n-1) or capacity-wise (e.g. 25-50% of the programmable shaders of high-end discrete). AMD also has the best technology to do both: the XBAR for integrated graphics, and HTT/HTTc for high-end discrete. Besides, top-end discrete has the highest margins.

In conclusion, there isn't any reason to presume AMD will abandon high-end discrete.

Regards,

Rink



To: jspeed who wrote (213661), 10/16/2006 11:53:24 AM
From: Petz
 
If AMD did let ATI discrete graphics "die on the vine," it would be an under-the-table deal with NVidia, it would probably backfire anyway, and I'd consider selling my AMD because of it.

Petz