>>>The "HDTV decode" you claim is really nothing more than 24 fps fluffy film bitstreams at 1280x720, not the real 60fps or even 1920x1080x24 or full 1920x1088x30i<<<
Correct. How much material is going to be broadcast at 1920i? By the time that is being broadcast en masse, are we still going to be using Rage128's? How many of ATI's competitors can even handle a single HDTV mode? My point is straightforward: ATI has a huge lead on its competition. Given that it's a multifunctional chip that is less than half the price of a dedicated HDTV decode chip, I think that's pretty darn remarkable! How many Rage128's do you think ATI would sell if the chip cost twice as much as it does right now? Let me answer that for you -- a big fat zero.
>>>HDTV requires the full power of a high-end Pentium just to do VLD and motion vector reconstruction. Lame.<<<
Yeah, perhaps at 1920 on a low-end PC. The reality is that we're moving closer and closer to this every day. Everybody said the same thing about ML/MP MPEG-2 as well. Now you have complete full-frame-rate decode happening with 20% CPU usage.
>>>A $500 PC has a Pentium powerful enough for broadcast/cable TV analog-to-MPEG-2 encode <<<
Go back to my message. I said a fast PIII.
>>>I argue that a low cost Pentium (say Celeron) with reduced system memory (32 MB) etc. and FULL hardware support for HDTV decode is less expensive and provides a more pleasant experience for the $500 Media PC.<<<
Sure, if we're talking about a low-end machine that can't do a whole lot more, then yes, a dedicated HDTV decode solution is better. But guess what? People still want to play the latest games on their $500 machine. They still want to take those Digital VCR-captured movies and edit them. All of a sudden they have a very unpleasant experience.
>>>MCP is inevitably in hardware to reduce L2 cache thrashing, whereas the other stages of decode can at least live within L1. <<<
Exactly. That's why it was the first thing ATI offloaded into hardware, along with the YUV->RGB conversion.
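To give a feel for why the color-space step is such an obvious candidate: it's pure per-pixel arithmetic that has to run on every single decoded frame, so it eats a fixed chunk of CPU no matter how clever the rest of the decoder is. Here's a rough sketch of what the CPU would otherwise be grinding through for every pixel -- assuming BT.601 studio-range YCbCr and the usual 8.8 fixed-point coefficients; this is the textbook math, not ATI's actual pipeline:

#include <stdint.h>

/* Clamp an intermediate value to the 0..255 range of an 8-bit channel. */
static uint8_t clamp8(int v)
{
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (uint8_t)v;
}

/*
 * Convert one BT.601 studio-range YCbCr pixel to 8-bit RGB.
 * Fixed-point (x256) versions of the usual coefficients:
 *   R = 1.164*(Y-16) + 1.596*(Cr-128)
 *   G = 1.164*(Y-16) - 0.813*(Cr-128) - 0.391*(Cb-128)
 *   B = 1.164*(Y-16) + 2.018*(Cb-128)
 */
static void yuv_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                       uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = (int)y  - 16;
    int d = (int)cb - 128;
    int e = (int)cr - 128;

    *r = clamp8((298 * c           + 409 * e + 128) >> 8);
    *g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp8((298 * c + 516 * d           + 128) >> 8);
}

At 1280x720x24 that's over 22 million of those conversions a second. Push them into the overlay hardware and the CPU never touches them.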
>>>The coding efficiency of today's software-only encoders is lousy. Have you compared a 4 mbit/sec stream made by Ligos/MGI/etc. against one produced by a C-Cube DVExplore or Sony single-chip encoder ? The lower the bit rate for a level of video quality, the more hours on your hard disk will yield. Factor storage costs into the equation.<<<
Right, that's exactly what I said in my last message: that the motion estimation is nowhere near as complete as what can be done in a dedicated core. And I also mentioned that it would make sense for ATI to move that into their hardware next so that they could effectively generate P/B frames. You made it seem as if it were somewhere far off in the future. It's not. Software is already doing it, albeit not as nicely as dedicated hardware.
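For those following along, here's why motion estimation is the part that really wants dedicated silicon: the encoder has to hunt through a reference frame for the block that best matches each 16x16 macroblock, and the brute-force version of that search is a staggering amount of arithmetic. A rough sketch, assuming an exhaustive search over a +/-16 pel window with plain SAD as the match metric (real encoders, hardware or software, use much smarter searches than this):

#include <stdlib.h>
#include <limits.h>

/* Sum of absolute differences between a 16x16 block in the current frame
 * and a candidate 16x16 block in the reference frame. */
static int sad16x16(const unsigned char *cur, const unsigned char *ref, int stride)
{
    int sad = 0;
    for (int y = 0; y < 16; y++)
        for (int x = 0; x < 16; x++)
            sad += abs((int)cur[y * stride + x] - (int)ref[y * stride + x]);
    return sad;
}

/* Exhaustive full search over a +/-range pel window around macroblock (mbx, mby).
 * Returns the best SAD and writes the winning motion vector.
 * (Frames are assumed padded so out-of-window reads aren't an issue here.) */
static int full_search(const unsigned char *cur, const unsigned char *ref,
                       int stride, int mbx, int mby, int range,
                       int *best_dx, int *best_dy)
{
    int best = INT_MAX;
    const unsigned char *cur_mb = cur + mby * stride + mbx;

    for (int dy = -range; dy <= range; dy++) {
        for (int dx = -range; dx <= range; dx++) {
            const unsigned char *cand = ref + (mby + dy) * stride + (mbx + dx);
            int sad = sad16x16(cur_mb, cand, stride);
            if (sad < best) {
                best = sad;
                *best_dx = dx;
                *best_dy = dy;
            }
        }
    }
    return best;
}

Run the numbers just for ML/MP: a +/-16 window is 33x33 = 1089 candidate positions, times 256 pixels per macroblock, times 1350 macroblocks in a 720x480 frame, times 30 frames a second -- on the order of 11 billion absolute differences per second for the exhaustive search. That's exactly why software encoders cut corners on the search, and why a dedicated core (or ATI's hardware, if they move it there) gets better P/B frames at the same bit rate.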
To put all this in plain English: I'm arguing that ATI has hardware that is clearly ahead of its competition technically (and no, a dedicated MPEG-2 encoder is not its direct competitor). You're arguing that it doesn't make your coffee and deliver the paper to your bed. Fine. I never said that it could do everything; I just said that it could do it better than competitors like the TNT2 or S4 or G400.
The same argument that you're using could be made about CPUs. "They all suck because they can't do realtime radiosity+tracing down to X levels." Well, what CPU in the same price range as a PIII/500 can? What's that you say? "A $13 million supercomputer could do this?" Sorry, it's just slightly out of my price range...
P.S. - Correct me if I'm wrong, but C-Cube's DVX part doesn't do 1920 HDTV decode either.