To: pgerassi who wrote (22794), 12/18/2000 12:15:57 PM
From: fyodor_

pgerassi: "NTSC is interlaced and has a resolution of 640x480 (square pixels)."

Well, NTSC is 640x240@60Hz per field; interlacing the two fields gives 640x480@30Hz. There's no difference between the two bandwidth-wise.

"The video boards in PCs actually sample at a higher rate of 720x480 from NTSC for better conversions after processing."

Ahh, I didn't know that ;)

"Yes, the Gigapixel technology would help greatly. Just the experimental HSR in the 1.04.01a 3dfx drivers, although buggy, allows a Voodoo 5500 to easily beat a GeForce 2 Ultra 500 at 1600x1200 with 4x FSAA (75 fps vs. 25 fps)."

I'm not familiar with the algorithms 3dfx uses for hidden surface removal, but don't underestimate what NVIDIA can accomplish using the "C" in TCL (clipping). Sure, this requires that a game support T&L, but high-polygon games for the X-Box will (or should, at least ;)).

With regard to your bandwidth calculations, remember that NVIDIA will, by all accounts, be using some form of Z-compression from the NV20 forward. Add that to polygon clipping (a form of HSR) and I still don't see that NVIDIA will have much of a problem delivering the bandwidth necessary for a console.

Of course, if loads of people suddenly decided to buy high-res TVs of one kind or another, the situation would change somewhat. However, I consider that highly unlikely to happen within the first X-Box cycle. Remember, TVs are extremely cheap compared to monitors, and that is due almost solely to one thing: resolution. For the immediate future, high-res TVs are going to be severely limited by price. Additionally, virtually the only benefit of buying a high-res TV today is DVD playback. There are not a lot of digital TV channels available today; sure, the number will grow, but progress has been very slow so far and I think it unlikely that an explosion will happen within the next couple of years.

Lastly, the GeForce series uses an extremely brutish approach to anti-aliasing. It would not take a lot of work by NVIDIA to improve upon this greatly, further reducing the bandwidth requirements.

All that said: sure, NVIDIA could definitely benefit from some sort of tile-based rendering scheme (e.g. GigaPixel's). I don't dispute that. In fact, as I've stated before, I'm a huge fan of tile-based rendering. What I will maintain, however, is that I'm not at all convinced that they need it to reach the bandwidth requirements for a console box. Z-buffer compression, clipping/HSR, and improved anti-aliasing will do the trick. (A few back-of-the-envelope numbers and a sketch of the clipping idea follow below.)

-fyo
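
A few quick numbers on the NTSC point, just to make the "no difference bandwidth-wise" claim concrete. The 2 bytes per pixel is my own illustrative assumption; only the resolutions and refresh rates come from the discussion above.

    #include <stdio.h>

    int main(void) {
        /* Illustrative assumption (mine, not from the post): 2 bytes per pixel. */
        const double bytes_per_pixel = 2.0;

        /* One interlaced field: half the scanlines, refreshed 60 times a second. */
        double field_bw = 640.0 * 240.0 * 60.0 * bytes_per_pixel;

        /* Two fields woven together: the full 480 lines, but only 30 times a second. */
        double frame_bw = 640.0 * 480.0 * 30.0 * bytes_per_pixel;

        printf("640x240 @ 60 Hz: %.2f MB/s\n", field_bw / 1e6);
        printf("640x480 @ 30 Hz: %.2f MB/s\n", frame_bw / 1e6);
        /* Both print 18.43 MB/s -- interlacing trades lines per pass for
           refresh rate, so the total bandwidth is the same either way. */
        return 0;
    }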
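
And on the FSAA point: the GeForce line anti-aliases by brute-force supersampling, so a rough sketch of the raster-side memory traffic shows why 1600x1200 with 4x FSAA hurts so much. The per-sample byte counts, the overdraw factor and the 60 fps target are illustrative assumptions of mine, not measured figures.

    #include <stdio.h>

    int main(void) {
        /* Illustrative assumptions (mine, not from the post): 32-bit colour
           write, 32-bit Z read and Z write per sample, average overdraw
           of 2, and a 60 fps target. */
        const double pixels        = 1600.0 * 1200.0;
        const double bytes_per_hit = 4.0 + 4.0 + 4.0;  /* colour + Z read + Z write */
        const double overdraw      = 2.0;
        const double fps           = 60.0;

        double no_fsaa = pixels * bytes_per_hit * overdraw * fps;

        /* Brute-force 4x supersampling shades and stores 4 samples per
           output pixel, so raster-side memory traffic scales by roughly 4x. */
        double fsaa_4x = no_fsaa * 4.0;

        printf("1600x1200, no FSAA : %.1f GB/s\n", no_fsaa / 1e9);
        printf("1600x1200, 4x FSAA : %.1f GB/s\n", fsaa_4x / 1e9);
        return 0;
    }

Under those assumptions the 4x case lands somewhere north of 11 GB/s, while a GeForce2 Ultra has roughly 7.4 GB/s of raw memory bandwidth. That gap is exactly what Z-compression, clipping/HSR and smarter anti-aliasing are supposed to close.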
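
Finally, a minimal sketch of why clipping counts as a crude form of HSR from the memory bus's point of view: geometry rejected in the geometry stage never touches the colour or Z-buffer at all. This is a generic view-frustum rejection test of my own, not NVIDIA's actual TCL hardware or 3dfx's HSR algorithm.

    /* A plane ax + by + cz + d = 0 with the normal pointing into the
       visible half-space, and a bounding sphere around an object. */
    typedef struct { float a, b, c, d; } Plane;
    typedef struct { float x, y, z, radius; } Sphere;

    /* Returns 1 if the sphere lies entirely outside any one frustum plane.
       An object rejected here is never rasterised, so it costs no colour
       or Z-buffer bandwidth at all. */
    int trivially_rejected(const Sphere *s, const Plane frustum[6])
    {
        for (int i = 0; i < 6; ++i) {
            float dist = frustum[i].a * s->x
                       + frustum[i].b * s->y
                       + frustum[i].c * s->z
                       + frustum[i].d;
            if (dist < -s->radius)
                return 1;   /* completely behind this plane: cull */
        }
        return 0;           /* potentially visible: keep */
    }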