Technology Stocks : Advanced Micro Devices - Moderated (AMD)


To: Joe NYC who wrote (56832) on 10/2/2001 6:35:30 AM
From: andreas_wonisch
 
Joe, Re: If the frame rate is 150 fps and monitor refresh rate is only 75 Hz, every other frame goes basically nowhere. It doesn't leave the video card.

Yes, good point. If even the worst-case scenario framerate is above the refresh rate of your monitor (i.e. 100 Hz in most cases), you don't get any benefit from higher frame rates. That's why I think that framerates in the 200 fps range are completely irrelevant. IMO "Quake III" benchmarks don't make sense any more: of course you can find out which CPU is theoretically faster at a given resolution, but you won't get any practical advantage out of it because your monitor can't display those framerates anyway. And even at 1600x1200 pixels the framerate is above 100 fps with high-end video cards (as shown in Anand's latest GeForce 3 Titanium review). We'll need new, more demanding games to evaluate CPU performance.
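
Just to put some numbers on that (a rough sketch of my own with made-up figures, not anything from Anand's review): with vsync on, the monitor can show at most one new frame per refresh, so everything the card renders above the refresh rate is simply dropped.

#include <stdio.h>

/* Rough illustration only: with vsync, displayed fps is capped at the
   monitor's refresh rate, so extra rendered frames are dropped.
   The refresh rate and benchmark numbers below are made up. */
static double displayed_fps(double rendered_fps, double refresh_hz)
{
    return rendered_fps < refresh_hz ? rendered_fps : refresh_hz;
}

int main(void)
{
    double refresh_hz = 100.0;                  /* assumed monitor refresh */
    double rendered[] = { 60.0, 150.0, 200.0 }; /* hypothetical benchmark results */

    for (int i = 0; i < 3; i++)
        printf("rendered %.0f fps -> displayed %.0f fps (%.0f fps dropped)\n",
               rendered[i], displayed_fps(rendered[i], refresh_hz),
               rendered[i] - displayed_fps(rendered[i], refresh_hz));
    return 0;
}

At 100 Hz both the 150 fps and the 200 fps results end up as the same 100 displayed frames per second, which is the whole point about those benchmarks.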

The faster the blinking is, the more the eye and the brain get fooled into thinking that they are looking at a persistent light.

And at some point -- probably around 100 refreshes per second -- you can't distinguish a monitor picture from a persistent picture. So my guess is that this is also roughly the limit beyond which you can't distinguish real motion from "faked" motion built out of dozens of individual frames per second.

Andreas



To: Joe NYC who wrote (56832) on 10/2/2001 7:29:44 PM
From: Robert Salasidis
 
The main reason to get extra speed is for software that may allow more photorealistic images, more detail, etc.

If the maximum detail, maximum resolution of a given game gives fps rates > the monitor refresh rate, then the CPU/graphics card is faster than is required for that application.
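
That rule is simple enough to write down. A minimal sketch (my own, with invented numbers; the 110 fps worst case and 75 Hz refresh are just placeholders):

#include <stdio.h>
#include <stdbool.h>

/* Illustration of the criterion above: if even the worst-case frame rate
   at maximum detail/resolution exceeds the monitor refresh rate, the
   CPU/graphics card is already faster than that application requires. */
static bool faster_than_required(double worst_case_fps, double refresh_hz)
{
    return worst_case_fps > refresh_hz;
}

int main(void)
{
    double worst_case_fps = 110.0; /* hypothetical minimum fps at max settings */
    double refresh_hz = 75.0;      /* hypothetical monitor refresh rate */

    printf("Faster than required: %s\n",
           faster_than_required(worst_case_fps, refresh_hz) ? "yes" : "no");
    return 0;
}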

As for the comment about a 75 Hz monitor displaying a 25 or 35 fps moving image: I thought about this before, but could not find any convincing arguments one way or the other. I do agree about the motion blur comment (i.e. a movie taken with a fast shutter speed will appear more "jumpy" than one with a slow shutter speed). When watching a 24 fps movie, however, even with a fast shutter speed, it does not seem that bad.
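
One way to think about the 25 vs 35 fps case (a back-of-the-envelope sketch of my own, not a settled argument): with vsync each rendered frame is held on screen for a whole number of 75 Hz refreshes. 25 fps divides 75 evenly, so every frame is held exactly 3 refreshes; 35 fps does not, so the holds become an uneven mix of 2 and 3 refreshes, which by itself could make motion look less smooth, quite apart from motion blur.

#include <stdio.h>
#include <math.h>

/* Illustration only: simulate which game frame is visible at each monitor
   refresh and print how many refreshes each frame was held for.
   The fps/Hz values are just the ones discussed above. */
static void cadence(double fps, double refresh_hz, int refreshes)
{
    printf("%.0f fps on a %.0f Hz monitor: ", fps, refresh_hz);
    int last_frame = -1, holds = 0;
    for (int r = 0; r < refreshes; r++) {
        int frame = (int)floor(r * fps / refresh_hz); /* frame visible at refresh r */
        if (last_frame >= 0 && frame != last_frame) {
            printf("%d ", holds); /* refreshes the previous frame was held */
            holds = 0;
        }
        last_frame = frame;
        holds++;
    }
    printf("... refreshes per frame\n");
}

int main(void)
{
    cadence(25.0, 75.0, 30); /* even cadence: 3 3 3 ... */
    cadence(35.0, 75.0, 30); /* uneven cadence: mostly 2s with occasional 3s */
    return 0;
}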

I would think more photorealistic games would likely be able to get away with 30 fps on regular computer monitors (but I cannot back that up).