Technology Stocks : Advanced Micro Devices - Moderated (AMD)


To: andreas_wonisch who wrote (56752), 9/30/2001 3:01:03 PM
From: fyodor_
 
Andreas: I believe the limit is at roughly 100 fps. Just set your monitor refresh rate to 60 Hz and compare it with 100 Hz. You'll notice a huge difference. But between 100 Hz and 120 Hz it seems about the same (at least for me); neither image appears to flicker anymore. So the limit for the human eye is probably in that range.

Well, it's actually a very complicated issue, and while I've had some biophysics and neuroscience, I don't pretend to fully understand it.

Two main points, though: a) The eye has two main receptor types. One is good at distinguishing colors, the other at detecting movement. The latter are located primarily in the peripheral regions of the eye, which is why you can more easily detect flickering if you don't look directly at the monitor but rather off to one side. b) The screen is essentially a static image. The parts of the brain that deal with input from the optical receptors are very adept at resolving movement (among other things, the brain essentially does an XOR of successive images).
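
To make that "XOR" idea concrete, here's a toy Python sketch (purely illustrative, not a model of actual neural processing, and the frames are made up): differencing two successive frames makes the changed pixels pop out while static regions cancel.

```python
# Toy illustration: compare successive frames the way the text loosely
# describes the brain "XOR-ing" images; changes stand out, static cancels.

def frame_diff(prev, curr):
    """Return a map of which pixels changed between two frames."""
    return [[1 if a != b else 0 for a, b in zip(row_p, row_c)]
            for row_p, row_c in zip(prev, curr)]

frame_a = [[0, 0, 0],
           [0, 1, 0],
           [0, 0, 0]]
frame_b = [[0, 0, 0],
           [0, 0, 1],   # the bright pixel moved one step to the right
           [0, 0, 0]]

for row in frame_diff(frame_a, frame_b):
    print(row)   # non-zero entries mark exactly where something changed
```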

There are several games that actually report three different frame rates: minimum, maximum, and average. Unfortunately, most hardware sites use only the average frame rate in their benchmark tests.

I seem to recall some benchmarks on Quake (1 or 2) a while ago (on Tom's Hardware, I believe) where the whole frame-rate trace through the demo was shown. It was quite informative, in that the two video boards tested (one NVIDIA, one 3dfx, IIRC) displayed roughly the same average frame rate but vastly different characteristics during "dips" (again, IIRC, the 3dfx board did better during the dips).
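
Here's a quick Python sketch of why the average hides the dips (the frame-time traces are made-up numbers, not real benchmark data): two boards with comparable averages can have very different worst-case behavior.

```python
# Two hypothetical frame-time traces (milliseconds per frame) with
# comparable average fps but very different minimums ("dips").

def stats(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps), max(fps)

board_a = [12, 13, 12, 50, 12, 13, 55, 12]   # fast on average, deep dips
board_b = [16, 17, 15, 16, 17, 15, 16, 16]   # steady, no dips

for name, trace in [("A", board_a), ("B", board_b)]:
    lo, avg, hi = stats(trace)
    print(f"board {name}: min {lo:.0f} / avg {avg:.0f} / max {hi:.0f} fps")
# Board A averages ~66 fps but drops to ~18 fps; board B averages ~63 fps
# and never falls below ~59 fps. Average alone makes them look equivalent.
```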

BTW, another point: any frame rate above the refresh rate of your monitor (i.e. usually everything above 100 fps) can't be displayed anyway, so it doesn't make much sense to compare 200 fps to 250 fps. If you use a TFT display it's even worse (usually they can only be refreshed between 20 and 30 times per second).

I thought they had come a long way in solving the TFT refresh issue, no?

For example,

- Screen Performance
  Brightness : 170 cd/m2 (Typ.)
  Contrast Ratio : 200:1 (Typ.)
  Response Time : < 50 ms
  Viewing Angle : Horizontal 160 degrees; Vertical 160 degrees
- Scanning
  H-Frequency : 30-80 kHz
  V-Frequency : 56-120 Hz
  Maximum Resolution : 1280x1024 @ 75 Hz

(source: lge.com)
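
A quick back-of-the-envelope check on those two spec lines, as a Python sketch (simplified: it treats "response time" as one complete pixel transition, which glosses over real LCD behavior):

```python
# The panel accepts an input signal at 56-120 Hz, but the response time
# caps how often a pixel can fully change state.

response_time_s = 0.050                       # "Response Time : < 50 ms"
max_full_transitions = 1.0 / response_time_s  # complete changes per second

print("signal refresh accepted: 56-120 Hz")
print(f"full pixel transitions:  ~{max_full_transitions:.0f} per second")
# ~20 full transitions per second, which is the kind of number behind the
# "20 to 30 times per second" claim, even though the panel accepts a much
# faster input signal.
```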

-fyo



To: andreas_wonisch who wrote (56752), 10/2/2001 3:05:42 AM
From: Joe NYC
 
Andreas,

Good discussion.

I believe the limit is at roughly 100 fps. Just set your monitor refresh rate to 60 Hz and compare it with 100 Hz.

The frame rate that guys like Tom and Anand measure is, I believe, the rate at which data gets written to the video card's frame buffer (on average, as you point out). On the other side, this data gets read by the RAMDAC, which generates the analog output from the video card to the analog monitor.

The ideal situation would be if the graphics card could generate data at the same frame rate as the refresh rate of the monitor, never dropping below that floor. You don't really care about any frame rate higher than that, since the extra information isn't going anywhere; it stays in the video card. If the frame rate is 150 fps and the monitor refresh rate is only 75 Hz, every other frame goes basically nowhere. It never leaves the video card.
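
A small Python sketch of that point (idealized timing, no vsync, made-up numbers): at each refresh, the scanout reads whatever the most recently completed frame is, so at 150 fps into a 75 Hz monitor roughly every other rendered frame is never displayed.

```python
# Simulate a renderer at 150 fps feeding a monitor scanning out at 75 Hz.

render_fps, refresh_hz, seconds = 150, 75, 1

displayed = set()
for tick in range(refresh_hz * seconds):
    t = tick / refresh_hz                 # time of this refresh
    latest_frame = int(t * render_fps)    # last frame finished by time t
    displayed.add(latest_frame)           # the frame the RAMDAC scans out

rendered = render_fps * seconds
print(f"rendered {rendered} frames, displayed {len(displayed)} "
      f"({rendered - len(displayed)} never left the card)")
# -> rendered 150 frames, displayed 75 (75 never left the card)
```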

Your comparison of different monitor refresh rates, and your perception of them, is I think a different argument; it comes down to how the human eye reacts to an image being drawn on a CRT at various refresh rates. The human eye is not too crazy about anything below 70 Hz. You can have a monitor with a refresh rate of 75 Hz (which will not bother your eyes) displaying content at a frame rate of 25 fps, which you may perceive as sluggish, but which still won't hurt your eyes.

If you use a TFT display it's even worse (usually they can only be refreshed between 20 and 30 times per second).

I was under the impression that the common refresh rate was 60 Hz. But the way a TFT generates an image is different: each pixel emits persistent light. On a CRT, by contrast, the spot that represents a pixel is at its maximum intensity when the ray hits it and then decays over the next 1/60 of a second (in the case of 60 Hz), which basically produces blinking at a very fast pace. The faster the blinking, the more the eye and the brain are fooled into thinking they are looking at a persistent light.
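
A toy Python model of that difference (the decay constant is an assumption for illustration, not real phosphor physics): a CRT pixel is restruck every refresh and fades in between, while a TFT pixel holds its level.

```python
import math

refresh_hz = 60
decay_tau = 0.004     # assumed phosphor decay time constant, in seconds

def crt_intensity(t):
    """Intensity at time t: peaks at each strike, decays until the next."""
    since_strike = t % (1.0 / refresh_hz)
    return math.exp(-since_strike / decay_tau)

def tft_intensity(t):
    return 1.0        # persistent light until the pixel is re-addressed

for ms in (0, 4, 8, 12, 16):
    t = ms / 1000.0
    print(f"t={ms:2d} ms  CRT={crt_intensity(t):.2f}  TFT={tft_intensity(t):.2f}")
# The CRT trace spikes and fades every 1/60 s (the fast "blinking"), while
# the TFT trace stays flat, which is why flicker is a CRT phenomenon.
```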

Joe