Technology Stocks : Advanced Micro Devices - Moderated (AMD)

To: Robert Salasidis who wrote (56733) - 9/30/2001 11:12:44 AM
From: fyodor_
 
Robert: If you are getting 35 fps, that is already better than what you see on TV (NTSC runs at about 30 fps) or in the theatre (film is 24 fps).

Movie frames are manipulated (motion blur) to make the motion seem smooth. It's well documented that the human eye can easily distinguish between 50 fps and 100 fps, and probably much higher as well (although I don't have the documentation on hand to back that up).
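To illustrate the blur trick - and this is just a toy sketch of my own, not how any film camera or game engine actually does it, with every name and number made up - you can blend several sub-frame samples into each output frame, the way a film camera's open shutter smears motion across the exposure:

import numpy as np

def render_position(t):
    # hypothetical "scene": an object moving at a constant 100 units/sec
    return 100.0 * t

def blurred_frames(fps=24, shutter=0.5, samples=8, duration=0.25):
    # Average `samples` positions across the open-shutter part of each
    # frame. shutter=0.5 is roughly the classic 180-degree film shutter.
    frame_time = 1.0 / fps
    frames = []
    for i in range(int(duration * fps)):
        start = i * frame_time
        ts = start + np.linspace(0.0, shutter * frame_time, samples)
        frames.append(render_position(ts).mean())  # blend the sub-samples
    return frames

print(blurred_frames())

Each output value is a smear of where the object was while the "shutter" was open, which is why 24 fps film doesn't look like a slideshow the way 24 fps of razor-sharp game frames would.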

The extra horsepower does let you sustain higher frame rates at higher resolutions, but the original point was that 150 vs. 175 fps is irrelevant - which is true.

The problem here is that the "150 fps" or "175 fps" figures are averages. What a gamer, casual or hardcore, cares about is not the average frame rate. As you point out, it doesn't matter much whether the instantaneous frame rate is 150 or 175 fps, but it matters a great deal what the frame rate is during the inevitable "dips" (where there is plenty of action: lots of AI, smoke, lights, particles, or whatever).

Since no one supplies a trace of the frame rate during a benchmark, the only number we have is the average frame rate.
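If we did have such a trace, the point would be trivial to show. Here's a toy example (every number invented) of how a benchmark can advertise a great average while the dips sit at 30 fps:

import numpy as np

# 1000 hypothetical frame times in milliseconds: mostly smooth,
# with one brief 20-frame dip during the heavy action
frame_times_ms = np.concatenate([
    np.full(980, 1000 / 160),  # smooth stretches: ~160 fps
    np.full(20, 1000 / 30),    # the firefight dip: ~30 fps
])

avg_fps = 1000 / frame_times_ms.mean()
worst_fps = 1000 / np.percentile(frame_times_ms, 99)  # slowest 1% of frames

print("average fps: %.0f" % avg_fps)      # the number the review quotes
print("worst 1%% fps: %.0f" % worst_fps)  # the number you actually feel

The average still comes out near 147 fps, yet every frame you would actually notice ran at 30.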

-fyo