Politics : Formerly About Applied Materials


To: Robert O who wrote (62642) | 4/4/2002 12:18:55 PM
From: mitch-c
 
OT - off the beaten path, but still chip-related - video chips/companies/opinions

I haven't looked much at the business models or stock action behind video chip companies - instead, I'm an end-user/specifier who has to fix what I choose to use. From that perspective, I think the NVIDIA chips provide much better support to high-end graphics packages, and stick much closer to the "standards" (DirectX support, coprocessing, etc.) than others. NVIDIA-based cards have become my preferred choice, for now.

For your "integrated video home" scenario, the computer video alone will probably not make a significant difference to the picture *quality* ... that will depend more on monitor resolution and camera resolution. But - stable, standard video drivers WILL ease the process of hooking all that stuff together and passing the pictures (files) back and forth in a viewable form.

Deconstructing advertiser-speak:
"High speed performance":: faster refresh rates across the memory space for complex (calculated or rendered) images. Smoother video motion; fast frame rates.
"Richer colors":: more memory for greater color depth. (16-bit, 24-bit, or 32-bit colors at higher resolutions).
"Greater resolution":: more memory to address more dots on the screen. (1024x768, 1280x1024, 2048x1536, and so on).
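The memory arithmetic behind those last two claims is simple enough to sketch: one frame takes width x height x (bits per pixel / 8) bytes of video memory, so deeper color at higher resolution multiplies quickly. A hypothetical helper (my own illustration, not from any card's spec sheet):

```python
def framebuffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Bytes one frame occupies at a given resolution and color depth."""
    return width * height * bits_per_pixel // 8

# 1024x768 at 16-bit color is about 1.5 MB per frame...
assert framebuffer_bytes(1024, 768, 16) == 1_572_864

# ...while 1280x1024 at 32-bit color needs 5 MB per frame, which is why
# an 8 MB card runs out of headroom for double buffering at that setting.
assert framebuffer_bytes(1280, 1024, 32) == 5_242_880

# "High speed performance" scales the same way: redrawing that 5 MB
# frame 85 times a second means moving roughly 445 MB/s through memory.
assert framebuffer_bytes(1280, 1024, 32) * 85 == 445_644_800
```

Rough numbers, but they show why the three marketing bullets above are really all the same resource (memory size and memory bandwidth) seen from different angles.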

Nowadays, each of these is likely to be limited more by your monitor than by your video card. Bob's choice of (twin!) flat-panel LCDs is excellent (drool ... <g>) - as a rule of thumb, I figure a given size of LCD has as sharp a picture as the next larger CRT. (15" LCD == 17" CRT).

You always need to be wary of Intel as a gorilla - but the math and techniques behind video processing (I did a little grad work there) are fairly arcane, and optimizing them can get to be more art than science. I think that ceding the low end in favor of the high end is a workable strategy - after all, isn't that what Intel does with CPUs?

- Mitch