Andreas,
<< In the not too distant future CPU performance won't matter any more to all those gaming freaks and they will buy cheaper solutions (i.e. AMD). Probably only designers (CAD, graphics) or scientist will need those high end CPU then>>
I'd like to respectfully disagree on this one. I've heard the opinion that applications (including games) will take advantage of whatever technology is available, so the argument that we'll reach a point of MHz 'excess' doesn't necessarily hold. Several similar examples come to mind:

One from the past - I remember when the Apple II first came out with a disk drive attached, and its 5 1/4" floppy disks held about 140k of information. Everyone wondered why in hell one would *ever* need more than one floppy disk per system, since most programs were measured in dozens or perhaps hundreds of lines of code. A 10 gig hard drive is now considered cramped.

One from the present - while your friend may be correct that anything above a 400 MHz system with a good video card makes little practical difference in frame rates at standard resolutions (e.g., 640x480), the 'bleeding edge' that most gamers are now into is running games at 1280x1024 or 1600x1200 with all the 'options' on (e.g., volumetric fog, Z-buffering, etc.), and while I'm anything but a techie, I would think that at such resolutions "MHz Matters" (TM jamok99) in terms of CPU speed and frame rates, not just how fast the processor or memory on the video card is. (Again, I could be wrong - a tech whiz I'm not, admittedly. There's a toy sketch of this bottleneck idea at the end of this post.)

One from the future - when we're playing games where one gets to fight Obi-Wan Kenobi (or whatever the current cultural movie icon is in that year) in 'Holographic 3D VidSpace' (TM Nvidia Corp. - 2011 A.D.) in your living room, CPU speed might have something to do with bringing that experience to you <g>
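For the curious, here's the toy back-of-envelope model I promised above: whichever of the CPU (which did the transform and lighting work in most engines before hardware T&L cards) or the video card (which fills the pixels) finishes its share of a frame last caps the frame rate. Every constant below is invented purely for illustration - these are not real measurements of any actual system.

# Toy frame-time model: the slower of CPU geometry work and video card
# pixel fill sets the frame rate. All constants are made up.

def fps(cpu_mhz, polygons, width, height, fill_mpix_per_s):
    cpu_ms = polygons * 200 / (cpu_mhz * 1e6) * 1000          # assume ~200 cycles per polygon (invented)
    gpu_ms = width * height / (fill_mpix_per_s * 1e6) * 1000  # one full-screen fill, ignoring overdraw
    return 1000 / max(cpu_ms, gpu_ms)                         # the slower side caps the frame rate

for mhz in (400, 800):
    for w, h in ((640, 480), (1600, 1200)):
        print(f"{mhz} MHz @ {w}x{h}: ~{fps(mhz, 50000, w, h, 100):.0f} fps")

In that toy model, doubling the CPU from 400 to 800 MHz doubles the frame rate at 640x480 (the CPU is the bottleneck there), while at 1600x1200 the card's fill rate starts to cap things - so whether MHz 'matters' depends on where the bottleneck sits for a given game, scene, and resolution.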