To: keithsha who wrote (46092) 6/7/2000 7:03:00 PM From: SunSpot
Yes, 4000 Linux machines doing one thing: providing the google.com search engine. Soon it will probably be 6000 Linux machines. This is not number crunching but data processing and databases. And it's a real-world commercial solution, not just a sponsored demo like Microsoft's TerraServer (...and Microsoft Research needed a large database to demonstrate the capabilities of its new database software).

I agree that supercomputing is about hardware. But still, the fact that Linux is heavily used in supercomputing and Windows isn't shows "headroom" in this class of applications.

If you want to talk about headroom, then maybe we should talk about the possibilities of scaling a system. A typical Linux tool is 20 KB to 300 KB. I just wrote a network monitoring program that occupies 6,804 bytes. In a server with 32 MB of RAM, this program would fit 4,931 times. So with 32 MB of RAM, we've got ourselves a pretty big server with plenty of headroom. Windows 2000 Server doesn't even run well on 128 MB of RAM.

If memory or CPU isn't enough on a Linux system, move the entire solution to a 64-bit Alpha or an IBM S/390. Windows doesn't do mainframes, and even NT on Alpha processors is 32-bit. Do you still think Windows has more headroom? It only does when you limit yourself to certain applications and examine only 32-bit Intel solutions.

I cannot take seriously comparisons between an NT system and a Linux system with the same amount of memory and the same processor. They should compare two computer systems with the same TCO, or the fastest solution each system can provide, and choose a realistic use for the servers, not a specific piece of software. I haven't seen any benchmarks of Linux on 64-bit Alpha, the upcoming Itanium, or the IBM S/390 yet, but I think the figures will be pretty good.
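The headroom arithmetic above is just a division; here is a quick sketch of it (the program size and RAM figure are the post's own numbers, and the calculation ignores kernel and OS overhead, which would reduce the real count):

```python
# Back-of-envelope headroom arithmetic: how many copies of a small
# tool fit in a server's RAM. Figures taken from the post above.
PROGRAM_SIZE = 6804             # bytes: the network monitoring program
SERVER_RAM = 32 * 1024 * 1024   # bytes: a 32 MB server

copies = SERVER_RAM // PROGRAM_SIZE  # integer division: whole copies only
print(copies)  # → 4931
```

Of course no one runs thousands of copies of one monitor; the point is only that the working set of a typical Linux tool is a tiny fraction of even a modest server's memory.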