How do Linux and Windows NT measure up in real life?
gnet.dhs.org
Uptime and downtime
And then there's the crucial factor of availability. To be accepted as an operating system for enterprise use, the downtime caused by the OS must be minimal. That means first of all that the OS itself should be exceptionally stable, and here Linux is miles ahead of NT, whose frequent breakdowns are almost legendary; they stem from memory problems, file management, or "occasional" faults that are almost impossible to trace. One of Linux's strongest points is that after problems, reconfiguration, or software installation you usually do not have to restart the computer, unlike with Windows NT. The same is true for preventive maintenance.

Bloor Research ran both operating systems on relatively old Pentium machines. In the space of one year, Linux crashed once because of a hardware fault (disk problems), which took 4 hours to fix, giving it a measured availability of 99.95 percent. Windows NT crashed 68 times, caused by hardware (disk) problems, memory problems (26 times), file management (8 times), and a number of odd problems (33 times). All this took 65 hours to fix, giving an availability of 99.26 percent. The clear winner here is Linux.
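The availability percentages quoted above can be reproduced with simple arithmetic. A minimal sketch, assuming Bloor Research measured over a standard 365-day (8,760-hour) year; the downtime figures are the ones from the article:

```python
# Availability = fraction of the year the system was up, as a percentage.
# Assumes a 365-day year; repair times (4 h and 65 h) are from the article.
HOURS_PER_YEAR = 365 * 24  # 8760

def availability(downtime_hours: float) -> float:
    """Percentage of the year the system was available."""
    return 100.0 * (HOURS_PER_YEAR - downtime_hours) / HOURS_PER_YEAR

print(f"Linux:      {availability(4):.2f}%")   # 4 hours of downtime
print(f"Windows NT: {availability(65):.2f}%")  # 65 hours of downtime
```

Running this yields 99.95% for Linux and 99.26% for NT, matching the figures in the study.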
Thanks to slashdot.org