James, are you an engineer, or just pretending to be one? Wait - don't answer...
"Memory defragmentation" is no reason to reboot any OS. All modern OSs allocate memory in page-size increments. Allocation of smaller units "as small as a few bytes" is typically handled by the compiler's run-time support routines, and is allocated within each application out of pages obtained from the OS.
When an application asks for a contiguous region longer than one page, the OS can build it out of as many discontiguous physical pages as need be, simply by writing the page table: the pages appear contiguous in the application's address space no matter where they sit physically. There is nothing to gain through physical defragmentation, because there is no additional overhead associated with the use of discontiguous pages.
It is NOT "increasingly difficult for the OS to allocate contiguous chunks for large requests", because it is never necessary to do so. DIS-contiguous pages work just fine. The OS MAKES them contiguous, without having to move a byte, with a bit of help from the hardware (the MMU). Some OSs have been doing this for, oh, 30 years or so. (IBM)
(OK - I lied. In NT, at least, it IS necessary to allocate certain types of memory - primarily buffers for certain types of DMA - in physically contiguous areas. The OS normally allocates this kind of memory only at boot time, although it IS possible for drivers to request it later.)
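For reference, the NT kernel-mode call in question is MmAllocateContiguousMemory. A driver-side sketch (the 16 MB limit is just an illustrative ISA-style constraint, not anything from the posts above):

/* Kernel-mode only: compiles against the Windows DDK/WDK, not as user code. */
#include <ntddk.h>

/* A driver asking for a physically contiguous DMA buffer. This can fail
   at run time if no contiguous physical run is free, which is why such
   memory is best allocated at boot / driver load. */
PVOID AllocateDmaBuffer(SIZE_T bytes) {
    PHYSICAL_ADDRESS highest;
    highest.QuadPart = 0x00FFFFFF;  /* illustrative 16 MB address limit */
    return MmAllocateContiguousMemory(bytes, highest);  /* NULL on failure */
}

VOID FreeDmaBuffer(PVOID buf) {
    if (buf != NULL)
        MmFreeContiguousMemory(buf);
}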
FWIW, I run both Windows NT and Solaris-X86 side-by-side. NT does require infrequent reboots, at least compared to Windows 95/98, but it does require a reboot for seemingly trivial software installations. Software installations under Solaris very seldom require a reboot - at least 10 times less often than under NT.
My NT system needs to be rebooted, for one reason or another, on average once a week. I don't recall ever HAVING to reboot Solaris-X86. (I have only rebooted it to install hardware.)