Pastimes : Computer Learning


To: Patricia Meaney who wrote (26263) | 3/19/2002 12:54:15 PM
From: Ron
 
Depends on your need for speed... I am running a couple of trading computers with high-speed connections and do a lot of extremely fast day trading. Defrag makes a definite difference. I normally defrag once every couple of weeks. And yes, I have Norton SystemWorks, and agree Speed Disk is faster. But on the one computer where I was having the slow defrag, Norton was not installed. It will be soon, however :)
If I did not mess around so much with MP3 music files, I probably would not need to defrag as often, but one has priorities, you know .....



To: Patricia Meaney who wrote (26263) | 3/19/2002 1:01:46 PM
From: CharleyMike
 
If it's all you got, run it often.

Norton SystemWorks does a much better job with Speed Disk, plus you get WinDoctor, an antivirus app, and more than most people will ever use.



To: Patricia Meaney who wrote (26263) | 3/19/2002 3:01:31 PM
From: thecow
 
Patricia

From what I read, the current "in vogue" advice is to defrag less often than was once suggested and to run ScanDisk more. I have seen in several places that a ScanDisk once a week and a defrag every month or two is sufficient on Windows 98 machines. Here's a good explanation of defrag from PMS Witch a while back.

"To: PMS Witch who wrote (16578)
From: PMS Witch | Tuesday, Feb 20, 2001 11:57 AM

Disk fragmentation …
Recently, on this thread, a few participants have mentioned various levels of disk fragmentation. The posts hit that magic number where my interest was stimulated enough to look at the matter further, yet not so many that I became bored with the subject.

First example…

I have a partition devoted to storing images of my system. Each image is roughly 150meg. For discussion simplicity, I’ll claim I have ten. When checking this partition, Norton reports zero fragmentation. This is no surprise since these files are never deleted; thus, no holes ever appear for later filling with file fragments. I consider zero fragmentation the simplest case: all files stored in contiguous disk space. Simple!

Now, let’s try something. I wipe the partition with the images, create a 75meg file, and store it at the beginning. Next, I store a tiny text file, say a grocery list. Following this, I delete the 75meg file and store a 150meg image. Now I have my image split in half on the disk since the first 75meg occupies the space vacated by my 75meg junk, the tiny file must be written around, and the remaining 75 meg of my image gets written beyond this tiny file. I continue this process until I have ten images, all split in the middle. Then I delete my tiny files. I’m again left with ten huge files.
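The delete-and-rewrite sequence just described can be simulated with a toy first-fit allocator. This is only a sketch: the 12-cluster "disk", the file names, and first-fit placement are all invented for illustration, and real FAT allocation is more involved.

```python
# Toy first-fit cluster allocator. '.' marks a free cluster;
# each occupied cluster holds the one-letter name of its file.
def allocate(disk, name, n):
    """Place n clusters of file `name` into free slots, first-fit,
    splitting across occupied runs as needed."""
    placed = 0
    for i, slot in enumerate(disk):
        if slot == '.' and placed < n:
            disk[i] = name
            placed += 1

def free(disk, name):
    """Delete a file, leaving a hole for later writes to fill."""
    for i, slot in enumerate(disk):
        if slot == name:
            disk[i] = '.'

disk = ['.'] * 12
allocate(disk, 'J', 4)   # the 75meg junk file (4 clusters, to scale)
allocate(disk, 't', 1)   # the tiny grocery list
free(disk, 'J')          # delete the junk file
allocate(disk, 'I', 8)   # the 150meg image: fills the hole, hops 't'
print(''.join(disk))     # -> IIIItIIII...
```

The image ends up in two 4-cluster extents on either side of the tiny file, exactly the split described above.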

This time, when I check fragmentation, each and every file on this disk is split, so the disk gets reported as being 100% fragmented. However, the disk contains 20 chunks of contiguous data of 75meg each. By all practical standards, this disk has been written very efficiently, but I wouldn’t know it from the report. Running the defragging software against this disk will cause almost 1.5 gig of data to be moved for a profoundly tiny improvement in storage efficiency.

Next example…

Some people save all e-mail. (A practice Microsoft probably wishes their employees hadn’t followed!) If one is sufficiently popular or important, they’ll eventually get a huge pile. For discussion, let’s assume 100,000 messages are saved on disk in separate files. Further, let’s assume each is less than the minimum cluster size for that disk (mine is 4K), which will mean that each and every one of those 100,000 files will be viewed as stored without fragmentation. Lastly, assume the system has 1,000 other files (mine has 1,500) and that the disk has never been defragmented. Obviously, the system files on this disk will be a real mess, scattered into all parts of the disk, and under normal inspection likely to report levels of fragmentation that qualify for the record books. But when the fragmentation report gets generated, the system counts 1,000 files in a mess and 100,000 files pristine. The report indicates 1% fragmentation. The user concludes her disk is in excellent shape.
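Both distortions (the split images reported as 100% fragmented, the mail pile reported as 1%) fall out of the binary per-file count. A toy model makes this plain; here each file is just a list of its fragment sizes in clusters, and the numbers come from the two examples above (the 50-piece system files are an invented figure for illustration).

```python
def naive_fragmentation(files):
    """Fraction of files with more than one fragment -- the binary
    'fragmented or not' report criticized above."""
    fragmented = sum(1 for frags in files if len(frags) > 1)
    return fragmented / len(files)

# First example: ten ~150meg images (38,400 4K clusters each),
# every one split exactly once down the middle.
images = [[19200, 19200]] * 10
print(f"{naive_fragmentation(images):.0%}")          # -> 100%

# Second example: 100,000 one-cluster messages plus 1,000 system
# files, each shattered into 50 pieces.
mail = [[1]] * 100_000
system = [[1] * 50] * 1_000
print(f"{naive_fragmentation(mail + system):.1%}")   # -> 1.0%
```

The "very efficient" disk scores 100% and the genuinely messy one scores 1%, which is the whole complaint.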

Next example…

Jumping back to the big files with splits in the middle, we see that one break causes the file to register as fragmented. But what if the file had 10 breaks, 100 breaks, or 1,000? Clearly, there are different levels of fragmentation, but currently they are measured in a binary manner: contiguous or fragmented. Also, as in my first example, the break was a one-cluster interruption. The disk would continue to spin, the head would remain on the same cylinder, and the loss would be the latency of one read. The impact would be minimal. If the data were spread all over the disk’s many surfaces, on various cylinders and sectors, the performance compromise would be severe as many seek times were added to many latencies. Yet in both cases, we’d have one file marked as fragmented, and this one file would be included in the reported statistics identically, no matter what its impact on storage efficiency.

Next example…

My disk images are written once, and with luck never used. If I need to restore my system a number of times from the same image, I’ve real problems. In short, the data just sits there and does nothing. Compare this with a program such as Excel on an accountant’s machine. (Excel, for those who may not know, is a spreadsheet program. It displays rows and columns of cells on the screen. Users type formulas into these cells, and Excel returns beeps and error messages.) The Excel program file is 5meg. Our accountant friend may execute this program over 100 times per day. Clearly, any fragmentation in this file would be 100 times more serious than in a relatively unused file. Again, this fragmentation would be reported as yes or no, with no weight given to its impact.

A suggestion…

I think fragmentation should be reported as one minus the ratio of contiguous clusters used by files to the total number of clusters used by those same files. In short: F = 1 − CC / TC, or as a percentage, F = (1 − CC / TC) × 100. Reporting this value would give users a clearer understanding of just how efficiently or poorly their data is stored on their disk. (This wouldn’t help our accountant friend, because file use is disregarded.) Although Win98 tracks application use, as far as I know it doesn’t track the use of individual files beyond last access date and time. This lack of data would make implementing any algorithm that considers file use difficult; therefore, I’ll limit my wishes to the possible.
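The proposed metric can be sketched in a few lines. One plausible reading of "contiguous clusters" (the post doesn't pin it down, so this is an assumption): every break in a file costs one contiguous cluster, so CC = total clusters minus total breaks. A fully contiguous file then scores 0, and a file shattered into single clusters scores near 1.

```python
def fragmentation(files):
    """Proposed metric F = 1 - CC / TC, where each inner list holds
    the fragment sizes (in clusters) of one file.

    Assumed reading: CC = total clusters - total breaks, with one
    break per fragment beyond a file's first."""
    tc = sum(sum(frags) for frags in files)          # total clusters used
    breaks = sum(len(frags) - 1 for frags in files)  # one per extra fragment
    cc = tc - breaks
    return 1 - cc / tc

# First example again: ten images of 38,400 clusters, each split once.
images = [[19200, 19200]] * 10
print(f"{fragmentation(images):.5%}")   # -> 0.00260%
```

Ten breaks in 384,000 clusters comes out as roughly 0.003% under this metric, versus the 100% that the binary per-file count reports, which matches the argument that this disk is written very efficiently.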

Since the percentages reported are inconsistent with storage efficiency anyway, we may be better off if we treat them sceptically, or at least with less concern. I would suggest that when users begin to notice increases in disk activity for routine work, or what sounds like additional clicks as the disk heads flutter back and forth, it may be time to defrag. If the system is performing its work in a reasonably acceptable manner, disk defragging efforts could be more profitably applied elsewhere. I’m convinced that defragging disks has consumed far more time and effort than it’s ever saved once the work has been completed.

Having said this, I still defrag, just less often.

Cheers, PW."

(Edited in from a different PMW post: "...The change-hating, technology-fearing Luddite")



To: Patricia Meaney who wrote (26263) | 3/19/2002 3:24:35 PM
From: thecow
 
Patricia

From Smart Computing magazine.

Another fast, free method for keeping your hard drive running smoothly is to defrag it regularly. As programs are added to and removed from the computer, bits of files get scattered all over the surface of the hard drive. If the drive heads that read data have to move all over looking for the scattered data, programs begin to stutter and run more slowly. The solution is to pack all of the stray data into one contiguous area of the hard drive so the drive heads don’t have to move as much. This is what defragging does. Windows 98 users get the added benefit of Intel optimization software that intelligently places the files you access most often close together and also can keep most of the data for a program, such as all of the files required by Adobe Photoshop, close together.

[Caption: Here we’re setting up a permanent 500MB swap file on our second hard drive to help increase the speed and efficiency of the primary drive.]
Windows 95/98 (Win9x) comes with a built-in defragging tool, which you can launch by clicking Start, Programs, Accessories, System Tools, and Disk Defragmenter. A drop-down list lets you select which drives you want to defrag, and you can click the Settings button to access some advanced options. The Settings page is where you enable the Intel software in Win98, and you can also have the Disk Defragmenter look for hard drive errors. We don’t recommend this latter option unless your hard drive is crashing unexpectedly, as it adds a lot of time to the defrag process.

Click OK to apply the settings but don’t click OK in the Select Drive window yet. First, shut down all of the extra programs that are running on your PC, including your Internet connection and antivirus software. If there are any icons showing in the System Tray (except the yellow speaker icon), you’ll need to close the programs associated with them before running the Disk Defragmenter. In some cases you may need to press CTRL-ALT-DELETE one time only and manually select a program by clicking its name in the Close Program list and clicking End Task. If you are unable to close all of your other programs, they may write new data to the hard drive and force the defragmenter to stop and start over from the beginning.

[Caption: It’s important to tell Windows what type of computer it’s running on if you want to keep your hard drive running efficiently.]
Even if you can’t shut down all of the other programs, click OK to run the Disk Defragmenter. You can watch the program at work by clicking the Show Details button, but don’t launch any other programs while the defragmenter is running. Wait until the process is completely finished before rebooting the computer or doing any other work. We recommend running the Disk Defragmenter every month or so.



To: Patricia Meaney who wrote (26263) | 3/19/2002 5:05:22 PM
From: Eric L
 
re: 3-0 for frequent defrag and OnTrack stuff

<< Ok, so we have one vote for defrag and one vote who says it's useless.>>

Actually we have two for defragging frequently. Three counting me.

Zardoz said:

Defrag is really useless...

Then he said

Go spend a few dollars, buy norton system works, and run speedisk

Norton SpeedDisk is a defragmenting disk-optimization utility available in Norton SystemWorks (and Norton Utilities). It is what I use.

I also have another very good utilities suite from OnTrack (SystemSuite 4.0) that is very comprehensive (more so than SystemWorks), very good (it includes a decent firewall), and slightly less expensive than Norton SystemWorks. Its Jet Defrag is faster than SpeedDisk. One knock on SystemSuite 4.0 has been the infrequency of its anti-virus updates. It has a live-update feature similar to Norton's, and I have noticed they are updating more frequently lately (I still prefer and use Norton AV).

SystemSuite 4.0 also includes a great, powerful file manager called PowerDesk 4.0 (5.0 was just released). It is similar to the late, great File Manager that Central Point used to ship in PC Tools, which then morphed into Norton Navigator and was then (unfortunately) abandoned.

OnTrack has a fully functional freeware version of PowerDesk 4.0 (without file viewers) that somebody recently mentioned. It is a great piece of freeware.

ontrack.com

One idiosyncrasy of SystemSuite 4.0's one-button checkup (for me, at least, on XP Pro) is that it whacks my Norton AV (requiring a complete reload) and also whacks something in Word for Office XP (requiring a short reinstall). I haven't isolated which component is doing that. One of these days I'll troubleshoot it or just fire off a note to OnTrack support.

Zardoz's advice about creating a separate small partition for virtual memory is very sound for Win95/98, but XP handles virtual memory slightly differently.

- Eric -