SOURCE: Bruce Boyers Marketing Services

November 17, 2009 17:22 ET

Fragmentation Solutions: Is It Time for Prevention?

BURBANK, CA--(Marketwire - November 17, 2009) - File fragmentation has plagued computer system administrators since the advent of modern computing. It's a blessing and a curse: while fragmentation makes more efficient use of disk space, it also causes major performance problems and takes a serious toll on hardware life. Hence, it has always been an issue requiring attention.

Originally, defrag was handled manually, by backing up and restoring each drive. That meant many overtime hours for system personnel, since the computer had to be offline the entire time.

Next came manual defragmentation utilities, which at least meant that backup-and-restore could be eliminated except where it was actually needed for other reasons. The defrag still had to be run by an operator, though, which still meant nights and weekends spent in the computer room instead of at home with family or out with friends.

Scheduled defragmentation was a considerable breakthrough. The system administrator could schedule defrag so that it would run while he or she was gone -- and nobody had to stay behind to make sure it ran.

As computing became central to business, and as business became global with the Internet, scheduled defragmentation became outmoded. So many systems had to remain up and running 24x7 that finding windows in which a system had few to no processes running was a real problem. Finding and maintaining those schedules required nearly as much overtime as manual defrag had in the past. Some sites even reverted to manual defrag simply because regular schedules couldn't be set.

Fully automatic defragmentation made it possible for defrag to occur invisibly, in the background, so that scheduling no longer had to be done. Defrag itself no longer caused a performance hit, and system performance and reliability were consistently maintained.

But technology marches on, and things evolve. Which raises the question: what's next?

The next logical step in defragmentation would not be defrag at all; it would be the prevention of fragmentation before it actually occurs.

By the time fragmentation happens, the system has already wasted I/O resources by writing the pieces of files to scattered spaces on the disk, so a toll is being taken even before a defragmenter goes to work. Hence, it would be highly beneficial to develop a method by which files could be written contiguously (as close to being in one piece as possible) right at the outset, without itself taxing system resources.
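The release doesn't describe how such a method would work, but applications can already approximate the idea in a limited way. As a minimal sketch, assuming a POSIX system and a program that knows a file's final size in advance, the standard posix_fallocate() call asks the filesystem to reserve the file's whole extent before any data is written, giving the allocator a chance to place it contiguously. The file name and size here are hypothetical, and contiguity is not guaranteed; it depends on the filesystem and the free-space layout.

/* Sketch: preallocate a file's full extent before writing, so the
 * filesystem allocator can try to place it contiguously rather than
 * growing the file piecemeal with each write (which is what scatters
 * it across the disk). Name and size below are illustrative only. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    const off_t file_size = 64L * 1024 * 1024; /* final size known up front */

    int fd = open("output.dat", O_CREAT | O_WRONLY | O_TRUNC, 0644);
    if (fd < 0) {
        perror("open");
        return EXIT_FAILURE;
    }

    /* Reserve the whole extent now, before writing any data. */
    int err = posix_fallocate(fd, 0, file_size);
    if (err != 0) {
        /* posix_fallocate returns an error number directly. */
        fprintf(stderr, "posix_fallocate: %s\n", strerror(err));
        close(fd);
        return EXIT_FAILURE;
    }

    /* ... write data into the reserved extent ... */

    close(fd);
    return EXIT_SUCCESS;
}

The design point matches the article's argument: the work shifts to write time, so the file lands in (at most) a few pieces from the start and there is far less for a defragmenter to clean up afterward.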

With fragmentation prevention, system resources would be saved both when files are first written and every time they are read thereafter. Significant savings would also be achieved in energy consumption and cooling -- even more than defrag alone delivers.

We've moved as far up the line as we can with defragmentation -- and skillfully so. The time is now ripe for fragmentation prevention.

Contact Information