SOURCE: Bruce Boyers Marketing Services

October 20, 2009 08:16 ET

Why Not Prevent Fragmentation in the First Place?

BURBANK, CA--(Marketwire - October 20, 2009) - Prevention is always preferable to having to address a situation afterwards. A prime example is the intention behind the sign you see upon entering many national forests in the United States: a picture of Smokey Bear with the slogan, "Only YOU can prevent forest fires." The U.S. Forest Service knows that educating people on the subject of prevention from childhood on is the best way to ensure that many fires never get started in the first place. Addressing conflagrations after the fact is the stuff of major television news -- thousands of acres burned and lives, homes and wildlife seriously threatened. Much of this can be stopped before it happens by making sure campfires are completely out and by taking other simple safety measures.

Over in the world of computers, we have a problem far less dangerous yet nonetheless costly and aggravating: file fragmentation. From the beginning, considerable effort has been spent to halt its performance-crippling spread. First came backing up and restoring drives; then manual defragmentation; soon after, scheduled defrag; and finally, a fully automatic solution.

These methods all have degrees of effectiveness ranging from mediocre to excellent, but they share one thing in common: they address fragmentation after the fact. By the time a defragmenter runs, the system has already wasted precious I/O resources writing file fragments to scattered locations on the disk, and further resources must then be expended to clean up the fragmentation. Better and better defrag methods continue to be built -- but they're still cleaning up the mess after it's been made.
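
To make that cost concrete, consider one common way fragmentation arises: two files grown in alternating small appends, each extension asking the filesystem for new blocks while the other file is also growing. The Python sketch below is purely illustrative -- the file names and sizes are hypothetical, and whether fragmentation actually results depends on the filesystem's allocator -- but on Linux, a tool such as filefrag -v will report how many separate extents each file ends up occupying.

    import os

    CHUNK = 4096  # one typical filesystem block

    # Grow two files in alternating block-sized appends; each append
    # requests space while the other file is also growing, a classic
    # recipe for interleaved, non-contiguous allocation.
    with open("log_a.dat", "wb") as a, open("log_b.dat", "wb") as b:
        for _ in range(1024):           # roughly 4 MB per file
            a.write(os.urandom(CHUNK))
            a.flush()
            b.write(os.urandom(CHUNK))
            b.flush()

Every fragment created this way means an extra seek when the file is later read -- and more work for whatever defragmenter eventually has to reassemble it.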

Just as we teach people preventive measures to avert the costly fighting of fires, why not find a method of preventing fragmentation? Think of it: we would seldom, if ever, have to defragment again. The problem would simply be solved before it began.

The programming feat would be to develop a method that allows files to be written contiguously (as close to one piece as possible) in the first place, while exacting virtually no toll on system resources to accomplish it.
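
The article does not describe how such a method would work internally, but one building block many operating systems already expose is preallocation: telling the filesystem a file's final size before writing any data, so the allocator can reserve a single contiguous run of blocks. The sketch below is a minimal illustration assuming a Linux/POSIX system; the file name and size are hypothetical, and a true prevention solution operating at the filesystem or driver level would go well beyond this.

    import os

    SIZE = 64 * 1024 * 1024  # final file size, known in advance (64 MB)

    fd = os.open("payload.dat", os.O_WRONLY | os.O_CREAT, 0o644)
    try:
        # Reserve the entire extent before writing a single byte, so the
        # allocator can hand back one contiguous region if space permits.
        os.posix_fallocate(fd, 0, SIZE)
        block = b"\0" * 4096
        for _ in range(SIZE // 4096):
            os.write(fd, block)  # writes now land in the reserved extent
    finally:
        os.close(fd)

Preallocation only helps when the final size is known up front, of course; a general prevention scheme must also handle files that grow unpredictably, which is where the harder engineering lies.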

With the majority of fragmentation prevented, system resources would be saved both in writing files in the first place and in every subsequent read. Drive wear would be greatly reduced compared with after-the-fact defrag, and significant savings would also be achieved in energy consumption and cooling -- even beyond what defrag alone delivers.

With fires, prevention is always best. The time has come to prevent costly and time-consuming fragmentation as well.
