SOURCE: Bruce Boyers Marketing Services

October 27, 2009 13:31 ET

Actually Preventing the Fragmentation Disease

BURBANK, CA--(Marketwire - October 27, 2009) - In the field of medicine, disease prevention is heavily touted. Such prevention ranges from the simplicity of washing your hands to stop the spread of germs, to the complex protective gear worn by workers handling carcinogenic chemicals. It includes getting regular checkups at the doctor, as well as eating a healthy diet from childhood on to prevent conditions such as early-onset diabetes. Medical professionals and health care organizations alike fully understand the drastic cost differential between preventive measures and the treatment of disease.

Within the universe of computing there are diseases as well, and preventing them is also of paramount importance. The first to leap to mind for most people would be the wide array of computer viruses. Today, numerous sophisticated tools routinely scan computers for viruses and eliminate them when found. It is all too well known that the only "treatment" for a virus left unchecked is often to wipe the hard drive clean and start all over again. Here too, treatment is vastly more expensive than prevention.

Another computer disease that is just as common, and that causes untold costs in lost performance and IT overtime, is file fragmentation. Left unaddressed, it quickly mounts, splitting files and free space into thousands or even tens of thousands of pieces. A file broken into thousands of fragments can require thousands of separate I/O requests to read instead of a handful, and that extra work takes a serious toll on enterprise productivity and hardware life.

Interestingly, however, prevention of fragmentation has never been attained; every method devised so far handles fragmentation after the fact. First came manual defrag, which requires copious amounts of IT overtime. Then came scheduled defrag, which requires less time but still means precious IT hours spent finding and implementing schedules. Fully automatic defrag then evolved, allowing defragmentation to occur in the background and saving considerable time and money in the process.

The one factor all of these methods have in common, however, is that they address fragmentation only after it has occurred. By the time a defragmenter runs, the system has already wasted I/O resources writing fragmented files to scattered spaces on the disk.

Fragmentation prevention would obviously be far more efficient. Such a method would have to write files contiguously (as close to being in one piece as possible) in the first place, while taking no toll on system resources to accomplish it.
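By way of illustration only, and not a description of any particular product's method, one long-established preventive technique at the application level is to tell the filesystem a file's full size before writing any data, so the allocator can reserve a single contiguous run of blocks. The C sketch below assumes a POSIX system and uses posix_fallocate(); the file name and size are hypothetical.

    /* A minimal sketch (assumptions: POSIX system, hypothetical file
     * name and size) of fragmentation prevention by preallocation:
     * reserving the file's final size up front lets the filesystem
     * allocate one contiguous extent before any data is written. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        const off_t final_size = 64L * 1024 * 1024;  /* 64 MB, hypothetical */

        int fd = open("large_output.dat", O_CREAT | O_WRONLY | O_TRUNC, 0644);
        if (fd < 0) {
            perror("open");
            return EXIT_FAILURE;
        }

        /* Reserve all 64 MB in one request; on filesystems such as ext4
         * or XFS this lets the allocator choose one contiguous run of
         * blocks rather than growing the file piecemeal as writes arrive. */
        int err = posix_fallocate(fd, 0, final_size);
        if (err != 0) {
            fprintf(stderr, "posix_fallocate: %s\n", strerror(err));
            close(fd);
            return EXIT_FAILURE;
        }

        /* The application's writes now land inside the preallocated,
         * contiguous extent rather than in scattered free-space fragments. */

        close(fd);
        return EXIT_SUCCESS;
    }

This approach only helps when a file's size is known in advance; preventing fragmentation transparently for all writes, as envisioned here, would have to happen inside the operating system's own write path.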

With the majority of fragmentation prevented, system resources would be saved both in writing files in the first place and in reading them back afterward. Significant savings would also be achieved in energy consumption and cooling, even beyond what defrag alone delivers.

Diseases are always best dealt with by prevention. This should now include fragmentation as well.
