SOURCE: Bruce Boyers Marketing Services

October 12, 2009 13:58 ET

Defrag: An Ounce of Prevention?

BURBANK, CA--(Marketwire - October 12, 2009) - "An ounce of prevention is worth a pound of cure." So said inventor and statesman Benjamin Franklin over 200 years ago, but for some reason this self-evident truth still isn't practiced much. A prime example is the internal combustion engine, which has been causing horrific pollution for over 100 years. All manner of high tech -- such as catalytic converters and ever more efficient fuel delivery -- has been applied to reduce the engine's pollutant effects. Yet still it pollutes, and you would think that by now we would have done away with the source of the problem, so that noxious, environmentally harmful fumes were simply prevented in the first place.

Another example lies in the field of computers, as we live day in and day out with file fragmentation. It has plagued us since the invention of the modern computer, and the battle against it has been waged constantly to negate its crippling effects on performance and its life-shortening toll on hardware.

Originally, defragmentation could only be performed manually while the system was offline. Then scheduled defrag came along, at least allowing those poor system personnel to go home once in a while. This method was with us until fairly recently, when fully automatic defrag arrived, giving IT back considerable hours while keeping system performance and reliability consistently maximized.

But what if file fragmentation were prevented altogether? By the time fragmentation occurs, the system has already wasted precious I/O resources writing fragmented files to cluttered spaces on the disk. State-of-the-art defrag works great, but what if fragmentation were negated beforehand, so that files were always saved contiguously and fragmentation simply never occurred?

It wouldn't necessarily be that difficult, although it would require a stroke of genius on somebody's part. Essentially, a resource-efficient method would have to be found for writing files contiguously (as close to being in one piece as possible) in the first place, while taking no toll on system resources to accomplish it.
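One real-world hint at what such prevention could look like is preallocation: telling the filesystem a file's full size up front so it can reserve a contiguous run of blocks before any data is written. The sketch below illustrates that idea using the POSIX `posix_fallocate` call (exposed in Python as `os.posix_fallocate`); it is an illustrative assumption on my part, not the specific method the article alludes to, and contiguity is still ultimately up to the filesystem.

```python
import os

def write_preallocated(path, data):
    """Write data to path, reserving the full extent first.

    Preallocating the file's final size gives the filesystem a chance
    to pick one contiguous extent, rather than growing the file
    piecemeal into whatever scattered free space is at hand.
    """
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        # Reserve len(data) bytes starting at offset 0 before writing.
        os.posix_fallocate(fd, 0, len(data))
        os.write(fd, data)
    finally:
        os.close(fd)
```

Note that `posix_fallocate` is available on Linux and other POSIX systems; on filesystems without native fallocate support, the C library falls back to zero-filling, which reserves the space but loses the efficiency benefit.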

Not only would the system resources currently spent reading fragmented files be saved, but considerable resources would also be saved in writing files in the first place. Drive wear would be greatly reduced compared with after-the-fact defrag, and significant savings would also be achieved in energy consumption and cooling -- even beyond what defrag already delivers.

Just as it seems we could now stop handling vehicle pollution after the fact, it would seem we could also do away with that perennially pesky fragmentation problem. Let's hope that somewhere, R&D folks are burning the midnight oil to finally put an end to this plague on IT departments everywhere.
