Less data, faster recovery


As the BP oil spill continues to blacken waters in the Gulf of Mexico and beyond, Lloyd’s of London, the world’s largest insurer, has estimated that total claims from the explosion of the Deepwater Horizon oil rig could run into multiple billions. In the Louisiana wetlands, the oil has found ways around the protective booms to reach wild cane fields, discoloring the base of green cane and fouling the air with a horrendous smell. Pictures of pelicans covered in oil and dead dolphins on oil-fouled beaches are circulating on the Web. A third of the Gulf has been closed to fishing, and tourists are staying away from the beaches. And almost forgotten in the news: 11 oil workers were killed in the explosion.

Newly released internal documents show BP PLC estimated 4.2 million gallons of oil a day could gush from a damaged well in the Gulf of Mexico, if all equipment restricting the flow were removed. Democratic Massachusetts Congressman Ed Markey released documents showing BP estimates that in the worst-case scenario the leak could gush between 2.3 million and 4.2 million gallons of oil per day. The current worst-case estimate of what’s leaking is 2.5 million gallons a day.

BP has lost 65% of its market value, has established a $20 billion disaster fund, is spending billions more trying to deal with the emergency and may face bankruptcy before the emergency is over. And all because it skimped on safety equipment and did not plan for a disaster that was inevitable at some time, in some place, given the amount of drilling the company is doing.

What can IT managers learn from this?

Disasters can happen in many ways, be it natural disasters such as floods, hurricanes, tornadoes, or earthquakes or man-made disasters such as hazardous material spills, infrastructure failures, and terrorism. The central role of information technology in business-critical functions, combined with the transition to an around-the-clock economy, has made protecting an organization’s applications and IT infrastructure in the event of a disruptive situation a vital business priority. Of companies that experience a major loss of business data, 43% never reopen, 51% close within two years, and only 6% survive long-term. Driven in part by these grim statistics, most large companies spend 2% to 4% of their IT budget on disaster recovery planning to avoid much larger potential losses in a disaster.

When IT discusses disaster recovery, the first thought is usually off-site data backups. That is, of course, a vital strategy. But many organizations constantly battle with data backups that overrun their windows, and when they test their recovery plans they are dismayed at the time it takes to reload those central business databases. In an economy where time is money, these are costly problems.

So how can you reduce backup windows and the time required to restore vital transactional databases in the event of a disaster? The answer is twofold: first, reduce the size of production applications by archiving inactive data; second, decommission or retire applications that are inactive and no longer needed for day-to-day business activities. According to industry analysts, 80% of the data in large corporate databases is no longer needed to support day-to-day business, and 10% of the applications in an unoptimized portfolio are candidates for Application Retirement. So how do you remove this unneeded data and these applications from the data center while preserving them for research and compliance? The answers are database archiving and application retirement: by moving inactive data and obsolete applications out of production, you shrink both the backup window and the volume of data that needs to be restored at all.
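The archiving step can be sketched in a few lines of Python against SQLite; the `orders` schema, the seven-year cutoff, and the table names here are hypothetical, for illustration only. Inactive rows are copied to an archive table and then purged from production, so nightly backups only have to cover the active data:

```python
import sqlite3
from datetime import date, timedelta

# Hypothetical production table with one old (inactive) and one
# recent (active) order.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, placed_on TEXT, total REAL)")
cur.executemany("INSERT INTO orders (placed_on, total) VALUES (?, ?)", [
    ("2001-03-15", 100.0),   # inactive: well past the cutoff
    ("2010-05-01", 250.0),   # active: recent business data
])

# Empty archive table with the same shape as production.
cur.execute("CREATE TABLE orders_archive AS SELECT * FROM orders WHERE 0")

# Anything older than roughly seven years is treated as inactive.
cutoff = (date(2010, 6, 1) - timedelta(days=7 * 365)).isoformat()

# Copy inactive rows to the archive, then purge them from production.
cur.execute("INSERT INTO orders_archive SELECT * FROM orders WHERE placed_on < ?", (cutoff,))
cur.execute("DELETE FROM orders WHERE placed_on < ?", (cutoff,))
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0])          # rows left in production
print(cur.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0])  # rows kept for compliance
```

In a real deployment the archive would live in separate, cheaper storage with its own (less frequent) backup schedule, which is what actually shortens the production backup window.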


© Solix Technologies, Inc.