Nicholas Carr presented a thesis five years ago that IT no longer provided a differentiating competitive advantage and therefore no longer mattered. Since then, he has been proven wrong many times over: billions of dollars have been spent on new IT initiatives simply because most enterprise-wide IT projects showed a positive, if not high, Return on Investment (ROI).
There is always scrutiny every time a CIO presents a project for approval. Being a relatively new discipline, database archiving has been one area relegated to a lower priority, simply because there were few implementations and even less hard evidence of ROI.
A few years ago, at an Oracle conference, a customer hammered home the question, "Why Aren't You Archiving?" He presented a convincing case with a strong ROI. As with all new technologies, there is a tipping point at which adoption accelerates. That tipping point may have just happened.
Gartner's June 2008 report on the Hype Cycle for Storage Software Technologies has unequivocally stated that the ROI for implementing a database archiving solution is "exceptionally high". It goes on to add, "When database archiving is used for application retirement (with retired data archived to tape in an XML format), the business impact can be even more dramatic".
This report could not have come at a better time. With the economy close to recession, it underscores what we have been saying all along. Archive not just to improve application performance; archive to improve business agility. Archive not just to reduce storage costs; archive to reduce energy costs. Archive not just to ensure data retention; archive to insure against expensive litigation. The high ROI is indisputable.
And for application retirement, the Solix archiving solution, built around a metadata-based repository, not only moves legacy data to an XML format but can also move it to the database of the new application before finally retiring it to Tier 3 storage. Archiving before upgrading or migrating reduces time, implementation costs, and the carrying over of "dirty" data.
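To make the XML-format idea concrete, here is a minimal, illustrative sketch of what archiving legacy rows into a self-describing XML document can look like. This is not Solix's implementation; the table, columns, and data are hypothetical, and SQLite stands in for the legacy database.

```python
# Illustrative sketch only (not Solix's product): export legacy rows to a
# self-describing XML format so the data stays readable after the source
# application is retired. Table and column names here are hypothetical.
import sqlite3
import xml.etree.ElementTree as ET

def archive_table_to_xml(conn, table):
    """Serialize every row of `table` into XML, capturing column names
    as metadata so the archive is independent of the original schema."""
    cur = conn.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in cur.description]
    root = ET.Element("archive", table=table)
    for row in cur:
        rec = ET.SubElement(root, "record")
        for col, val in zip(columns, row):
            field = ET.SubElement(rec, "field", name=col)
            field.text = "" if val is None else str(val)
    return ET.tostring(root, encoding="unicode")

# Hypothetical legacy data to archive before retirement
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                 [(1, "Acme", 120.50), (2, "Globex", 75.00)])
xml_doc = archive_table_to_xml(conn, "invoices")
print(xml_doc)
```

Because each field carries its column name, the resulting document can be read years later without the retired application's schema on hand, which is the point Gartner makes about archiving retired data in an XML format.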
A depository, implementing anti-money-laundering laws, recently sent rejection letters to many capital market investors, who must be registered with one of the two depositories in India. The reason given: a computer glitch. Isn't that familiar? We heard the same thing when one of the largest banks acquired a credit card issuer in 2005. A number of affected customers even blogged about their bitter experiences. The bank's IT problems had started earlier, when it acquired another bank in 2004 and struggled to integrate the systems, which first meant a rip-and-replace of the acquired bank's system. The subsequent acquisition of the credit card issuer compounded the IT integration problem.
On the other hand, last week I heard a completely different story after visiting a tax collection department. A merger of two municipalities and the addition of more sources of tax collection meant their systems needed to change. No glitches this time: there was not a single complaint of citizens getting a wrong tax notice or a valid refund claim being rejected.
This got me thinking. Why should a bank or a depository, with vast resources and experience in managing large systems, run into a problem while a government agency with limited capital budgets and a bureaucratic set-up gets it right?
I don't believe there is a simple answer to this. But the tax collection agency's approach may be revealing. They adopted the principle of application modernization rather than rip-and-replace. It may surprise a few that a government organization is an early adopter of application modernization; it is less surprising if one considers that rip-and-replace is always more difficult for them from a budget-approval standpoint. Over the years, this tax collection agency has been modernizing what was once a client-server application into one that is web-driven and event-driven; their entire business process began changing in the late '90s with the advent of the Internet, the merger of the two municipalities, and the addition of more sources of tax collection. They never had application-switchover downtime; they did not have to go through business process re-engineering or user retraining. For most, it remained business as usual.
The complexity that perhaps explains the difference between the merger of the two banks and that of the municipalities is this: commercial organizations in the same industry and region are likely to have completely different business processes, whereas government departments in the same state and country are likely to have similar ones. In my example, while the tax departments of the two municipalities were running applications from different vendors, their integration was simpler than what two commercial organizations may go through even if, hypothetically, they run the same ERPs and core banking systems. If those systems are different, and run on different databases, then what the banks did, application retirement for one, may seem near inevitable.