IT Governance: Transitioning to Cloud is now a Best Practice

Cloud Enabling Solutions

Recent events in the Middle East again bring into focus the type of political governance that is effective as well as viable over the long run. History suggests that only a multi-party democracy with a strong constitution – where the judiciary is independent of both the legislature and the executive, and where there are at least two strong political parties – can provide governance that is of the people and for the people and has the strongest chance of surviving for at least two hundred years.

Recent business scandals point to another truism in governance. Successful companies that have lasted for at least thirty years have well laid-out corporate governance guidelines that are effectively monitored by an Audit or Corporate Governance Committee headed by an independent Board member and where the Chairman, CEO and CFO roles are distinct.

Recent criticisms that IT departments do not move at the speed of business and are far too resource-hungry have brought to the fore the importance of good IT governance – of infrastructure, process and data. Poor infrastructure governance has led to poor utilization of assets. Poor process governance has led to failed or very expensive IT projects that have delivered little ROI. And poor data governance has embarrassed senior managers with missing or stolen data, leading to serious legal or regulatory consequences.

In the last few years Solix has offered solutions that largely addressed a few fundamental issues relating to data governance – exponential data growth, legacy data retention/destruction, data duplication and data breach.

We have now gone one step further with last month’s announcement of Cloud-enabling solutions with Solix EDMS. This addresses two problems a CIO faces that come under the purview of infrastructure governance. Even when an organization has decided that Cloud is the way to go for optimal utilization of IT assets and for delivering higher SLAs to business users, the CIO still has to ensure a smooth transition to the Cloud and ensure that the Cloud infrastructure is not wasted by moving legacy applications onto it. The CIO therefore needs to identify the candidates for retirement very quickly. A process that used to take a year at minimum now needs to be done in less than ninety days.

This is just the beginning of some exciting R&D work that our engineers are doing in the area of Cloud-enabling technologies. Just as democracy is the best form of political governance for citizen welfare, Cloud is emerging as a best practice in Information Governance for users and business agility.

Black Friday & Cyber Monday: How Safe Is Your PII Data?

Data Security

Thanksgiving bargains are over. Now we have new deals for the eager Christmas and New Year shoppers. I wanted to buy a laptop but gave Black Friday a miss (no way was I going to line up for over a day). I turned up at the Fry’s store on Saturday morning about 90 minutes before opening time. I really should have known better. There was already a long queue. I returned home and bought it online.

And then an interesting article popped up on my screen: tips to avoid ruining your Thanksgiving and Holiday season through identity theft. The mantra: don’t fall prey to shoulder-surfing or to phishing. Just adopt best security practices when shopping.

Actually, it’s no longer petty theft. Over the years it has become organized crime, so much so that “according to prosecutors, tens of millions of credit and debit card numbers were stolen by the ring, at a combined cost to (retail) companies, banks and insurers of almost US$200m”.

Best security practices also need to be adopted by the corporate sector, where the problem of data theft has been going on for years. Earlier this month, a court ruling directed TD Ameritrade to settle with customers who lost their PII data in a data theft case three years ago.

I have written in these columns and elsewhere about this. Data theft is big business. The irony is that data security technologies are fairly well proven. It’s not that hackers are outsmarting the technology; it’s that most companies in industries like retail and personal banking, which are vulnerable to attacks on PII data, have not put sufficient measures in place to prevent such attacks. Proof of that: an auto-generated acknowledgement email confirming a purchase once came with the full 16-digit credit card number (and the CVV) intact, in raw form!

Of course, we as retail customers – whether online or at a brick-and-mortar shop – need to follow standard precautions when using a credit card or supplying PII data. Equally, all retail and financial companies need to implement a comprehensive data security policy. Some have; it’s the “rest of the world” business outfits that scare me. Here’s a simple suggestion: start with Data Masking and protect your customers’ PII data. Test environments are a big source of data thefts.
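
For readers who want a feel for what masking means in practice, here is a minimal, purely illustrative Python sketch (not Solix EDMS code) of two common approaches: hiding all but the last four digits of a card number for display, and replacing the number with an irreversible token in non-production copies. The function names and the salt value are hypothetical.

    import hashlib

    def mask_pan(pan: str, keep_last: int = 4) -> str:
        """Hide all but the last few digits of a card number."""
        digits = "".join(ch for ch in pan if ch.isdigit())
        return "*" * (len(digits) - keep_last) + digits[-keep_last:]

    def tokenize_pan(pan: str, salt: str = "per-environment-secret") -> str:
        """Replace a card number with an irreversible token for test databases."""
        digits = "".join(ch for ch in pan if ch.isdigit())
        return hashlib.sha256((salt + digits).encode()).hexdigest()[:16]

    print(mask_pan("4111 1111 1111 1111"))      # ************1111
    print(tokenize_pan("4111 1111 1111 1111"))  # deterministic token, not reversible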

Case Study on Application Portfolio Management

Application Retirement

If you are considering Application Modernization, Application Decommissioning or Application Overhaul, I would suggest a key reference: “Application Portfolio Triage: TIME for APM” by Jim Duggan of Gartner. Jim proposes a very elegant model, called TIME: a four-way categorization based on the current and future business value of each application and the cost and risk of replacing it. This categorization can then become the framework for a strategic approach to application portfolio management.
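
As a purely illustrative reading of the model – the thresholds, field names and sample applications below are my own, not Jim’s – a first-pass triage could be sketched in a few lines of Python:

    from dataclasses import dataclass

    @dataclass
    class Application:
        name: str
        business_value: int       # 1 (low) .. 10 (high), current and future value combined
        technical_condition: int  # 1 (poor) .. 10 (good), a rough proxy for the cost/risk of keeping it

    def time_category(app: Application) -> str:
        """Naive quadrant assignment; real triage weighs cost and risk far more carefully."""
        high_value = app.business_value >= 5
        good_shape = app.technical_condition >= 5
        if high_value and good_shape:
            return "Invest"
        if high_value and not good_shape:
            return "Migrate"     # valuable, but costly or risky to keep as-is
        if good_shape:
            return "Tolerate"
        return "Eliminate"       # low value, poor condition: a retirement candidate

    portfolio = [Application("legacy trading system", 9, 3),
                 Application("departmental reporting app", 2, 2)]
    for app in portfolio:
        print(app.name, "->", time_category(app))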

TIME implicitly helps to explain why a few mainframe applications (like an online trading system at a stock exchange running on a proprietary fault-tolerant system) have survived so long. They are complex and mission-critical, and the cost and risk of replacing them had, till now, far outweighed the advantages associated with modern software and hardware. What is happening now, however, is that the risk of continuing with them has risen sharply. Most of the current crop of staff in an IT department were not even born when some of these legacy platforms were put on the de-support list by their respective vendors!

I am currently watching the TIME model at work, which makes it even more fascinating. One of our customers has just gone through the first stage, “Tolerate”, with an old version of an ERP running on a very old version of an RDBMS while they selected a new ERP for the enterprise. But the rollout took longer than expected, and the performance of the older ERP – which needed to be retained for a couple more years – was deteriorating sharply. So the customer moved to the second stage, “Invest”, with a Database Archiving solution to keep the older ERP chugging along for a few more years. They are currently in the third stage, “Migrate”, in which all their business processes across the globe are being migrated to the newer ERP. Once this is completed, we will see the final act of “Eliminate” (or retirement) of the old ERP, with some of the legacy data moved to the new ERP, some taken to the data warehouse, most moved to a long-term archive – and some very old or unimportant data purged completely.

Application Retirement makes a very good business case for reducing a Data Center’s operating costs. Why spend good money maintaining legacy applications that no longer deliver any business value and carry little risk in being eliminated? Application Retirement technologies like Solix EDMS Application Retirement or the Solix ExAPPS Application Retirement Appliance are built to reduce the risks of application retirement while delivering a very high ROI.

Cut the “Muda” in Your Application Portfolio:
Achieving a Leaner Data Center through Application Retirement

Application Retirement, Enterprise Applications

While Toyota is a much-maligned name today, it taught the world Lean Manufacturing, drawing on the concept of eliminating waste (“muda” in Japanese). That spawned an entire discipline of “Lean” in operations, epitomized by Lean Six Sigma. Now Green IT pundits are espousing a leaner Data Center through Application Retirement. There is indeed significant muda in a Data Center: estimates suggest that anywhere between 70 and 80% of available resources (time, people, equipment and software) sit nearly idle for over 50% of the time. Much of the lean effort in a Data Center had, till now, focused on Virtualization and Consolidation. Now a new branch is evolving – and evolving rapidly – Rationalization, a major subset of which is Application Retirement.

Come to think of it, this mirrors the stages of lean manufacturing. The first stage was Just in Time – optimizing physical inventory across the supply chain. This is akin to Virtualization, which optimizes servers and storage across the multiple applications they host to eliminate muda capacity at any point in time. The second stage was Reducing Set-Ups – enabling smoother set-ups and more flexible operations. This may be compared with Consolidation of servers, storage and Data Centers, which reduces muda (in terms of time, people productivity and resources) when, say, an ERP version upgrade happens. The US Federal Government, for example, is embarking on the most ambitious Consolidation project ever undertaken. The other – and perhaps most important – element of Lean Manufacturing was Cellular Manufacturing, which rationalized manufacturing processes to reduce the number of SKUs even while increasing the number of car models. Rationalization in a Data Center aims to reduce the muda of servers, storage, applications and unnecessary processes – and may be undertaken independently of consolidation – while the CIO delivers more value-oriented systems to the business.

There are many sub-disciplines within Data Center Rationalization. The one that has caught the most attention is Application Retirement. As the name suggests, the aim is to cut the muda of no-longer-required applications while retaining and managing their historical data and providing access to it when (and if) required for audit, compliance or litigation support. It is a category within Application Portfolio Management (APM) and goes by different names: Application Decommissioning, Application Sunsetting and Application Optimization. Mirroring them in APM – to deliver the CIO’s mantra of “more” – are Application Modernization, Application Renewal and Legacy Modernization.

The history of lean manufacturing has shown that the fullest extent of process improvement comes only after implementing Cellular Manufacturing, not just JIT. The history of lean Data Centers actually started with Consolidation. The movement to Global Single Instances began over a decade ago. The last five years have seen the rapid adoption of Virtualization. We expect to see Application Retirement taking centre stage over the next decade. And believe me, there’s a lot of muda out there in the application portfolio of any enterprise – sometimes running into the thousands of applications!

Application Retirement
Two Retirements and An Appliance

Application Retirement, Solix ExAPPS - Industry's First Application Retirement Appliance

Here’s a tale of two Application Retirements

The first was a pharmaceutical company that had recently been acquired by a global major and had to retire its ERP, moving some of the more current data to the acquirer’s corporate ERP, some to the corporate data warehouse, and the rest – the oldest and most rarely accessed data – to a secure archive repository, to meet long-term statutory compliance requirements and to remain accessible for audits or e-Discovery.

The second was another global major that had many (several hundred) departmental applications running in its various divisions and subsidiaries around the world. A couple of years back it standardized on SAP. To reduce IT infrastructure costs, it decided to decommission the old hardware and all its legacy applications. The challenge was to maintain application context for all the data coming from many different applications running on mainframes, proprietary platforms and even Unix-based systems.

These are two ends of the spectrum of Application Retirement. The basic problem is the same – preserving application context for the legacy data and making it available for reporting and query purposes long after the hardware and the application have been removed from the premises – but there’s a clear difference between the two. The first is an RDBMS-based, fairly modern, ISV-developed ERP running on a distributed system. The second is more complex, with many non-relational bespoke and packaged systems running on multiple proprietary platforms and mainframes. Until today, however, the approach had been the same for both – throwing a lot of consulting effort at the problem, even if an archiving solution tailored for application retirement was being used.

At Solix, we wondered if we could take a different approach and make the Application Retirement process for the first category simpler, faster and cheaper. We took the analogy of municipal planning. Modernizing the existing downtown of Sunnyvale, California – lots of small houses, a few large bungalows, no high-rises – must necessarily be different from doing the same in a city like Chicago.

Birth of an Appliance

Given our goals of (1) Making it Simpler, (2) Making it Faster, (3) Getting a Quicker ROI, and (4) Reducing the Total Cost of Ownership, we looked at different options. The challenge was to automate the entire process once the candidate(s) for application retirement had been identified: from data classification, data migration, building application context, and data de-duplication and compression to enabling querying and reporting. It was a business problem that our Engineering team had to solve. That’s the hallmark of innovation in today’s world. We are proud to announce the industry’s first Application Retirement Appliance.

Solix ExAPPS: Industry’s First Application Retirement Appliance

Solix ExAPPS is a complete, plug-and-play hardware and software combination for Application Retirement and data preservation. Customers can plug Solix ExAPPS into a network port and power it up to have the industry’s first and only pre-configured Application Retirement solution. Here is how it works:

All that a customer has to do is point Solix ExAPPS at the first candidate for application retirement using a Web browser. After that, Solix ExAPPS migrates all the application data, including transactional business objects and reports, adds application context to the legacy data, de-duplicates and compresses it, and stores it in an immutable form within Solix ExAPPS – meeting compliance requirements by guaranteeing that the data cannot be modified. Once that is done, business users or IT can query and report on the legacy data using the standard reporting tools already in use in the enterprise. The process is then repeated for the next application candidate. You can do this for packaged or custom applications running on any RDBMS on Windows, Linux or Unix systems. It cannot get simpler than that.
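
To give a flavour of the kind of work the appliance automates, here is a toy Python sketch of a “retire a database to an archive along with its context” step, using SQLite and compressed JSON purely for illustration. It is not how Solix ExAPPS works internally; de-duplication and true immutability are only gestured at with checksums.

    import hashlib, json, sqlite3, zlib
    from pathlib import Path

    def retire_to_archive(source_db: str, archive_dir: str) -> None:
        """Copy every table of a database into compressed, checksummed files
        together with its schema (a stand-in for 'application context')."""
        out = Path(archive_dir)
        out.mkdir(parents=True, exist_ok=True)
        con = sqlite3.connect(source_db)
        con.row_factory = sqlite3.Row
        tables = [r["name"] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        manifest = {}
        for table in tables:
            rows = [dict(r) for r in con.execute(f"SELECT * FROM {table}")]
            schema = con.execute("SELECT sql FROM sqlite_master WHERE name=?",
                                 (table,)).fetchone()["sql"]
            payload = zlib.compress(json.dumps({"schema": schema, "rows": rows}).encode())
            (out / f"{table}.json.z").write_bytes(payload)
            manifest[table] = hashlib.sha256(payload).hexdigest()  # tamper evidence, not true immutability
        (out / "manifest.json").write_text(json.dumps(manifest, indent=2))
        con.close()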

Bursting of Another Bubble!
Size and Reputations Count No Longer

Business

Who would have imagined Dubai would have a serious debt crisis? Of course, pundits are falling all over themselves declaring they knew the bubble was about to burst. Many have asked me: what’s the next bubble to burst? This reminds me of bubbles not long past – the real estate bubble, the Madoff bubble and, not too far back, the dot-com bubble. Is China also a bubble?

Many of our actions – personal or business – are based on reputations. Bernie Madoff had a good reputation, and friends put their money with him. Dubai had an impeccable reputation as a destination for investments and as an investor itself. Dubai Ports was actually awarded the management of six US ports in 2006, until the hue and cry in the US Senate and Congress got it to sell off its stake to AIG.

Reputation is a double-edged sword. Madoff earned it and Dubai earned it, and they leveraged (pun intended) that reputation to the hilt to further their business interests. But somewhere along the way, they lost it. And it had very little to do with poor luck or the misfortunes of the global economy. It had to do with one hard fact – a leveraged model. And along the way there was also a softer, scorned fact – lost integrity. Madoff thought he could bluff his way through a Ponzi scheme once he felt secure in his reputation; Dubai thought growth on mostly borrowed money could be everlasting once it felt the world was beholden to it for its wealth. Were they naïve? I don’t think so.

I guess it’s time to come to grips with the hard reality that success built on leveraged models is risky, and that reputation alone is not good enough. Just as size is no guarantee of survival, reputation is no guarantee of intrinsic integrity. And a highly leveraged model with scant regard for integrity is a strong recipe for a bubble to burst.

At Solix, we may be small but we are not leveraged. We do not operate with VC funding or with borrowed money. We hold integrity dear to our heart; it’s a non-negotiable value at Solix. The first lesson of integrity for us is that we exist because of our customers, and the value we deliver to them comes first.

Last month was a milestone in the Solix journey. We celebrated our 100th customer win. We also launched the Solix Customer Advisory Board. The Board represents customers who have implemented Solix EDMS for different ERPs, across different databases and for different data management functions: Database Archiving, Application Retirement, Test Data Management and Data Masking. The charter of this Board is to guide Solix through its future technology and product roadmap, feature enhancements and partner validations.

A hundred customers and two significant OEM relationships – for Database Archiving with Oracle Financial Services and for Data Masking with Voltage Security – are strong testimony to the solid foundation of Solix EDMS technology and the confidence they all have in Solix. Over the long run, good technology and good products succeed and endure; weak technologies, even if their parentage is strong, eventually die. A few of our recent successes have come through competitive switches from a product that is likely to be phased out because it does not appear to fit the acquirer’s product strategy roadmap. This is another truism in business history that should not be missed when doing product evaluations.

Lessons Revisited:

  • Size is no guarantee of survival. Neither is a glitzy reputation a guarantee of integrity.
  • The more leveraged the model, the higher the risk of failure (or a bubble bursting)!
  • Weak products fail irrespective of parentage. Conversely, good technology that is well accepted by customers most often outlives its creators.

How Can We Make Data Masking Simple?
Lending a Critical Application to Cloud Computing

Business

Exactly a year ago, I had written on the subject of Data Security in these columns. Exactly a year ago, someone had hijacked his employer’s servers containing sensitive data; all hell broke loose, because the employer was also a government agency. Exactly a year ago, we introduced the Solix Data Privacy Pack for Oracle e-Business Suite.

Much of the vulnerability I wrote about then came from insider threats and from test and development copies – as far as database applications were concerned. Of course, there are vulnerabilities in production databases, in the applications themselves, in storage and on tapes. However, keeping raw data in test and development databases is just too risky, particularly in the case of application maintenance outsourcing.

So after a year, where do we stand today on this subject?

Awareness of the problem is much stronger, particularly because it is now mandated through regulations like PCI, HIPAA and the UK Data Protection Act. It is an established discipline in its own right under “Application and Data Security”. It’s called “Data Masking” and goes by other names like Data Privacy, Data Obfuscation and Data Sanitization (just as Data Archiving is sometimes also referred to as Data Relocation). Over the last year, it has gathered a strong band (although still limited in numbers) of technical gurus, advocates and customers – all the ingredients that can turn a technology into a killer application.

However, that’s not enough. Data breaches are still happening. Sensitive data resides not just in the corporation that may have implemented various data security technologies; it also resides with suppliers and other business partners – in their production databases, test systems and elsewhere. We therefore need wider penetration of data security technologies.

In parallel, we have another computing trend emerging – Cloud. The principles of Cloud are too compelling to ignore. But what it lacks are a few of the critical infrastructure stacks needed to make the Cloud safe – within the network, the applications and the data. And until the Cloud is safe, there is going to be skepticism and resistance to its adoption.

Now what if we marry Data Masking technologies with “Private Cloud,” such that we bring the simplicity of Cloud to Data Masking and bring the safety of data to a Private Cloud? After all, the raison d’etre of private cloud is security.

Solix EDMS Data Masking is now Cloud Computing ready. We expect this technology convergence to usher in holistic, consistent and integrated masking across the supply chain through private clouds. And it will drive large-scale adoption of both data security technologies and Cloud Computing – just as many collaborative technologies saw wider adoption with the emergence of intranets and extranets, and as the Internet itself evolved to Web 2.0 and beyond!

How Do We Make the Complex Simple?
It’s About Design Principles. And Architecture

Enterprise Applications

We manage a complex problem. Database archiving for packaged applications can be tricky. An ERP customer has to go through many changes: customizations and localizations, patches and upgrades, occasional cross-platform migrations and even significant technology changes. ERP technology moved from client-server to the Internet, and now it is starting to evolve toward Service-Oriented Architecture (SOA).

A question often asked is how we manage all these changes – not just for one application but for multiple applications running on multiple databases.

Take the case of patches. There are patches for both the database and the ERP, and for the ERP there are minor patches as well as major ones. A minor patch involves just structural changes or changes to data types. A major patch involves additions or splits to tables, or formula changes that impact data. Each has a certain impact on the application and could potentially have one on the archived data and tables as well. How does our archiving technology handle such changes with minimal impact on the customer environment?
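
To make the minor/major distinction concrete, here is a deliberately naive Python sketch – not our product logic – that compares table/column snapshots taken before and after a patch; the thresholds are one possible reading of the distinction above, and the table and column names are illustrative.

    def classify_patch(before: dict, after: dict) -> str:
        """before/after map table name -> {column name: declared data type}."""
        if set(after) != set(before):            # tables added, split or dropped
            return "major"
        for table, cols in after.items():
            old_cols = before[table]
            if set(cols) != set(old_cols):       # columns added or removed
                return "minor"
            if any(cols[c] != old_cols[c] for c in cols):
                return "minor"                   # data-type or structural tweaks only
        return "none"

    before = {"AP_INVOICES": {"INVOICE_ID": "NUMBER", "AMOUNT": "NUMBER(10)"}}
    after  = {"AP_INVOICES": {"INVOICE_ID": "NUMBER", "AMOUNT": "NUMBER(14,2)"}}
    print(classify_patch(before, after))         # -> minor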

We realized that solving this complex problem is only possible through deep engineering. A solution has to be architected from the ground up to anticipate and handle such complex changes in a manner that is simple to install, support, implement and use, without causing a major disruption in the archiving process when such a change occurs.

We adopted a few design principles that are like Core Values to our Engineering.

  • Flexibility: ability to handle all data types
  • Extensibility: ability to work across multiple platforms, databases, middleware and applications
  • Scalability: ability to cover multiple data management functions
  • Openness: for easy plug-ins with partner platforms and solutions
  • Security: providing the highest levels of data integrity and security
  • Simplicity: for support, implementation and use; the complexity is only to be in our labs
  • Robustness: ability to manage change without significant re-work and disruptions

The road to achieving this was the adoption of a 3-tier J2EE architecture and the development of tools that confined the complexity to our labs alone.

The proof of simplicity is scalability and flexibility to manage change.

Scalability: While the life of Solix EDMS started with Database Archiving for Oracle e-Business Suite, we quickly expanded the footprint to cover other applications running on Oracle and non-Oracle databases. Not only that: today we cover other data management functions such as Test Data Management, Data Masking and Application Retirement, including on platforms like AS/400 and z/OS.

Flexibility to manage change: Our customers have upgraded from Oracle e-Business Suite R10.7 to R11 and now R12 – with minor and major patches in between. Many of these upgrades came with significant new customizations, or sometimes with customizations that were dropped from one release to the next because the new release had incorporated the previously customized functionality.

We achieved this in a matter of just two years. Application Archiving is definitely a complex subject. But we hide that complexity elegantly. It’s what’s hidden that intrigues all.

A.D. 2008: The Good, The Bad and The Ugly
Welcome 2009

Enterprise Applications

It began as a good year, as all new years do. At least for Solix it did. Early in the year we entered new markets – and continued that through the year with customer acquisitions in Eastern Europe, support for JD Edwards archiving and application sunsetting, support for DB2 on both IBM’s i-series and z/OS, and strengthened partnerships through our integrations with Oracle’s Universal Online Archive and HDS HCAP.

But it was a Bad Year. The US economy tumbled and the global economy crashed. Retailers have already reported the worst year in history, and many people lost their jobs and their homes. Bankruptcies led to the fall of many big names. For the IT industry, these were tough times, as decisions took longer and there was pressure on prices and margins.

And it was an Ugly Year. The Mumbai tragedy was horrific. The scene of the ugliest carnage – the Taj hotels – also happened to be one of our customers. We pray for all the victims and their families. Talking of ugliness, the Madoff scandal could not have come at a worse time.

I am sure we are all desperately looking forward to the New Year – and hoping that the worst is behind us. There’s a lot of hope and expectation. But it will be a U-shaped rather than a V-shaped recovery, and we need to prepare ourselves for that. The new US administration, taking office on 20th January, is talking about structural changes through a “new energy and environment plan”. The gestation period may be long, but the rewards would be everlasting. Along with many in the IT industry, we too are doing our bit in helping to create Green IT centers, and we will continue investing in this area.

We will soon announce our new release. It will have a number of industry-unique enhancements in Data Privacy, Long-Term Data Retention, support for industry vertical applications and SaaS-enabled features. We will closely follow the trajectory of Cloud Computing. Though it is still in its infancy, we believe the current economic climate will accelerate its adoption. Once that happens, data management will become a critical supporting function.

In a difficult year, we grew. For that we thank our customers. Many of you were great references. And that means a lot in our business. To our partners – who enabled a market reach that otherwise would just not have been possible – we thank you for your support. To the Solix team – thank you for making our customers happy.

And this is to wish all – our colleagues, customers and partners – a great year ahead.

Does ROI Matter?
Importance of Licensing Models

Enterprise Applications

It’s been a few years since the debate on software licensing started, and the consensus appears to be that acquiring perpetual licenses is on a downward trend. (I guess it’s still too early to call its eventual demise.) Customers are demanding fast and easily measurable ROI on their enterprise software deployments; the software industry is responding with either open source or utility pricing.

Moving toward utility pricing is not surprising at a time when the IT industry is going through structural changes. It started with Application Service Providers (ASPs) – or On-Demand, as some like to call it – which later mutated into the Software-as-a-Service (SaaS) model. And lately there’s another trend emerging: Cloud Computing for infrastructure, with a completely new player as its leader – Amazon.

All of this is most interesting, and something we had thought about when we were architecting our then-new Release 4.0 back in the fall of 2006. We incorporated the basics needed to deploy our software in a SaaS environment, or at least to make it possible for Managed Service Providers (MSPs) to offer data management services using our product on a rental, pay-as-you-go basis. The software allows metered pricing and is multi-tenanted.

Over the last eighteen months, we have developed this further and matured it, and we have now begun offering a utility-pricing model with one key difference. For a utility, you pay more as your consumption grows, even if the rate per unit sometimes declines. In our case, the more you archive, the less you pay – in absolute terms.

It’s a pricing model we believe will be attractive in current economic conditions and will be a trend setter for the rest of the computing industry, perhaps even for the utilities.

For us, it provides a predictable and steady cash flow rather than a series of bell curves, with peaks at quarter-ends and troughs in between. Having doubled our market share in the last two years, we believe utility pricing will help us double it again – this time in 12 months. More importantly, it offers our customers a predictable ROI within their operating budgets at a time when getting capital budget approvals is increasingly difficult.

Does Database Archiving Have Enough Customer Proof Points?
Leading Analysts Say YES

Application Archiving

For years, packaged solutions for Database Archiving seemed like a beautiful foster child to be adopted only after a home-grown baby had been attempted, perhaps conceived, but had finally failed.

With nearly 1,000 customers around the world using packaged Database Archiving solutions, the industry-accepted threshold of customer proof points has now been crossed. The good thing is that most customers have reported significant ROI from their archiving software to the analyst firms that have conducted customer surveys in this area. A recent study has also shown Solix doubling its market share over the last two years*.

Solix customers for Database Archiving span Asia, Europe and North America, in diverse verticals like Discrete Manufacturing, Telecom, Logistics, Federal and CPG, running packaged ERPs/CRMs like Oracle E-Business Suite, JD Edwards, Siebel and BaaN as well as custom applications on Oracle and other databases. All of them report a high ROI.

While we have had market validation – from both customers and analysts – our challenge is to provide an even higher ROI than our competitors offer, and that’s where innovation comes in.

Our most recent innovation is the one Sai referred to in his last blog: our integration with Oracle Universal Online Archive. With this integration, customers get unmatched security, retention management, storage management, records management and auditing for the archived data. The Compliance Officer can have a good night’s sleep, confident that archived data cannot be modified. This is something only Solix offers among Database Archiving vendors. And we are confident that customers will see an even higher ROI as they adopt this integrated offering.

* This compares market shares as stated in Gartner’s September 2006 Report versus their October 2008 Report.

Does Size Always Count?
Small is Beautiful

Business

With this caption coming from an archiving vendor, I would not fault the reader for thinking that this post would again be about archiving and extolling the virtues of making production databases smaller. Yes, a big (and growing) production database is not good, and one needs to archive little-used historical data to make it smaller and to make business operations – not just IT – more agile, more energy-efficient and more compliant with the need for immutable records management.

However, this post is about something I sometimes hear from customers: the feeling that a big vendor is always a safe bet. Having read the economist E.F. Schumacher’s 1973 seminal work, “Small is Beautiful,” I have confidently countered that; and happily, on most occasions, I have been able to convince them.

See the history and see what’s happening now. A venerable big institution like Bear Stearns went down. So did Enron. And now the two big mortgage companies are being bailed out by the US Government.

Let’s look at the technology industry. Once upon a time, long, long ago, there used to be a very big company called DEC. Some of its products still survive, but not the company. During the best days of DEC, a small company – Oracle Corp – emerged to launch an RDBMS. Oracle later acquired DEC’s Rdb, and Rdb still survives on Oracle’s price list. Remember that in the early 1980s, Oracle – then the small company – had to fight the Goliath, DEC, for RDBMS business on VAX/VMS.

And sometimes successful big companies enter a business only to get out of it later. GE, Honeywell and Xerox are good examples of this. All had at some point of time in their illustrious history been in the computer business. I daresay, only students of technology history remember this!

Having worked earlier for two organizations that were near start-ups when I joined but became (and still are) leaders in their field – and having in parallel seen what were then considered the Goliaths of the industry either completely gone, struggling, or out of that business – I would like to make the following observations:

  • It is necessary for the small company to prosper and get bigger; there are no two ways about that. Too many small companies became history (and history does not even remember them) simply because they could not overcome the first minimum threshold of adequate customer acquisition and profitability.
  • Once they acquire a certain size, it becomes necessary (but that’s not sufficient) to grow out of being just a niche player. See what happened to Netscape and Ingres.
  • Perhaps the sufficient condition (unless there are serious management issues) is that the big company just cannot afford to miss the technology bus. If it is late getting on, it may be too late to catch up. DEC missed Unix. Ultrix was just too late and not good enough.

Now coming to why GE, Honeywell and Xerox got out of the computer business: it made good business sense, just as it was good business sense for IBM to divest its PC business. They exited even though the mainframe (GE/Honeywell), workstation (Xerox) and PC (IBM) markets were very good at the time, because in this product line they were not number one or two in the industry; it was not a major revenue or earnings earner; and they did not see this business as a prerequisite for success in their core areas. In fact, it was the same reason DEC gave up Rdb. The RDBMS market was very robust, but for DEC this product line was not core to the success of its minicomputer business.

So before deciding that bigger is safer, I would suggest looking at the two fundamentals of survivability of the big: the necessary condition (are they merely a niche player?) and the sufficient condition (are they current on technology?). Then one needs to ask whether the product line is truly strategic or merely peripheral to their core business. Because if it is not strategic, you can guarantee that sooner rather than later it will be shed.

It’s happening right now as we speak. In some cases, Small Is Beautiful. That’s why Schumacher’s book remains a classic even after thirty five years.

Will it ever stop?
Secure Thy Network and Your Data

Data Security

One would think, given the attention it has been getting for nearly a decade, that by this time every large IT department would have done thorough due diligence on its security systems and instituted the necessary controls. Apparently not, judging by the continuing regularity of security breaches. Take the example from a few weeks back of an insider’s flagrant abuse of his employer’s computer systems. This news of hijacking is even more frightening than the occasional data theft by insiders.

The good news is that both can be stopped. The bad news is the ostrich-like approach that most organizations take – it can’t happen to me, until it actually does. A CD with sensitive data left in an airline seat pocket is deemed extreme carelessness, and the person is reprimanded for it. What’s glossed over is that he should never have been allowed to have that data on a CD, thumb drive or laptop in the first place. It should never have left the security of the server.

Secure thy network and your data. It’s been said so many times before that it now sounds hackneyed, except for the lurking fear that tomorrow you or I could be victims with our personal data in the hands of a felon.

Think now about this hijacking incident. If he had not been arrested, he could have deleted some data and added some, depending on his prevailing state of mind – or done all of the above and modified data for criminal purposes, if he was truly malevolent. Imagine if this hijacker had also been part of a sinister network, and you get the picture!

Of course, a home can be robbed even after locking all doors and windows and installing alarm systems that are kept in working condition. But at least we know that the probability is minimized. I could say with 99% statistical confidence that security breaches are happening to those organizations that have not undertaken any form of due diligence on access, operational and IT security, and consequently have failed to implement a comprehensive security system.

As an IT vendor focused on data management, we are deeply concerned and have decided to do something about this. This week we are introducing a package configured to mask up to 39 columns of sensitive data in non-production systems of Oracle e-Business Suite. It’s a pre-built, out-of-the-box solution. If you want to secure corporate and personnel data in your Oracle e-Business Suite non-production environments within a week, call us now.
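
As a rough illustration of what such a package does under the hood – and emphatically not the actual Solix rule set or the real Oracle e-Business Suite column list – a masking pass over a non-production copy can be sketched like this in Python (here against SQLite, with hypothetical table and column names):

    import sqlite3

    # Hypothetical rules: (table, column) -> masking function
    MASKING_RULES = {
        ("employees", "ssn"):          lambda v: "***-**-" + str(v)[-4:],
        ("employees", "salary"):       lambda v: 0,
        ("customers", "card_number"):  lambda v: "X" * 12 + str(v)[-4:],
    }

    def mask_non_production_copy(db_path: str) -> None:
        """Apply each masking rule in place, row by row, to the cloned database."""
        con = sqlite3.connect(db_path)
        for (table, column), rule in MASKING_RULES.items():
            rows = con.execute(f"SELECT rowid, {column} FROM {table}").fetchall()
            for rowid, value in rows:
                con.execute(f"UPDATE {table} SET {column} = ? WHERE rowid = ?",
                            (rule(value), rowid))
        con.commit()
        con.close()

    # mask_non_production_copy("ebs_test_clone.db")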

Who Moved My Data?
Application Archiving for Business Agility

Application Archiving

There’s an apocryphal story I heard about a CIO who asked his DBA to surreptitiously archive one month’s data from his ERP each week, starting with the oldest data, which stretched back five years. The DBA was to keep archiving until he reached historical data that functional users were actually accessing. As expected, the DBA archived three years’ worth of data and the functional users never noticed. The question, then, is why he had to move the data in a covert manner.

Business owners – the true owners of the data – are usually the most reluctant when they hear the word “archive”. They think data will be moved offline and access will be lost – never mind the fact that they rarely, if ever, access it at all! Perhaps the solitary word “archive” is a misnomer. Some prefix it and call it “active archiving” – meaning the data would still remain active – while others refer to it as “near-line archiving” – where data would be moved to near-line storage and still be accessed online. And there are some who avoid the word “archive” altogether and call it “data relocation”.

Whatever one may call it – once data is archived, moved or relocated – business users, and not just CIOs and DBAs, see immediate benefits in terms of application performance improvements. One such customer, a BaaN user on an Informix database, ran into serious issues whenever the application hit the table size limit. The system would be down for several days, seriously impacting manufacturing schedules. With Solix EDMS, which supports BaaN archiving, the customer now moves closed historical data to a separate database (a simplified sketch of such a move follows the list below), resulting in:

  • Improved application performance and stability – which in turn, helped to maintain their manufacturing schedules
  • Reduced future hardware costs through space reclamation after DB size reduction
  • Reduced downtime associated with frequent database maintenance
  • Reduced backup and restore downtime.
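
In spirit, the move looks something like the simplified Python-with-SQLite sketch below; the table, columns and three-year cutoff are hypothetical. A real ERP archive has to relocate complete, closed business objects – headers, lines and related tables – together, which is where the product earns its keep.

    import sqlite3
    from datetime import date, timedelta

    CUTOFF = (date.today() - timedelta(days=3 * 365)).isoformat()  # keep ~3 years online

    prod = sqlite3.connect("production.db")
    prod.execute("ATTACH DATABASE 'archive.db' AS archive")
    # Create the archive table with the same shape as production, if it is not there yet.
    prod.execute("CREATE TABLE IF NOT EXISTS archive.orders AS SELECT * FROM orders WHERE 0")
    # Copy closed, old rows across, then remove them from the production table.
    prod.execute("INSERT INTO archive.orders SELECT * FROM orders "
                 "WHERE status = 'CLOSED' AND order_date < ?", (CUTOFF,))
    prod.execute("DELETE FROM orders WHERE status = 'CLOSED' AND order_date < ?", (CUTOFF,))
    prod.commit()
    prod.close()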

A non-believer may say one swallow does not make a summer. Customers like Forbes Marshall and TRACO, in engineered-to-order manufacturing, deployed Solix EDMS for Oracle E-Business Suite archiving to trim the database size of their Bill of Material and Order Management systems on their production servers, and experienced immediate benefits on the shop floor through better application performance. These examples prove the point that archiving helps avoid a hit to the company’s top-line revenue.

Archiving is a painkiller for an organization, not a vitamin offering short-term comfort. If the CIO is the custodian of company data, let him archive it. Archiving achieves business agility with a high ROI and should be a top-priority project in a recession-hit economy.

Does ROI Matter?
Why Aren’t You Archiving?

Enterprise Applications

Nicholas Carr presented a thesis five years ago that IT no longer provided differentiating competitive advantage and therefore no longer mattered. Since then, he has been proven wrong many times over, with billions of dollars spent on new IT initiatives, simply because there was a positive, if not high, Return on Investment (ROI) on most enterprise-wide IT projects.

There is always scrutiny every time a CIO presents a project for approval. Being a relatively new discipline, Database Archiving had been one area that got relegated to a lower priority, simply because there were few implementations and even less hard evidence of ROI numbers.

A few years ago, at an Oracle conference, a customer hammered home the question “Why aren’t you archiving?” He presented a convincing case with a strong ROI. As with all new technologies, there’s a tipping point at which adoption accelerates. That tipping point may have just happened.

Gartner’s June 2008 Report on Hype Cycle for Storage Software Technologies has unequivocally stated that the ROI for implementing a Database Archiving solution is “exceptionally high”. It further goes on to add “When database archiving is used for application retirement (with retired data archived to tape in an XML format), the business impact can be even more dramatic”.

This report could not have come at a better time. With the economy close to recession, it underscores what we have been saying all along. Archive not just to improve application performance; archive to improve business agility. Archive not just to reduce storage costs; archive to reduce energy costs. Archive not just to ensure data retention; archive to insure against expensive litigation. The high ROI is indisputable.

And for application retirement, because the core of the Solix archiving solution is a metadata-based repository, we not only move legacy data to an XML format, but can also move it to the database of the new application before finally retiring it to Tier 3 storage. Archiving before upgrading or migrating reduces time, implementation costs and the carry-over of “dirty” data.
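
A toy sketch of the XML step might look like the following Python; the database and table names are hypothetical and this is not the Solix export format:

    import sqlite3
    import xml.etree.ElementTree as ET

    def export_table_to_xml(db_path: str, table: str, out_file: str) -> None:
        """Write every row of a table out as XML for long-term, application-independent retention."""
        con = sqlite3.connect(db_path)
        con.row_factory = sqlite3.Row
        root = ET.Element(table)
        for row in con.execute(f"SELECT * FROM {table}"):
            rec = ET.SubElement(root, "record")
            for col in row.keys():
                ET.SubElement(rec, col).text = "" if row[col] is None else str(row[col])
        ET.ElementTree(root).write(out_file, encoding="utf-8", xml_declaration=True)
        con.close()

    # export_table_to_xml("legacy_app.db", "gl_journal_lines", "gl_journal_lines.xml")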

Mergers and Applications Retiring

Enterprise Applications

A depository, implementing anti-money-laundering laws, recently sent rejection letters to many capital market investors, who must be registered with one of the two depositories in India. The reason given: a computer glitch. Isn’t that familiar? We heard the same thing when one of the largest banks acquired a credit card issuer in 2005. A number of affected customers even blogged about their bitter experiences. The bank’s IT problems had started earlier, when it acquired another bank in 2004 and struggled to integrate the systems, which first meant a rip-and-replace of the acquired bank’s system. The subsequent acquisition of the credit card issuer compounded the IT integration problem.

On the other hand, last week I heard a completely different story after visiting a tax collection department. A merger of two municipalities and the addition of more sources of tax collection meant their systems needed to change. No glitches this time. There wasn’t a single complaint of citizens getting a wrong tax notice or a valid refund claim being rejected.

This got me thinking. Why should a bank or a depository – with vast resources and experience in managing large systems – run into problems, while a government agency with limited capital budgets and a bureaucratic set-up gets it right?

I don’t believe there’s a simple answer to this. But the tax collection agency’s approach may be revealing. They adopted the principle of application modernization rather than rip-and-replace. It may surprise a few that a government organization is an early adopter of application modernization; it is less surprising if one considers that a rip-and-replace is always more difficult for them from a budget approval standpoint. Over the years, this tax collection agency has been modernizing what was once a client-server application into one that is web-driven and event-driven; their entire business process began changing from the late ’90s with the advent of the Internet, the merger of the two municipalities and the addition of more sources of tax collection. They never had application switch-over downtime; they did not have to go through a BPR or user retraining. For most, it remained business as usual.

The reason for complexity – which perhaps also explains the difference between the bank mergers and the municipal merger – is that commercial organizations in the same industry and region are likely to have completely different business processes, while government departments in the same state and country are likely to have similar ones. In my example, although the tax departments of the two municipalities were running applications from different vendors, their integration was simpler than what two commercial organizations may go through even if, hypothetically, they run the same ERPs and core banking systems. If the systems are different, and run on different databases, then what the banks did – application retirement, for one – may seem nearly inevitable.


© Solix Technologies, Inc.