Gartner’s Technology Predictions for 2011


I recently attended Gartner Expo, where Gartner’s experts discussed the top 10 technologies and trends they believe will be strategic for most organizations in 2011. What piqued my interest are the following trends, and the impact Solix EDMS can have on them:

Cloud Computing: Cloud computing services exist along a spectrum from open (public) to closed (private). The next three years will see the delivery of a range of cloud service approaches that fall between these two extremes. Vendors will offer packaged private cloud implementations that deliver the vendor’s public cloud service technologies (software and/or hardware) and methodologies (i.e., best practices to build and run the service) in a form that can be implemented inside the consumer’s enterprise, much as Google does today with Gmail.

At Solix, we are seeing increasing interest from prospects who want to buy data management as a service. Solix ExAPPS, the industry’s first application retirement appliance, is seeing strong demand from this surge. I won’t be surprised to see the majority of IT purchases made as cloud services within a couple of years.

Next Generation Analytics: The leading edge here is real-time simulations and models that predict future outcomes to support individual business decisions, rather than just analysis of results of past actions after the fact. While this may require significant changes to existing operational and business intelligence infrastructure, it promises significant improvements in business results.

Information Lifecycle Management has an important role to play here, identifying inactive data and moving it to lower storage tiers. This lets these demanding new predictive tools focus on the active data that matters, rather than wading through a morass of historical information with no bearing on the present and future business environment.
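To make that concrete, here is a minimal sketch of the kind of age-based tiering rule involved. The field names and the one-year threshold are hypothetical illustrations, not how Solix EDMS is actually configured:

```python
from datetime import datetime, timedelta

# Hypothetical ILM tiering rule: anything untouched for more than
# ACTIVE_WINDOW is a candidate for a lower storage tier.
ACTIVE_WINDOW = timedelta(days=365)

def assign_tier(last_accessed: datetime, now: datetime) -> str:
    """Classify a record as 'active' (stays in production, feeds the
    analytics engine) or 'archive' (moves to cheaper storage)."""
    return "active" if now - last_accessed <= ACTIVE_WINDOW else "archive"

records = [
    {"id": 1, "last_accessed": datetime(2010, 11, 2)},
    {"id": 2, "last_accessed": datetime(2007, 3, 15)},
]
now = datetime(2010, 12, 1)
for r in records:
    print(r["id"], assign_tier(r["last_accessed"], now))
# 1 active
# 2 archive
```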

Storage Class Memory: Gartner sees huge use of flash memory in consumer devices, entertainment equipment, and other embedded IT systems. In business, flash memory offers the best attributes of RAM and very high-speed storage, with a list of advantages of its own: space, heat, performance, and ruggedness among them. It delivers near-RAM performance with the huge advantage of persistence: flash survives a power outage, so when power is restored the device starts up immediately where it left off. This makes it a new, premium tier that lets you keep the most valuable, most active data in persistent memory rather than out on a disk drive: the data is instantly available yet protected from crashes. Flash is already being used as a “Tier 0” in applications, primarily in the financial industry, that demand extremely fast reads and writes of large amounts of sensitive data.

The disadvantage is cost. Flash will carry enough of a price premium for the next several years to make it impractical for storing anything but the most high-leveraged data, with the rest archived or retired. A strong ILM environment with effective data tiering will be essential to realizing maximum advantage from a flash memory investment.
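As a back-of-the-envelope illustration of that tiering decision, consider a toy placement policy that fills a fixed flash budget with the most heavily accessed data first. All names and numbers below are invented for illustration:

```python
# Toy Tier 0 placement: put the hottest data in flash, let the rest
# fall to disk or archive (all figures are illustrative).
datasets = [
    ("trades",       50, 9000),  # (name, size_gb, reads_per_hour)
    ("positions",    20, 7000),
    ("history_2008", 400, 3),
    ("audit_logs",   300, 1),
]
flash_budget_gb = 100

placement = {}
remaining = flash_budget_gb
# Sort by access density (reads per GB), hottest first.
for name, size_gb, reads in sorted(datasets,
                                   key=lambda d: d[2] / d[1],
                                   reverse=True):
    if size_gb <= remaining:
        placement[name] = "tier0_flash"
        remaining -= size_gb
    else:
        placement[name] = "disk_or_archive"

print(placement)
# {'positions': 'tier0_flash', 'trades': 'tier0_flash',
#  'history_2008': 'disk_or_archive', 'audit_logs': 'disk_or_archive'}
```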

The BRICs Decade


According to Goldman Sachs, over the last decade the BRICs (Brazil, Russia, India, and China) made their mark on the global economic landscape, accounting for more than a third of world GDP growth. In the process their economies have grown from one-sixth to almost a quarter of the world economy. And this may be only the start. In “Dreaming with BRICs: The Path to 2050”, Goldman predicts that these four rising economic powerhouses will continue their strong growth and that their combined economic wealth will exceed that of the G6 (U.S., U.K., Italy, France, Germany, and Japan) by 2040.

We have been working on expanding our footprint to the BRIC nations and Japan. Our goal was to reach the world’s second- and third-largest economies, as well as its fastest-growing ones. As part of that effort, I happened to meet Merrily Kautt, who teaches at the University of Colorado Denver Business School. She was looking for a real-life business research project for one of her International Marketing classes, and we were happy to accept when she offered to have her class research how to expand the Solix footprint into China and Japan. The result was an impressive report combining the social, cultural, political, and historical aspects of these countries, culminating in practical guidance on how to do business in them. The students gave an excellent presentation to our senior management, which among other things convinced us to hire two of them for our business development team.

This research also contributed to the launch of our operations in China (www.solixchina.com) last month with the opening of a Chinese support center. At the same time, we acquired our first Japanese customer through one of our global partners, exactly the kind of expansion we have always wanted.

If you are an SI/reseller with infrastructure offerings in growing economies, partner with us. According to Gartner, enterprise information archiving (EIA) will become a key infrastructure component for enterprises by 2013. Our partner programs are designed to help you extend your business into this high-growth market.

Less Data, Faster Recovery


As the BP oil spill continues to blacken waters in the Gulf of Mexico and beyond, Lloyd’s of London, the world’s largest insurance market, has estimated that total claims from the explosion of the Deepwater Horizon oil rig could run into multiple billions. In the Louisiana wetlands, the oil has found its way around the protective booms to reach wild cane fields, discoloring the base of the green cane and fouling the air with a horrendous smell. Pictures of pelicans covered in oil and dead dolphins on oil-fouled beaches are circulating on the Web. A third of the Gulf has been closed to fishing, and tourists are staying away from the beaches. And, almost forgotten in the news, 11 oil workers were killed in the explosion.

Newly released internal documents show BP PLC estimated that 4.2 million gallons of oil a day could gush from the damaged well in the Gulf of Mexico if all equipment restricting the flow were removed. Democratic Massachusetts Congressman Ed Markey released documents showing BP’s worst-case estimate of between 2.3 million and 4.2 million gallons of oil per day; the current worst-case estimate of what is actually leaking is 2.5 million gallons a day.

BP has lost 65% of its market value, has established a $20 billion disaster fund, is spending billions more trying to deal with the emergency and may face bankruptcy before the emergency is over. And all because it skimped on safety equipment and did not plan for a disaster that was inevitable at some time, in some place, given the amount of drilling the company is doing.

What can IT managers learn from this?

Disasters come in many forms, be they natural (floods, hurricanes, tornadoes, earthquakes) or man-made (hazardous material spills, infrastructure failures, terrorism). The central role of information technology in business-critical functions, combined with the transition to an around-the-clock economy, has made protecting an organization’s applications and IT infrastructure during a disruption a vital business priority. Of companies that experience a major loss of business data, 43% never reopen, 51% close within two years, and only 6% survive long-term. Driven in part by these grim statistics, most large companies spend 2% to 4% of their IT budgets on disaster recovery planning to avoid much larger potential losses in a disaster.

When IT discusses disaster recovery, the first thought is usually off-site data backups. That is, of course, a vital strategy. But many organizations constantly battle backups that overrun their windows, and when they test their recovery plans they are dismayed at how long it takes to reload their central business databases. In an economy where time is money, these are costly problems.

So how can you reduce backup windows and the time required to restore vital transactional databases in the event of a disaster? The answer is to reduce the size of those production applications: first, by archiving inactive data, and second, by decommissioning/retiring applications that are inactive and no longer needed for day-to-day business activities. According to industry analysts, 80% of the data in large corporate databases is no longer needed to support day-to-day business, and 10% of the applications in an unoptimized portfolio are candidates for application retirement. So how do you remove this unneeded data and these applications from the data center while preserving them for research and compliance? The answers are database archiving and application retirement, which shrink both the backup windows and the volume of data that needs to be restored at all.
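For the archiving half of that answer, a minimal sketch using SQLite shows the basic mechanics. The table name, schema, and cutoff date are hypothetical; a real archive must also preserve referential integrity and metadata, which is what purpose-built tools handle:

```python
import sqlite3

CUTOFF = "2008-01-01"  # rows older than this are treated as inactive

prod = sqlite3.connect("production.db")
arch = sqlite3.connect("archive.db")
for db in (prod, arch):
    db.execute("CREATE TABLE IF NOT EXISTS orders "
               "(id INTEGER, placed_at TEXT, amount REAL)")

# 1. Copy inactive rows into the archive database.
rows = prod.execute("SELECT id, placed_at, amount FROM orders "
                    "WHERE placed_at < ?", (CUTOFF,)).fetchall()
arch.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
arch.commit()

# 2. Delete them from production and reclaim the space, so the
#    database that gets backed up (and restored) is actually smaller.
prod.execute("DELETE FROM orders WHERE placed_at < ?", (CUTOFF,))
prod.commit()
prod.execute("VACUUM")
```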

Virtualization Prerequisite


Clearly the “next big thing” in the data center is virtualization. But we believe that IT shops that virtualize wholesale, without first retiring applications that they no longer need using tools like Solix EDMS or Solix ExAPPS, are missing a large chunk of the potential ROI.

The real benefits of virtualization are the substantial energy, cooling, and space savings, which have allowed organizations to extend the lives of their data centers by years. The big offenders in the data center are underutilized servers: an idle server still draws 60%-70% of its maximum power and cooling requirements. By raising server utilization from the 15% typical of pre-virtualized environments to the 85% normally seen in virtualized systems, the CIO can eliminate many of the boxes on the floor, cutting power and cooling demand while making room for future growth.
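The arithmetic behind that claim is easy to check. A back-of-the-envelope calculation, with illustrative server counts and wattages rather than measured figures:

```python
# Back-of-envelope server consolidation math (illustrative numbers).
servers_before = 100           # physical boxes at ~15% utilization
util_before, util_after = 0.15, 0.85
max_power_w = 400              # nameplate power per server
idle_fraction = 0.65           # an idle server draws ~60-70% of max

# Total useful work stays constant, so the new box count is:
servers_after = round(servers_before * util_before / util_after)  # 18

# Lightly loaded servers burn most of their power doing nothing,
# so power falls nearly in proportion to the box count.
power = lambda n, u: n * max_power_w * (idle_fraction
                                        + (1 - idle_fraction) * u)
saved = 1 - power(servers_after, util_after) / power(servers_before,
                                                     util_before)
print(servers_after, f"{saved:.0%} power saved")  # 18 76% power saved
```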

But virtualizing software that the organization no longer needs is not the best answer for those applications, and actually can make them harder to identify as they get lost in the overall virtualized environment. Often these obsolescent applications are forgotten and only found when someone does a systematic audit of what is running on each server or blade in the data center.

How important is this? Gartner estimates that 10% of the applications in an unoptimized portfolio are candidates for retirement, while an additional 33% are candidates for migration or rationalization. Why? All applications eventually outlive their usefulness. Business needs and processes change, new technologies supplant old ones, and the enterprise acquires new divisions with duplicate processes running on different software. But those old applications are often never shut down: users still need access to the data, some prefer the old application, and others find it better for one or two lingering tasks.

Virtualizing these obsolescent applications still leaves them using up valuable resources and costing the enterprise money in software licensing, maintenance, support, and associated personnel. The better answer is to use application portfolio management (APM) to identify these candidates for oblivion and retire them first. This can have a major impact on resource use even after virtualization, maximizing ROI. APM can then become the basis for managing the application environment in a more sophisticated manner going forward, for instance by identifying applications that might be replaced by SaaS and those that should not be virtualized at all.
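A toy version of that APM triage might look like the following. The fields and the scoring rule are invented for illustration, not how any particular APM product works:

```python
# Toy application-portfolio triage: rank apps as retirement
# candidates by carrying cost per active user (illustrative data).
apps = [
    {"name": "legacy_hr",   "monthly_users": 3,    "annual_cost": 120_000},
    {"name": "erp_core",    "monthly_users": 4000, "annual_cost": 900_000},
    {"name": "old_billing", "monthly_users": 0,    "annual_cost": 80_000},
]

def retirement_score(app):
    # Higher score = better retirement candidate: costly but unused.
    return app["annual_cost"] / (app["monthly_users"] + 1)

for app in sorted(apps, key=retirement_score, reverse=True):
    print(app["name"], round(retirement_score(app)))
# old_billing 80000   <- retire first
# legacy_hr   30000
# erp_core    225     <- keep
```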

The problem, of course, is what happens to the data. The solution is Solix ExAPPS, which moves the data from retired applications, along with its full metadata, into its repository at nearly 90% compression and makes that data available to authorized users through a single viewer, replacing multiple large applications with one efficient tool and taking the ROI of virtualization to the max.

