Seven Billion and Counting

Business

The world population has reached 7 billion and is heading toward 9 billion by 2050. But the story is more complex than that single sentence implies. Population demographics are changing. The native-born population of the first world, excluding naturalized citizens and resident aliens, is level or in some cases shrinking, and some countries, such as Germany, are creating incentives for population growth. In contrast, much of Africa and Asia, excluding Japan, is experiencing huge growth. Barring some unforeseen major shift, experts expect India, which presently has the second largest population, to pass China in 2025. Another implication of these trends is the impact on both culture and religion. Worldwide, for instance, the populations of certain religions are growing a lot faster than others; needless to say, this will have an impact on our geopolitics. Further, the ratio of men to women is also changing worldwide, as many third-world cultures prefer male children.

Perhaps the most important implication of this news is that basic resources – food, water for both personal use and agriculture, and energy in particular – are being strained by continued population growth. Both China and India, for instance, are tapping deep, ancient aquifers that do not refill in order to support the increased agriculture needed to feed their populations. When those aquifers inevitably run dry, these countries will need to find replacement water sources or face a decrease in food production, with negative implications for supporting their populations. Every additional individual requires the production of more food, water, and energy, which generates more waste and pollution, further escalating the struggle over natural resources and space.

This has also been a year of increasingly extreme weather: widespread drought, enormous forest fires, and especially historic rainfall and flooding. This year has seen unusual flooding throughout the world, including the Mississippi Valley in the United States, Queensland in Australia, and, currently ongoing, Bangkok, the capital of Thailand. These conditions will further impact our productivity and put more strain on the system.

The bottom line: We have to do more with less. We need to further educate our youth on the problems of population growth and on the need to establish a balance between population and the resources available; the solutions will always lie in empowering youth to make the right decisions. We need to explore seawater purification to increase the water supply for agriculture and personal use, and solar and wind power to supplement and eventually replace more polluting forms of power generation. On my recent trip to UC Denver, I saw their solar-powered light bulb invention. This is great, as the bulk of third-world countries still use fuel for light. It is a perfect example of doing more with less; we need more of this to handle population growth.

Solix is all about efficiency and doing more with less: organizing data and optimizing CPU, memory, storage, and therefore energy usage. This has an impact on the entire infrastructure – network, hardware, and software resources, and of course on backup/recovery operations.

What can we learn from HP?

Business

Our prospects often ask why they should select Solix when major players like IBM and HP also offer information lifecycle management (ILM) products that duplicate many of the basic features of our product. The answer is that we are singularly focused on ILM, both in terms of service and advanced development. IBM and HP are certainly good companies, but for them ILM is a minor part of their overall product set. ILM customers of IBM and HP will never get attention from the company CEO (unless they are also major users of many other products from those companies). And while these companies certainly have vast resources compared to Solix, those resources are not focused primarily on their ILM products but rather on their larger product sets. As a result, small vendors tend to provide more advanced, best-of-breed products, and that certainly is true of Solix.

Unfortunately, the normal vendor selection process fails to recognize these advantages. It generally remains mired in bureaucratic processes. RFPs fail to differentiate among mature products or to identify innovators. They usually are based on requirements that buyers can envision now, often missing the vision for the future. Their feature lists look quite similar to the capabilities that vendors can deliver in current releases, rather than more visionary features that don't exist in many products today. The result: newer, innovative products often aren't considered, with buyers citing viability and track-record concerns.

Leaving aside the questions of whether HP’s recent reorganization is good or bad for HP, we can derive several specific lessons from HP’s recent announcements that apply to the question of small vendors in the marketplace:

  • Small vendors are often eliminated from consideration as suppliers based on long-term viability concerns. HP’s sudden decision to kill its webOS-based products, including both its new tablets and Palm smartphones, just weeks after announcing those tablets and making public statements about its commitment to competing in the tablet market, however, shows that big vendors can simply shut down whole product sets without warning. How would you feel if your company had committed to those webOS tablets based on HP’s overall relationship with your company, its size and stability, and assurances from high-level executives of its long-term commitment to that technology?
  • Large companies traditionally have problems keeping up with the leading edge in technology innovation in part because their very size creates inertia. Innovation in large companies, as HP has demonstrated, often focuses on major business moves such as HP’s announcement that it will sell its entire PC division and get out of the PC business while moving into software by acquiring Autonomy. Customers of HP desktops and laptops have to wonder who their supplier will be next year.
  • Small companies with a precise focus often are less vulnerable, because they are forced to differentiate through innovation. Japanese car companies were smaller when they started, but they brought a new design philosophy and better technology at lower cost. Because they had never manufactured big, high-performance but gas-guzzling overhead-cam engines, they were free to introduce new, more economical engine technologies. As another example, Tesla has created the first really practical electric car in terms of overall performance, forcing the larger companies to embrace electric power. And the customers of those small companies benefit from that innovative spirit, not just today but into the future. A small vendor with one product set is totally committed to that product for its survival and must keep innovating on that platform to stay ahead of the competition and maintain its differentiation.

The overall lesson here is that vendor size does not guarantee stability of a specific product. A small vendor that is focused on a vital product will be closer to its customers and understand their needs for that product better. It will put more effort into meeting those needs, including those that the customers themselves may not understand, because its entire organization from top to bottom is focused on that product and on how it best fits into the overall IT ecosystem. It will provide excellent service because every customer is vital to it. And it will not suddenly abandon its core product set for something entirely new.


One Percent Clone

Data Management

Here is a case study of one of our customers: how they used the Solix EDMS Test Data Management tool set to automate the creation of subsets of production databases for test and development, increasing productivity, saving money, and improving security.

We recently completed an engagement with a large public-sector customer based in Washington, DC, running an Oracle E-Business Suite implementation on a 16 TB database. We were able to automate the creation of data subsets for test and development; the resulting database is roughly 1% of the original size (180 GB, to be precise) while still meeting the requirements of the test environment. That is a 99% storage saving per copy, and depending on the number of copies, the total savings multiply. If you further consider the savings in backup and CPU use, along with the associated energy and cooling, the overall savings are substantial. The smaller the dataset, the faster it is to load, back up, and restore, and the fewer hardware resources it requires overall. This also improves security: the subset contains less data than the fully operational database, and sensitive data has been masked by Solix EDMS Data Masking.
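
To make the arithmetic concrete, here is a small illustrative calculation. The 16 TB and 180 GB figures come from the engagement above; the copy counts are hypothetical:

    # Illustrative arithmetic only: production and subset sizes are from
    # the engagement above; the copy counts are hypothetical examples.
    FULL_DB_GB = 16 * 1024   # 16 TB production database, in GB
    SUBSET_GB = 180          # Solix EDMS subset size, in GB

    for copies in (1, 5, 10):
        full = FULL_DB_GB * copies
        subset = SUBSET_GB * copies
        saved_pct = 100 * (1 - subset / full)
        print(f"{copies:2d} copies: full clones use {full / 1024:6.1f} TB, "
              f"subsets use {subset / 1024:5.2f} TB ({saved_pct:.1f}% saved)")

With ten copies, full clones would consume 160 TB where the subsets need under 2 TB.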

An adequate test database must contain data that meets the needs of the selected test cases, in sufficient quantity to meet the test requirements. To test an application correctly, the test environment must match the production system as closely as possible, and it must meet the needs of the development, testing, and training environments, each of which may have a different level of requirements. Determining manually how much data, and which data, each environment needs is complicated and time consuming. Because IT staff lack the time, many shops simply clone their entire production database for tests and end up creating data-breach exposures like the one Sony is currently facing.
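
Solix EDMS automates this analysis, but to illustrate the underlying idea, here is a minimal, hypothetical Python sketch of "driving table" subsetting: choose a slice of a parent table and copy only the child rows that reference it, so the subset stays referentially intact. The schema and the 1% selection rule are invented for illustration, and this is not the Solix implementation:

    import sqlite3

    # Hypothetical two-table schema: ORDERS is the "driving" table and
    # ORDER_LINES references it.
    SCHEMA = """
        CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT);
        CREATE TABLE order_lines (line_id INTEGER PRIMARY KEY,
                                  order_id INTEGER REFERENCES orders);
    """

    src = sqlite3.connect(":memory:")  # stand-in for the production database
    src.executescript(SCHEMA)
    src.executemany("INSERT INTO orders VALUES (?, ?)",
                    [(i, f"customer-{i}") for i in range(1, 1001)])
    src.executemany("INSERT INTO order_lines VALUES (?, ?)",
                    [(i, (i % 1000) + 1) for i in range(1, 5001)])

    dst = sqlite3.connect(":memory:")  # the test subset database
    dst.executescript(SCHEMA)

    # 1. Copy a ~1% slice of the driving table.
    dst.executemany("INSERT INTO orders VALUES (?, ?)", src.execute(
        "SELECT order_id, customer FROM orders WHERE order_id % 100 = 0"))

    # 2. Copy only the child rows that reference the chosen slice, so
    #    referential integrity is preserved in the subset.
    dst.executemany("INSERT INTO order_lines VALUES (?, ?)", src.execute(
        """SELECT l.line_id, l.order_id FROM order_lines l
           JOIN orders o ON o.order_id = l.order_id
           WHERE o.order_id % 100 = 0"""))
    dst.commit()

    print(dst.execute("SELECT COUNT(*) FROM orders").fetchone()[0], "orders;",
          dst.execute("SELECT COUNT(*) FROM order_lines").fetchone()[0],
          "order lines in the subset")

A real tool must walk the full foreign-key graph across hundreds of tables; the sketch only shows the shape of the problem.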

Test instances, other than those used for load testing, can work with smaller subsets. In this case, for instance, using a clone of the entire 16 TB database would create tremendous delays in testing. Just loading the full database would take significantly longer than loading the subset Solix created, and each test that involved processing the data would be similarly time-consuming. It would also require essentially a full duplicate of the production hardware, starting with enough storage to hold the 16 TB database. Studies show that 60% of application development and testing time is devoted to data-related tasks that, at best, have only a peripheral relationship to the applications actually being tested. All of this time, and the extra resources required by the full database, are therefore wasted.

IT organizations looking for ways to improve productivity need to attack this problem. The best answer is to use an automated tool set to capture test data requirements and create a subset of the production database that meets test needs while minimizing the size of the test database. That tool set should also allow you to mask sensitive data in the subset, to provide the highest level of security for that data. This is exactly what the Solix EDMS tool set does, and it is how one of the largest data centers running one of the most popular enterprise applications is using Solix EDMS Test Data Management.
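
To illustrate what masking means in practice, here is a hypothetical sketch that replaces sensitive values with deterministic surrogates, so joins and test cases still line up while the original values never reach the test database. The salt and field names are invented, and this is not the Solix EDMS Data Masking algorithm:

    import hashlib

    # Replace sensitive values with deterministic surrogates: the same
    # input always maps to the same surrogate, so masked data stays
    # consistent across tables, but the original value is never exposed.
    SALT = b"per-environment-secret"  # in practice, keep out of source control

    def mask(value: str, prefix: str) -> str:
        digest = hashlib.sha256(SALT + value.encode()).hexdigest()[:10]
        return f"{prefix}-{digest}"

    production_row = {"name": "Jane Doe", "ssn": "123-45-6789", "city": "Austin"}
    masked_row = {
        "name": mask(production_row["name"], "NAME"),
        "ssn":  mask(production_row["ssn"], "SSN"),
        "city": production_row["city"],  # non-sensitive columns pass through
    }
    print(masked_row)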

Solix EDMS Test Data Management

Enabling the Data Center of Tomorrow

Cloud Enabling Solutions

Today’s enterprise data centers are bloated, out of control, stuffed with extra servers, storage, and networking devices maintained at great expense to meet the requirements of maximum loads that only happen for a few days a month or one month out of the year. As a result, they use much more power, cooling, and space than IT can afford in this era of limited resources and soaring energy costs. The data center of tomorrow needs to be more flexible and efficient, use less space and power, and be easier to manage and scale.

To get there, analysts predict enterprises will focus their infrastructure spend on virtualization, data center consolidation, and data center migration in 2011, while the leading organizations are already moving into private and hybrid public/private cloud architectures.

We see major benefits for enterprises and medium-sized companies in building a hybrid cloud architecture on top of a virtualized data center environment. Chief among those are:

  • Shifting costs from capex to opex by moving part of the data center to the public cloud, thus freeing scarce capital for more important investments.
  • Optimizing human capital and IT resources by focusing internal resources on critical applications and data while moving more peripheral operations either to SaaS providers or public cloud infrastructure provided by companies like Amazon, Google, and Microsoft.
  • Consolidating the data center by facilitating resource sharing through a private cloud infrastructure sized to support normal operations while drawing on public cloud infrastructure when needed to meet high demand. This will reduce operating costs such as power and cooling, maintenance, and administration as well as capex investments while increasing SLA compliance.
  • Transferring risk, particularly the risk of maintaining high availability during high-demand periods and of mis-estimated workloads, from internal IT to the cloud service provider.
  • Ensuring availability at lower cost by increasing flexibility, with an architecture that automates shifting resources to meet increased demand, both by reallocating internal compute, storage, and network resources and by drawing on public cloud resources as required; a minimal sketch of this kind of policy follows this list.
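
As a minimal sketch of the kind of automated policy the last two points describe, with invented capacity numbers standing in for a real orchestration API:

    # Hypothetical burst-to-public-cloud policy. The capacity figure and
    # the placement logic are placeholders for whatever orchestration API
    # an actual deployment would use.
    PRIVATE_CAPACITY = 100  # units of work the private cloud can serve

    def place_load(demand: int) -> dict:
        """Serve normal load privately; burst the overflow to the public cloud."""
        private = min(demand, PRIVATE_CAPACITY)
        public = max(0, demand - PRIVATE_CAPACITY)
        return {"private_units": private, "public_units": public}

    for demand in (60, 100, 140):  # e.g. normal day, busy day, month-end peak
        print(demand, "->", place_load(demand))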

Creating a successful hybrid cloud architecture, however, takes some thought. The basic question that must be answered is which applications and data should be retained in-house and which should be entrusted to the public cloud.

Our recommendations are:

  1. Move active applications to a private cloud: According to Gartner, private cloud computing is gaining interest among large enterprises, with investments in private clouds increasing steadily through 2014. The biggest beneficiary will be the end-customer, as the growth of private cloud changes the relationship between the customer and the IT organization. Metrics will be based on service delivery. This will not necessarily be an easy change. It will require management support, process changes, funding changes, service standardization, and most important, changes in culture. But the advantages in increased resource utilization, savings in both capex and opex, and increased flexibility to meet demand changes, will more than pay for the investment.
  2. Move semi-active applications to the public cloud: Applications and data that are less active, perhaps read-only, with lower SLAs that will not be impacted by access through the Web, and with lower security needs, are candidates for public cloud platforms like Amazon EC2 and Microsoft Azure.
  3. Move inactive data to Solix ExAPPS: Moving inactive, historic data to either the public or private cloud only increases your cost, since you are paying by GB/CPU, while providing little benefit, since this data is rarely if ever used. The better strategy is to retire this data using Solix ExAPPS, which automates the process based on customizable business rules while keeping the data available in a secure, read-only environment that meets compliance requirements for business data preservation. A toy sketch of this three-way placement decision follows.
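
Here is that toy sketch; the age thresholds are invented for illustration, and a real policy would also weigh SLAs, security, and compliance as described above:

    from datetime import date, timedelta

    # Toy classification following the three recommendations above. The
    # 90-day and two-year thresholds are invented for illustration.
    def placement(last_accessed: date, today: date = date(2011, 11, 1)) -> str:
        age = today - last_accessed
        if age <= timedelta(days=90):
            return "private cloud (active)"
        if age <= timedelta(days=730):
            return "public cloud (semi-active, lower SLA)"
        return "retire via Solix ExAPPS (inactive, compliance-ready)"

    for last in (date(2011, 10, 15), date(2011, 1, 5), date(2008, 6, 30)):
        print(last, "->", placement(last))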

Solix EDMS and Solix ExAPPS can help you create your Data Center of Tomorrow.

