Solix Big Data Suite
Data Growth Crisis

According to a recent survey by Gartner, data growth is now the leading data center infrastructure challenge.[1] Left unchecked, data growth degrades application performance, increases costs and threatens compliance objectives.

“While all the top data center hardware infrastructure challenges impact cost to some degree, data growth is particularly associated with increased costs relative to hardware, software, associated maintenance, administration and services,” said April Adams, research director at Gartner.[2]

Structured data growth can strip entire data centers of cooling and power capacity. Data replication and disaster recovery processes suffer because ever-larger volumes of data are harder to move. System availability is reduced as mission-critical batch processes can no longer execute within scheduled times, and the “outage windows” necessary to convert ERP data during upgrade cycles extend from hours to days.

Unstructured data growth poses just as serious a challenge. Email, images, video, machine-generated data and social files are equally critical to business success, and unstructured data is being created and stored at an even higher rate.

Equally important, increasing user demand for specialized analytics to mine enterprise data for better business results has compounded the data growth challenge. Gartner has remarked that, “by 2016, 75% of structured data archiving applications will incorporate support for big data analytics.”[3]

Solix Big Data Suite

The explosion of both structured and unstructured data is driving adoption of a new enterprise blueprint that enables growing amounts of enterprise data to be stored more reliably and at the lowest possible cost. The Hadoop Distributed File System (HDFS) has rapidly emerged as the leading nearline storage platform because it provides secure, stable storage for structured and unstructured enterprise data with enhanced access. Moreover, Apache Hadoop represents the lowest-cost alternative for highly scalable, bulk storage of enterprise data.

The Solix Big Data Suite leverages an Information Lifecycle Management (ILM) framework and Apache Hadoop to store less frequently accessed enterprise data in a nearline repository. Moving less frequently accessed data to a nearline repository improves production application performance, reduces infrastructure costs and enables powerful big data analytics opportunities.

Solix Enterprise Archiving and Solix Enterprise Data Lake applications utilize best-practice ILM processes to ingest and store both structured and unstructured enterprise data. Data retention is based on policies and business rules to ensure proper compliance and control. Universal data access is maintained through structured reporting as well as full-text search.

Solix Enterprise Archiving

Solix Enterprise Archiving and application retirement with the Solix Big Data Suite improve enterprise application performance and reduce infrastructure costs. Online enterprise application data is first moved and then purged from its source location according to ILM policies, ensuring that governance, risk and compliance objectives are met.

Data archiving best practice requires that MOVE and PURGE processes be coordinated and validated. Solix Enterprise Archiving ensures proper data governance because enterprise data is ingested and stored under retention management policies, with support for custom business rules. Archived data is classified against security and compliance requirements such as legal hold, and universal access is provided to business users through structured reports and full-text search of business objects.
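The coordination described above, move first, validate the copy, and only then purge the source, can be sketched in a few lines. This is a minimal illustration, not Solix's implementation; the in-memory stores and record names are hypothetical stand-ins for a production database and a nearline HDFS archive.

```python
import hashlib

# Hypothetical stand-ins for a production table and a nearline archive;
# in a real deployment these would be database and HDFS endpoints.
production = {"rec1": b"order data", "rec2": b"invoice data"}
archive = {}

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def move_and_purge(record_id: str) -> bool:
    """Copy a record to the archive, validate the copy, then purge the source.

    The PURGE step runs only after the archived copy's checksum matches
    the source, so a failed MOVE can never destroy data.
    """
    data = production[record_id]
    archive[record_id] = data                           # MOVE
    if checksum(archive[record_id]) != checksum(data):  # VALIDATE
        del archive[record_id]
        return False
    del production[record_id]                           # PURGE
    return True

move_and_purge("rec1")
print("rec1" in archive, "rec1" in production)  # True False
```

The ordering is the whole point: because validation sits between the move and the purge, the worst-case failure leaves a duplicate record rather than a lost one.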

Solix Enterprise Data Lake

A central challenge for enterprise data warehouse (EDW) platforms is to deliver highly specific data views that meet the needs of business users, rather than canonical top-down enterprise views that may or may not satisfy end users’ requirements. The Solix Enterprise Data Lake reduces the complexity and processing burden of staging EDW and analytics applications, and it provides highly efficient, bulk storage of enterprise data for later use when it is needed.

The Solix Enterprise Data Lake provides a copy of production data and stores it “as is” in bulk, to be better described and distilled later. This simple COPY process eliminates the need for heavy extract, transform, load (ETL) processing during ingestion. Once resident within HDFS, enterprise data may be better described or transformed later for use with business analytics applications such as those available from the Solix App Store.
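The copy-now, describe-later pattern above is often called schema-on-read: raw records land in the lake untransformed, and structure is applied only when an analytics job needs it. The sketch below illustrates the idea under assumed data; the JSON rows, field names and the `totals_by_region` job are hypothetical, not part of the Solix product.

```python
import json

# Hypothetical raw source rows, copied "as is" -- no transformation at ingest.
source_rows = [
    '{"order_id": 1, "amount": "19.99", "region": "EMEA"}',
    '{"order_id": 2, "amount": "5.00", "region": "APAC"}',
]

# Ingest: a plain COPY into the lake, stored as raw text (no ETL).
data_lake = list(source_rows)

# Later, an analytics job applies structure only when the data is needed.
def totals_by_region(raw_rows):
    totals = {}
    for row in raw_rows:
        rec = json.loads(row)          # schema applied at read time
        region = rec["region"]
        totals[region] = totals.get(region, 0.0) + float(rec["amount"])
    return totals

print(totals_by_region(data_lake))  # {'EMEA': 19.99, 'APAC': 5.0}
```

Because no schema is imposed at ingest, the same raw copy can serve future analytics jobs whose requirements were unknown when the data was captured.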

The Solix Enterprise Data Lake employs an ILM framework to meet governance, risk and compliance objectives and to ensure that best practices for data retention and classification are deployed. ILM policies and business rules may be pre-configured to meet industry-standard compliance objectives such as COBIT, or custom-designed to meet more specific requirements.
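A retention policy of the kind described above boils down to two questions: has the record's retention period elapsed, and is it free of legal hold? A minimal sketch of that evaluation follows; the policy names, retention periods and the `is_purgeable` helper are illustrative assumptions, not Solix-defined values.

```python
from datetime import date

# Hypothetical ILM retention policies; classes and periods are illustrative.
POLICIES = {
    "financial": {"retain_years": 7, "legal_hold": False},
    "email":     {"retain_years": 3, "legal_hold": False},
}

def is_purgeable(record_class: str, created: date, today: date) -> bool:
    """A record may be purged only when its retention period has elapsed
    and no legal hold applies to its class."""
    policy = POLICIES[record_class]
    if policy["legal_hold"]:
        return False
    cutoff = created.replace(year=created.year + policy["retain_years"])
    return today >= cutoff

print(is_purgeable("email", date(2010, 1, 1), date(2014, 6, 1)))      # True
print(is_purgeable("financial", date(2010, 1, 1), date(2014, 6, 1)))  # False
```

Expressing policies as data rather than code is what allows them to be pre-configured for a standard such as COBIT or swapped out for custom rules without touching the evaluation logic.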

Big Data and The New Enterprise Blueprint

We now understand that the world is drowning in data. It is estimated that over 15 petabytes of new information are created every day….


Solix Big Data Suite

A perfect storm of data growth is brewing. According to a recent survey by Gartner, data growth is now the leading infrastructure challenge….



Solix Enterprise Archiving

There are nearly one trillion devices connected to the Internet and data growth is now managed in petabytes….



Solix Enterprise Data Lake

CIOs are in a difficult position. The demands for operational efficiencies and an improved enterprise data warehouse (EDW) seem to be at odds….