IDC Spells Out Value of In-Memory as Database of Future

Researcher stresses that enterprises need to be doing business intelligence on hot, fresh data to make accurate decisions, not weeks-old data in a warehouse.

IT research firm IDC released a new report Aug. 25 that advises enterprises with a need to better handle their data for making strategic business decisions to look seriously at in-memory databases.

Fundamentally, the researcher said, enterprises need to be doing business intelligence on hot, fresh data to make accurate business decisions and projections, rather than wasting time on information sitting in a data warehouse that is weeks or months old.

"The whitepaper is a fairly straightforward exposition of the benefits of in-memory [database] capability in general and for Oracle users in particular," IDC analyst Carl Olofson, author of the report, told eWEEK.

"In general, what it means is you can set up your database to do complex queries without facing the classic problems of allocating the data in storage, so that you can make the queries run faster using partitioning, that kind of thing, using secondary indexes.

"You can also perform complex queries on production data [using an in-memory database] because they have that capability."
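The tuning steps Olofson alludes to, such as adding secondary indexes so analytic queries avoid full table scans, can be seen in miniature with any relational database. Below is a small sketch using Python's built-in sqlite3 module and a hypothetical orders table (the schema and column names are illustrative, not from the report):

```python
import sqlite3

# Hypothetical operational table with a few rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders (region, total) VALUES (?, ?)",
                 [("west", 10.0), ("east", 25.0), ("west", 7.5)])

# Without an index, filtering on region forces a scan of the table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE region = 'west'"
).fetchall()
print(plan)  # plan detail reports a table scan

# A secondary index on the filter column changes the access path.
conn.execute("CREATE INDEX idx_orders_region ON orders (region)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE region = 'west'"
).fetchall()
print(plan)  # plan detail now reports a search using the index
```

The exact wording of the query plan varies by SQLite version, but the shift from a scan to an index search is the kind of manual optimization work that in-memory databases aim to make unnecessary.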

The in-memory database world is growing as IT decision-makers realize the value of speed in this department. MemSQL, Oracle DB In-Memory, SAP HANA, Pivotal, Starcounter, DataStax, Terracotta, EMC eXtremeDB, Gemfire, Microsoft SQL, Informix, Oracle TimesTen, VoltDB, SafePeak and Kognitio are just a few of the in-memory databases now available.

These are still much more expensive than standard databases—often five to 10 times more costly at acquisition. But, like so many other business-tool decisions managers have to make, this type of new-gen component can make a huge difference in cost and efficiency to an enterprise over a span of time.

IDC research shows that most business managers admit to basing their intra-day decisions on personal knowledge and educated guesswork rather than on data. Data is commonly used for monthly, quarterly, annual and multiyear planning—but not for on-the-spot business decisions. This is what needs to change, Olofson said.

The reason for this old-school approach is simple: The data is not available, Olofson said. Most business intelligence systems extract data on a periodic basis from operational databases and load it into analytic databases for query and reporting. These may be operational data stores, data marts or data warehouses.

Operational data stores and data marts typically have schemas similar to those of operational databases, but with indexes for query enhancement, cubes, materialized views and so forth to enhance query performance. They may receive data through frequent extract, transform and load (ETL) processes or from dynamic data movement tools, such as change data capture (CDC) software.

They are used for making short-term planning decisions, looking at the next day, the next week and the next month. Data warehouses, which receive data from many databases and collect it in a schema carefully crafted to encompass all the source data in a way that optimizes the value of queries, receive their data through scheduled ETL jobs and are used for quarterly, semiannual, annual and multiyear planning exercises.
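The periodic ETL pattern described above can be sketched in a few lines. This minimal example, using Python's built-in sqlite3 module and hypothetical table names, extracts rows from a stand-in operational store, transforms them into an aggregate shape, and loads the result into a stand-in data mart:

```python
import sqlite3

# Stand-ins for the two databases in a typical BI pipeline.
ops = sqlite3.connect(":memory:")   # operational database
mart = sqlite3.connect(":memory:")  # analytic data mart

ops.execute("CREATE TABLE sales (region TEXT, amount REAL)")
ops.executemany("INSERT INTO sales VALUES (?, ?)",
                [("west", 100.0), ("east", 40.0), ("west", 60.0)])

# Extract + transform: aggregate per region in the operational store.
rows = ops.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
).fetchall()

# Load: write the summary into the analytic schema for reporting.
mart.execute("CREATE TABLE sales_summary (region TEXT, total REAL)")
mart.executemany("INSERT INTO sales_summary VALUES (?, ?)", rows)

summary = mart.execute(
    "SELECT region, total FROM sales_summary ORDER BY region"
).fetchall()
print(summary)  # [('east', 40.0), ('west', 160.0)]
```

Because this copy step runs on a schedule, the data mart is always somewhat stale; the in-memory approach Olofson describes queries the production data directly instead.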

Using an in-memory database, Olofson said, decisions can be made in near-real time, saving not only time and effort but also hard storage costs.

You can obtain a portable document format (PDF) copy of the report here.

Chris J. Preimesberger

Chris J. Preimesberger is Editor-in-Chief of eWEEK and responsible for all the publication's coverage. In his 13 years and more than 4,000 articles at eWEEK, he has distinguished himself in reporting...