We read a lot about in-memory technology these days, especially as it applies to Big Data and analytics. In fact, it has been one of the fastest-growing areas in all of computing since before Gartner identified it as such back in 2012. That should surprise no one: these technologies promise businesses actionable insight into their own data, and that insight will play a big role in how much success those businesses enjoy in the future. It may interest you to know, however, that in-memory technology has been around for a long time, runs on all platforms, and that Big Data and analytics may not even be where it makes its biggest impact on businesses.
A bit of background on what in-memory technology actually does: it keeps data in main memory, eliminating disk I/O and thereby increasing both the speed of access to data and the speed of processing. As the amount of data that a business needs to process grows, so does the need for that processing to finish as quickly as possible, and in-memory computing is an effective means to this end. A secondary benefit is cost savings: in-memory technology allows data processing to use fewer machine cycles, which drives down the operational expense of that processing.
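To make the I/O point concrete, here is a minimal sketch in Python (the data set, file name, and record counts are purely illustrative, not from any particular product) that times the same lookup against a file on disk and against data already resident in memory:

```python
import json
import time

# Illustrative data set: 100,000 small records.
records = {i: {"id": i, "balance": i * 1.5} for i in range(100_000)}

# Disk path: persist the data, then answer a lookup by reading it back.
with open("records.json", "w") as f:
    json.dump({str(k): v for k, v in records.items()}, f)

start = time.perf_counter()
with open("records.json") as f:
    on_disk = json.load(f)
balance = on_disk["42"]["balance"]
disk_time = time.perf_counter() - start

# Memory path: the same lookup against data already resident in RAM.
start = time.perf_counter()
balance = records[42]["balance"]
mem_time = time.perf_counter() - start

print(f"disk: {disk_time:.6f}s  memory: {mem_time:.6f}s")
```

On any reasonable machine the in-memory path typically wins by orders of magnitude, and that gap is the entire premise of the technology.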
So how is in-memory technology used? Well, for Big Data and analytics, the main usage is in-memory databases (IMDBs): database management systems that rely primarily on main memory for data storage. This development was driven by the pressing business need for faster processing and made possible by the sharp decrease in memory costs. IMDBs provide the full feature set of a disk-based DBMS, such as language engines, ACID properties, and data models, including all of the machinery a DBMS provides: locks, latches, SQL parsing, index management, and so on. Examples include IBM BLU Acceleration, Oracle Database In-Memory, SAP HANA, and Tableau's in-memory data engine, many of them offered as distributed or cloud solutions.
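As a hands-on illustration, SQLite's in-memory mode (used here as a stand-in for the commercial IMDBs named above, which work on the same principle at far greater scale) shows a full SQL engine running entirely in RAM:

```python
import sqlite3

# The entire database lives in RAM; nothing is written to disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany(
    "INSERT INTO accounts (id, balance) VALUES (?, ?)",
    [(1, 100.0), (2, 250.5), (3, 75.25)],
)
conn.commit()

# SQL parsing, indexes, and transactions all work as they would on disk;
# only the I/O is gone.
conn.execute("CREATE INDEX idx_balance ON accounts (balance)")
for row in conn.execute("SELECT id, balance FROM accounts WHERE balance > ?", (80,)):
    print(row)
conn.close()
```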
But this is only the most recent use of in-memory technology. As most tech types will know, buffering and caching are also forms of in-memory technology. These techniques are as old as computing itself and are still in use today: they reduce the time needed to access data by keeping selected or most recently used data in memory, where it can be reached without I/O. Virtually all databases have some type of data buffering and caching; at this point it is simply ubiquitous.
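Here is a minimal read-through caching sketch (the record names and simulated latency are hypothetical stand-ins for real storage): the first access pays the I/O cost, and repeat accesses are served from memory.

```python
import time
from functools import lru_cache

def read_from_disk(record_id: int) -> str:
    # Stand-in for a real disk or database read; the sleep simulates I/O latency.
    time.sleep(0.01)
    return f"record-{record_id}"

@lru_cache(maxsize=1024)  # keep the 1,024 most recently used records in RAM
def fetch_record(record_id: int) -> str:
    return read_from_disk(record_id)

fetch_record(42)  # cold: pays the (simulated) I/O cost
fetch_record(42)  # warm: served straight from memory, no I/O
```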
Another type of in-memory technology that has been around for a very long time is the in-memory utility that supports high-performance in-memory tables. These tables bypass the overhead associated with conventional DBMS (and IMDB) data access, augmenting an existing DBMS by making selected data available to applications over a very short code path, through an efficient API that the application calls directly.
Typically, 5% or less of the most often accessed data is copied from the DBMS into high-performance in-memory tables, leaving the bulk of the data in the DBMS. In this way, maximum benefit is achieved by handling a small amount of data in a much more efficient way. This type of in-memory technology has been used for decades by the largest banks, card networks, insurance companies, and retailers to optimize their most valuable, ultra-high-capacity, mainframe-based transaction processing applications.
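A sketch of that pattern (every name, the schema, and the 5% threshold here are illustrative assumptions, not an actual product API) might look like this: the hottest rows are copied into a plain in-memory table, and lookups fall back to the DBMS only on a miss.

```python
import sqlite3

# "accounts.db", the schema, and the access_count column are assumptions
# for this sketch; they stand in for the system-of-record DBMS.
db = sqlite3.connect("accounts.db")

# Copy roughly the top 5% most frequently accessed rows into a dict.
total = db.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
hot_limit = max(1, total // 20)
hot_table = {
    row[0]: row
    for row in db.execute(
        "SELECT id, balance FROM accounts "
        "ORDER BY access_count DESC LIMIT ?",
        (hot_limit,),
    )
}

def lookup(account_id: int):
    # Hit: a single hash lookup -- no SQL parsing, locks, or latches.
    row = hot_table.get(account_id)
    if row is not None:
        return row
    # Miss: fall back to the full DBMS path for the remaining ~95%.
    return db.execute(
        "SELECT id, balance FROM accounts WHERE id = ?", (account_id,)
    ).fetchone()
```

The point of the design is the short code path on a hit: one hash lookup, with no SQL parsing, locking, or buffer management in the way.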
In-memory technology is nothing new; it is a proven and reliable way to solve existing and new challenges. While it has laid the foundation for the fastest-growing areas in modern computing—Big Data and analytics—it also powers the most business-critical transaction processing systems on the planet, and is part of every computing system, new and old, anywhere and everywhere.
(originally posted on LinkedIn)
Regular Planet Mainframe Blog Contributor
Allan Zander is the CEO of DataKinetics, the global leader in Data Performance and Optimization. A "Friend of the Mainframe", Allan has addressed both the technical and business needs of Global Fortune 500 customers, experience that has given him great insight into the industry's opportunities and challenges and made him a sought-after writer and speaker on the topic of databases and mainframes.