One of the challenges of working in the modern world is adjusting to the ever-increasing speed of computing. Globalization and mobile technology mean instant communication and near-instantaneous reaction are the minimum benchmark a firm must meet to keep up. Luckily, the business world has the mainframe, arguably the fastest and most powerful computing platform available. Mainframes are more than capable of processing all the instantaneous interactions our digital-age business world requires, from communicating across systems, to logging transactions, to performing analytics in real time.
Unfortunately, the extract, transform, load (ETL) software used to distribute mainframe data can no longer overcome the limitations of delay, cost, and complexity. ETL is as reliable as a combustion engine, but the mainframe hardware it runs on can easily outpace what ETL can do. You don’t power a starship with a hemi, and nowadays, you need something better than ETL to leverage the power of data in a mainframe.
You need Data Virtualization (DV).
Data Virtualization is overtaking ETL as the right choice for mainframe data processing and accelerating business transformation. Here are just four of the many ways DV blows ETL out of the water when it comes to making the best use of mainframes.
Where ETL fully extracts and examines every piece of data it works with, DV accesses the data where it resides, looking only at the information it needs for a given request. This vastly reduces the time needed to find and use the information, and it reduces the risk of data errors as well. It’s the difference between driving to the library, borrowing a book, bringing it home, and reading the whole thing, and simply Googling the specific paragraph you need from home.
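The contrast can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor's API: `etl_style` stands in for a copy-everything-then-filter job, while `dv_style` stands in for a query answered in place at the source.

```python
# Hypothetical sketch: ETL-style full copy vs. DV-style query at the source.
# The "mainframe" here is just an in-memory list of transaction records.

mainframe_records = [
    {"account": "A-100", "amount": 250.00},
    {"account": "B-200", "amount": 75.50},
    {"account": "A-100", "amount": 12.25},
]

def etl_style(records, account):
    """Extract everything, load a full copy, then filter the copy."""
    copied = [dict(r) for r in records]            # full extract + load
    return [r for r in copied if r["account"] == account]

def dv_style(records, account):
    """Query in place: touch only the rows the request needs."""
    return [r for r in records if r["account"] == account]

print(etl_style(mainframe_records, "A-100"))
print(dv_style(mainframe_records, "A-100"))
```

Both calls return the same two records, but only the ETL version pays to copy the whole data set first; at mainframe scale, that copy is where the delay, cost, and error risk come from.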
A DV function is much simpler than an ETL function, because DV applications touch only the pertinent aspects of a piece of data, not entire data sets. This means one system can perform many more tasks at once. So not only is the actual processing speed of each function much faster, but far more functions can run simultaneously, further expanding the mainframe’s ability to process data. For businesses such as financial organizations, with many customers who all need their requests processed quickly, ETL is simply too cumbersome to keep around.
The world wants real-time analytics. That means using current data, because every second that passes after a piece of data has been analyzed minutely changes the underlying figures, creating a new state that old analyses no longer reflect. ETL is by nature not real-time: it creates copies of data at a particular moment, so what you effectively have is an analysis of a photograph rather than a look at live footage. DV looks at data as it currently is, not a snapshot of what was, allowing for dynamic, real-time results.
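A toy example makes the staleness concrete. This is a hypothetical Python sketch, not real ETL or DV tooling: the snapshot dictionary is frozen at copy time, while the live lookup sees the later update.

```python
# Hypothetical sketch: an ETL snapshot goes stale; a live DV-style query does not.

live_data = {"balance": 100}        # the figures as they exist at the source

snapshot = dict(live_data)          # ETL: a copy taken at one moment in time

live_data["balance"] = 250          # the real figures change a moment later

print(snapshot["balance"])          # the photograph: still shows 100
print(live_data["balance"])         # the live footage: shows 250
```

Any analysis run against `snapshot` describes a state that no longer exists; a query against the source always reflects the current figures.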
Never in the history of security has it been a good idea to make multiple copies of sensitive information. Well, that’s exactly what ETL does by its very nature: it copies data to multiple systems. DV allows organizations to keep all their data in a single, powerful “data vault,” without the risk of compromise that comes from copying data to other, potentially less secure locations.
In many ways, the mainframe is a device ahead of its time. ETL helped us tap just a little of Big Iron’s incredible power, but Data Virtualization is the next step in leveraging the mainframe, beating ETL in every way that counts. In a world where making customers wait seconds or even minutes for their access or analysis is unacceptable, businesses need the fastest computers in the world operating at peak performance.
With over 25 years of software development experience, Bryan Smith manages all aspects of R&D organization and operations for Rocket Software. Prior to joining Rocket, Bryan was an IBM Distinguished Engineer in the Information Management division of the IBM Software Group, where he held various engineering, management, and customer-facing roles. Bryan has thirteen granted software patents and holds B.S. and M.S. degrees in Computer Science with a minor in Mathematics from California State University at Pomona and Chico.