By Sonja Soderlund
If you work with Db2 for z/OS, this week’s trivia challenge should interest you. We’re reviewing two articles that will help you maximize Db2’s capabilities and deepen your understanding of “data fabric”.
For those who spend their professional lives working with Db2 for z/OS, it’s a bad sign when things start to get a bit sluggish. But does a lethargic Db2 necessarily mean that it’s time for an expensive upgrade?
While many consultants might suggest that you re-architect key applications or possibly even consider migrating, Larry Strickland would encourage you to step back and consider another option.
Strickland, who is DataKinetics’ Chief Products Officer, is passionate about making technology simple for the end user. In his article “Accelerating Your Database” (♧ HINT), he offers a tutorial on leveraging memory to lower the demands your applications make on your Db2 database.
Strickland argues that while Db2 is indeed “THE database for business operations,” your Db2 applications are likely being slowed down by the need to access enormous amounts of data for every business transaction.
In short, he suggests that by copying reference data (10% or less of your business data) into high-performance in-memory tables, you can massively reduce the burden on your Db2 database.
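To make the idea concrete, here is a minimal Python sketch of the general pattern Strickland describes: load a small, rarely changing reference table into memory once, then serve lookups from that copy instead of querying Db2 on every transaction. The table name, columns, and the `fetch_reference_rows_from_db2` helper are hypothetical stand-ins for illustration, not code from Strickland’s article or any DataKinetics product.

```python
# Illustrative sketch only: an in-process cache of Db2 reference data.
# fetch_reference_rows_from_db2() is a hypothetical placeholder; in a real
# application it would run a SELECT against Db2 via an ODBC/JDBC driver.

from typing import Dict, Tuple


def fetch_reference_rows_from_db2(table: str) -> Dict[str, Tuple[str, str]]:
    """Placeholder for a one-time bulk read of a Db2 reference table."""
    # e.g. SELECT CODE, DESCRIPTION, REGION FROM REF_PRODUCT  -- assumed schema
    return {
        "P100": ("Widget", "EMEA"),
        "P200": ("Gadget", "APAC"),
    }


class ReferenceCache:
    """Loads a small reference table into memory once, so each business
    transaction does a dictionary lookup instead of a Db2 call."""

    def __init__(self, table: str):
        self._rows = fetch_reference_rows_from_db2(table)  # single bulk read

    def lookup(self, code: str) -> Tuple[str, str]:
        return self._rows[code]  # no I/O, no SQL, no Db2 CPU per transaction


# Usage: load once at startup, reuse for every transaction.
products = ReferenceCache("REF_PRODUCT")
print(products.lookup("P100"))   # ('Widget', 'EMEA')
```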
Read the whole article to appreciate the beauty of in-memory optimization.
“Data fabric” is a much-discussed topic in Db2 circles, and with businesses needing easy access to data dispersed across multiple systems (both on premises and in the cloud), it’s here to stay. In his article on the subject, Robert Catterall demystifies the topic with a succinct explainer for Db2 for z/OS people.
In “What Db2 for z/OS People Should Know About Data Fabric” (♧ HINT), Catterall makes the case that data fabric is strategically important for z/OS as a data-serving platform. But his core argument is that data fabric is not just a matter of leveraging technology. In Catterall’s words, “data fabric is culture…requiring organizational roles and new ways of thinking about and managing data.”
If you’re foggy on the concept of data fabric, or just like a good argument, check out the whole article, then come back and take the quiz:
Which of these statements summarizes the concept of “data gravity”?
- The idea that data usage actions should flow from the data
- The idea that data usage actions should flow to the data
- The increased value gained from an organization’s data assets
- The difficulty of working with data in a “cross-silo” way
Click to Reveal Answer
B: The idea that data usage actions should flow to the data
A data fabric enabled via Cloud Pak for Data delivers multiple benefits, including:
- Minimization of data replication costs
- Protection of data security and consistency
- Optimized performance
- All of the above
Click to Reveal Answer
D: All of the above
According to Strickland, what solution might shorten the database overhead path and help with sluggishness?
- Offload processing
- In-memory data buffering
- Unencumbered memory
- High-performance in-memory tables
Click to Reveal Answer
D: High-performance in-memory tables
Strickland lists several benefits of high-performance in-memory technology. Which of the following does not belong on the list?
- Batch processing times reduced from eight hours to less than one hour
- Protection of data security and consistency
- Application I/O usage reduced by 99%
- CPU usage cut in half
Click to Reveal Answer
B: Protection of data security and consistency