Mainframe Data on the Move: A Security-Centric Perspective, featuring Colin Knight

Oct 14, 2025

Amanda Hendley is the Managing Editor of Planet Mainframe and host of the Virtual Mainframe User Groups. With a career rooted in the technology community, she has held leadership roles at the Technology Association of Georgia, Computer Measurement Group (CMG), and Planet Mainframe. A proud Georgia Tech graduate, Amanda spends her free time renovating homes and volunteering with SEGSPrescue.org in Atlanta, Georgia.

At the September Virtual Db2 User Group, we hosted Colin Knight, longtime mainframer and Db2 Systems Programming Tech Lead at NatWest. With over 40 years in the field—36 of them working with Db2—Colin cut through the hype to focus on what matters most: security when data moves. It’s a BIG deal.

Watch the session recording.

Why “Data in Motion” Raises Risk

Big data keeps exploding. IDC's Global DataSphere estimates put worldwide data creation near 181 zettabytes by 2025. Every mobile ping, video doorbell, and traffic camera adds to the flood; avoiding data creation for even one day is nearly impossible. That relentless growth in volume amplifies risk.

As Cloudera puts it, “Data is no longer something we collect and analyze at a later time. Data in motion is data on the move, being processed in real time to generate insights or trigger immediate actions.”

IBM Cost of a Data Breach Reports show that breaches involving data in motion are among the most expensive to remediate. As more organizations push sensitive data across hybrid environments, security for data in transit matters as much as security for data at rest.

“Security is a key element in everything we do.” – Colin Knight, Db2 Systems Programming Tech Lead, NatWest Bank

Colin Knight’s overall recommendation: Don’t stop at securing data at rest. Also protect data in motion through encryption, synchronization, and monitoring across replication pipelines.

Making Db2 Data Analytics-Ready

Colin outlined NatWest’s options for moving Db2 data into analytics environments. Each has different trade-offs around latency, governance, and cost. The options include:

Unloads

Unloads are straightforward, batch-friendly, and reliable, but always behind the curve. Data freshness lags, making unloads best suited to non-time-sensitive reporting.

Q Replication

It is low latency and highly mature, though it carries operational overhead. It is ideal when you need reliable near-real-time updates between systems.

InfoSphere CDC → Kafka

This option is flexible, real-time, and powerful for streaming into analytics platforms. But it’s also CPU-hungry and schema-sensitive. Plan for capacity and schema-evolution gates to prevent CPU spikes and synchronization failures.
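To make the idea of a schema-evolution gate concrete, here is a minimal Python sketch that rejects dropped or retyped columns while allowing additive changes. The column names and the registry-free design are illustrative assumptions, not NatWest's implementation; a production pipeline would typically check against a schema registry instead.

```python
# Sketch of a schema-evolution "gate" for a CDC-to-Kafka pipeline.
# Column names are hypothetical; a real pipeline would consult a schema registry.

EXPECTED_SCHEMA = {
    "ACCOUNT_ID": "string",
    "BALANCE": "decimal",
    "UPDATED_AT": "timestamp",
}

def schema_gate(incoming_schema, expected=EXPECTED_SCHEMA):
    """Return (ok, reasons). Reject dropped or retyped columns;
    allow new columns (additive evolution) with a note."""
    reasons = []
    for col, typ in expected.items():
        if col not in incoming_schema:
            reasons.append(f"column dropped: {col}")
        elif incoming_schema[col] != typ:
            reasons.append(f"type changed: {col} {typ} -> {incoming_schema[col]}")
    for col in sorted(set(incoming_schema) - set(expected)):
        reasons.append(f"new column (allowed): {col}")
    ok = not any(r.startswith(("column dropped", "type changed")) for r in reasons)
    return ok, reasons
```

Gating changes at this point, before they reach consumers, is what prevents the synchronization failures Colin warned about: a breaking change is quarantined rather than silently corrupting the stream.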

IBM Db2 Analytics Accelerator (IDAA)

It brings analytics closer to Db2, minimizing data export sprawl. Still, maintaining integrity requires strong synchronization discipline and careful rollback handling.

Latency, Rollbacks, and “Phantom” Records

Replication introduces risk beyond simple delays. A rollback in the source system can leave “phantom records” in the target: stranded rows that corrupt downstream analytics.

Avoid phantom data through automated reconciliation, clear rollback policies, and data lineage tracking that aligns governance across both mainframe and distributed systems.
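At its simplest, the reconciliation step above amounts to a primary-key comparison between source and target. The sketch below illustrates that idea; the function names are hypothetical, and a real job would page through keys in batches rather than hold them in memory.

```python
# Minimal reconciliation sketch: compare primary keys between the Db2
# source and the replicated target. Names are illustrative, not an API.

def find_phantom_keys(source_keys, target_keys):
    """Keys present in the target with no matching source row are
    phantom candidates, e.g. inserts replicated before a source rollback."""
    return sorted(set(target_keys) - set(source_keys))

def reconcile(source_keys, target_keys):
    """Return the delete list for phantoms and the backfill list for
    rows missing from the target (the reverse gap)."""
    src, tgt = set(source_keys), set(target_keys)
    return {
        "delete_from_target": sorted(tgt - src),
        "backfill_to_target": sorted(src - tgt),
    }
```

Running such a comparison on a schedule, and logging both lists for lineage, gives auditors a record of exactly when the two sides diverged and how the gap was closed.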

The Safer Path: Data Virtualization and APIs

Colin sees data virtualization as a key step toward reducing risk for data in motion. Instead of endlessly copying data into multiple locations, virtualization and REST APIs can expose mainframe data securely in near-real time. 

This approach reduces overhead, limits replication risk, and keeps the system of record authoritative. Virtualization becomes the smarter, safer path when SLAs allow near-real-time access without duplication.
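The shape of that access pattern can be sketched with nothing more than Python's standard-library HTTP server: callers read the record in place over REST instead of receiving a replicated copy. The route shape and the in-memory lookup table are illustrative assumptions, not an IBM data-virtualization API.

```python
# Toy sketch of a read-only REST endpoint standing in for a
# data-virtualization layer in front of Db2.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

RECORDS = {"1001": {"account_id": "1001", "balance": "250.00"}}  # stand-in for a Db2 lookup

class AccountHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        key = self.path.rsplit("/", 1)[-1]          # e.g. GET /accounts/1001
        row = RECORDS.get(key)
        body = json.dumps(row if row else {"error": "not found"}).encode()
        self.send_response(200 if row else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):              # keep the sketch quiet
        pass

# To serve: HTTPServer(("127.0.0.1", 8080), AccountHandler).serve_forever()
```

Because nothing is copied out, the security controls and audit trail of the system of record apply to every read, which is precisely the risk reduction Colin describes.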

He also highlighted IBM’s z17 AI acceleration, which brings analytical power closer to the mainframe itself and minimizes the security risks associated with movement and replication.

Key Takeaways

Mainframes are, and will remain, the system of record. However, as enterprises look to harness the power of big data, the focus cannot simply be on scale and speed.

  • Treat data in motion as a first-class security risk.
  • Match ingestion strategy to latency and governance needs: unloads, Q Replication, CDC/Kafka, or virtualization.
  • Prevent phantom records with rollback reconciliation and lineage automation.
  • Now and looking ahead, bring analytics to Z rather than taking data off it.

Colin reminds us that securing the journey of mainframe data into analytics is just as important as securing the mainframe itself.

Register for the Next Db2 Virtual User Group

Join the next Db2 Virtual User Group session to continue the discussion of Db2's intersection with mainframe modernization, data virtualization, and security for data in motion.

Sign Up for Nov 18

