How Geniez Is Turning Real-Time Mainframe Data Into Fuel for Modern LLMs
Amanda Hendley, Managing Editor of Planet Mainframe, met with Gil Peleg, CEO of Geniez, for a conversation about artificial intelligence (AI)—specifically, the future of AI on the mainframe.
A New Chapter in Mainframe Innovation
Gil Peleg has spent nearly three decades pushing innovation on the mainframe—from IBM storage development to major enterprise environments to founding Model9. After selling Model9 to BMC and spending time inside the company, Peleg left to start a new adventure.
A common thread throughout his career is moving the mainframe forward. When he left BMC, AI dominated every strategic conversation. Enterprises wanted to experiment with LLMs and find practical business uses for AI. But Peleg noticed a major gap: organizations wanted something as easy as ChatGPT, yet nothing worked that seamlessly inside mainframe environments.
“The question became,” Peleg said, “how do we connect LLMs to mainframe environments to derive business value for the companies we work for?”
Why the Mainframe Needed AI
The interest wasn’t theoretical. The mainframe already holds the most valuable data in the enterprise. It contains the most up-to-date information along with decades of business history. That includes DB2, IMS, VSAM files, OPERLOG, SMF data, and more. This data is accurate, governed, and essential to daily business.
But mainframes are notoriously difficult to connect to modern AI systems without building complex pipelines, duplicating massive datasets, or introducing unnecessary risk.
“Mainframe data is a treasure trove for AI. But connecting LLMs to that data in real time? Nobody had solved that.” – Gil Peleg
Gil and his co-founder, Dan Shprung, saw a clear opportunity—one that was needed, desired, and valuable. Geniez was created to meet that challenge.
An Inevitable Convergence
The convergence of mainframe and AI was bound to happen. The mainframe runs the world’s most critical workloads. AI is reshaping how individuals and enterprises operate. These realities must meet.
“Leadership is already pushing teams to show how they’ll use AI,” Peleg said. “Our job is to make sure the mainframe isn’t left out.”
What Geniez Actually Does
Geniez enables real-time, secure access from LLMs directly to mainframe data. In practice, users can communicate with mainframe systems through natural language—without moving, replicating, or refactoring data.
Most modernization platforms live outside z/OS, but Geniez runs on the mainframe itself.
“There are things you simply can’t do from the outside. Running on the mainframe is how we tackle the real challenges.” – Gil Peleg
Running on-platform unlocks several infrastructure advantages:
- Native security and RACF/ACF2/Top Secret integration
- Access to proprietary data formats
- Ability to leverage zIIP engines and Linux on Z
- Hardware acceleration on z16/z17
- Consistent performance and reliability
Customers describe using Geniez as “like magic,” which is fitting—but Peleg is quick to clarify that the magic comes from engineering, not illusions.
“There’s a lot of technology around securing the connection, doing it at scale, ensuring performance, and making it cost-effective,” he said. “It’s a whole framework, not just a bridge.”
MCP in the AI Ecosystem
A key factor in Geniez’s architecture is the Model Context Protocol (MCP)—a newer protocol gaining traction across the AI world. According to Microsoft:
“MCP is a protocol designed to let AI models interact seamlessly with external tools and services—essentially a universal connector for AI.”
By implementing MCP on the mainframe, Geniez brings z/OS directly into the modern AI ecosystem.
“If an AI leader releases something new tomorrow,” Peleg said, “it already works with the Geniez framework.”
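To make the idea concrete, here is a minimal sketch of what an MCP server fronting mainframe data could look like, written against the official MCP Python SDK. Geniez has not published its implementation, so the tool names and the stubbed data-access helpers below are purely illustrative—the point is the protocol shape: every tool the server registers becomes something any MCP-aware LLM client can discover and call.

```python
# Minimal, hypothetical sketch of an MCP server exposing mainframe data.
# Assumes the official MCP Python SDK (pip install mcp); the tools and
# stubbed helpers are illustrative and are NOT Geniez's actual API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mainframe-data")  # server name advertised to MCP clients


def run_db2_query(sql: str) -> str:
    # Stub: a real server would execute this at the source on z/OS,
    # under the caller's RACF/ACF2/Top Secret identity.
    return "IBMREQD\n-------\nY"


def fetch_operlog(last_minutes: int) -> str:
    # Stub: a real server would read recent OPERLOG messages.
    return f"(last {last_minutes} minutes of OPERLOG would appear here)"


@mcp.tool()
def query_db2(sql: str) -> str:
    """Run a read-only SQL query against DB2 for z/OS and return the rows."""
    return run_db2_query(sql)


@mcp.tool()
def read_operlog(last_minutes: int = 15) -> str:
    """Return recent OPERLOG messages for operational troubleshooting."""
    return fetch_operlog(last_minutes)


if __name__ == "__main__":
    mcp.run()  # serves over stdio by default, so any MCP client can connect
```

Once registered, these tools appear in a client's tool list automatically—which is what makes the claim above plausible: a new LLM product only needs to speak MCP, not z/OS.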
Practical Business Value
Customers aren’t using Geniez for demonstrations—they’re using it to solve real problems:
- Query DB2, IMS, VSAM, logs, and SMF in natural language, improving accessibility and reducing reliance on specialists (see the client sketch after this list)
- Use LLMs to troubleshoot system issues for operational insight and faster root-cause analysis
- Analyze SMP/E and audit data for security risks, improving detection and compliance
- Deliver AI-driven features faster—without writing new mainframe code
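As a rough illustration of the first item above, here is what the client side of such a natural-language query loop could look like with the official MCP Python SDK. The server command, tool name, and SQL are assumptions carried over from the earlier server sketch; in practice the LLM, not hard-coded logic, would pick the tool and arguments from the user's question.

```python
# Hypothetical client-side sketch: discover and call mainframe tools over MCP.
# Assumes the official MCP Python SDK and the illustrative server above,
# saved as mainframe_server.py; this is not Geniez's published interface.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="python", args=["mainframe_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # The LLM sees this tool list and decides what to call; here we
            # print it and invoke one tool directly for demonstration.
            tools = await session.list_tools()
            print("available tools:", [t.name for t in tools.tools])

            # A question like "Which batch jobs failed overnight?" would be
            # translated by the LLM into a tool call along these lines.
            result = await session.call_tool(
                "query_db2",
                {"sql": "SELECT IBMREQD FROM SYSIBM.SYSDUMMY1"},  # canonical test query
            )
            print(result.content)


asyncio.run(main())
```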
In an environment where mainframe staffing budgets rarely grow, this combination is powerful. According to the 2025 Arcati Mainframe User Survey, most enterprises are shifting toward automation, outsourcing, and efficiency-focused strategies rather than expanding mainframe teams.
Start Exploring Mainframe + AI
For teams interested in exploring AI use cases on the mainframe, Peleg offers straightforward advice:
- Learn the basics of generative AI
- Understand the security models and guardrails
- Look for augmentation opportunities—not complete rewrites
- Experiment with real use cases tied to existing data
Geniez shares demos and examples openly on its YouTube channel, making it easy to see the technology in action. Teams can also explore its recently published use-case whitepaper.
What’s Next for Geniez
The company recently secured additional venture funding and is expanding its team and product roadmap. The message is clear: this is only the beginning.
“We’re growing fast and bringing new innovation into this market,” Peleg said. “You’ll see more from us soon.”
“If an AI leader releases something new tomorrow, it already works with the Geniez framework.” – Gil Peleg
Given the pace of AI and the value locked inside mainframe data, that seems like a sure bet.
Further Reading
Read how IBM z17 handles AI on the mainframe.
Read the Transcript
I’m Gil. I’ve been in the mainframe space for 27 years. I worked for IBM in storage development and for some other client shops. Many people know me as the founder of Model9. I started Model9 in 2016—we were connecting mainframes to the cloud. We sold Model9 to BMC three years ago, and after a short while at BMC, I left to start Geniez.
Interviewer:
So what led you to start Geniez?
Gil:
My entire career, I’ve focused on bringing innovative technologies to the mainframe world. When I left BMC, the hottest trend in the industry was AI. We were talking to customers, hearing their pain points, and realized there was an opportunity to do this again—this time with AI and the mainframe.
Interviewer:
Tell me more about the market gap you saw and what inspired you.
Gil:
When we looked at the AI–mainframe landscape, we saw vendors focusing on AI on z/OS—which is great—and others who aren’t even mainframe vendors trying to help with things like code and connectivity. But the big gap was this: we all use ChatGPT in our personal lives, but how do we bring that into our professional lives? More broadly, how do we connect LLMs to mainframe environments to derive business value for the companies we work for? That became something customers were interested in, and we went for it.
Interviewer:
Were there things you’d been working on for a while that led you here, or was it purely curiosity?
Gil:
At Geniez, we try to leverage the unique assets companies have on the mainframe. I started in storage and evolved into data, and mainframe data is incredibly interesting and important. It’s the most up-to-date data—real-time transactions hitting the system—plus decades of business history stored on the mainframe. These are treasure troves for AI. We understand data and storage, and we saw an opportunity to build something valuable.
Interviewer:
For someone just hearing about Geniez, what do you do?
Gil:
We call it connecting LLMs and AI agents to real-time mainframe data. Think of it as a simple, innovative way to connect mainframe data to the latest applications companies are building—today, that means AI applications. They involve LLMs, agents, and bridging the gap to actual mainframe data to solve specific mainframe problems securely and at scale—everything you’d expect from a mainframe product. That’s where we come in.
Interviewer:
So how are you connecting generative AI to the mainframe?
Gil:
It’s not one trick. Some customers first think it’s magic—plug it in and you’re done. I wish it were that simple. We leverage industry standards. From the AI side, we’re compatible and easy to plug in. One example is the MCP protocol we use. But that’s not the end of the story. There’s a lot of technology around securing the connection, doing it at scale, ensuring performance, and making it cost-effective. It’s a whole framework, not just a bridge.
Interviewer:
How is your approach different from how others approach AI on the mainframe?
Gil:
Our team has deep mainframe skills. Unlike companies trying to modernize the mainframe from the outside, we leverage the strengths and technology of the mainframe itself. We write code that runs on the mainframe. Our product runs on the mainframe. We tackle real mainframe challenges: accessing hard-to-reach data, securing it using native mainframe capabilities, and working within the platform’s architecture. Running on the mainframe gives us an advantage—there are things you simply can’t do from the outside.
Interviewer:
You mentioned this is about the data. What does that mean when you say you focus on the data first?
Gil:
The mainframe holds the most up-to-date data in the company, as well as all its historical data. This could be DB2, IMS, VSAM files, operational logs like OPERLOG, or SMF data. Customers tell us that accessing and leveraging that data in an AI application is a major challenge. So we focus on delivering that data to AI applications.
Interviewer:
You’re here at GSUK talking about Geniez and what it does. Tell me what’s exciting customers and what sets you apart from other AI-on-mainframe attempts.
Gil:
Customers tell us that for the first time, they can use generative AI for their mainframe use cases and mainframe data. They already use ChatGPT personally, so the idea of asking natural-language questions about mainframe data feels like magic. For example, if a tax regulation changes, instead of writing new mainframe code, they can ask an LLM what changed and what it means for their system. Operationally, they can ask what’s happening in their system right now, look at logs, and ask where the problem is. One customer even said we picked a good name—it feels like a genie.
Interviewer:
Does this make mainframe data more accessible within a company?
Gil:
Yes. Customers implement our solution because they want real-time access to all mainframe data. Historically, companies tried to move some data off the mainframe—but no one moves all of it. There isn’t enough time in the day. And real-time replication solutions don’t scale to all mainframe data. Our approach queries data at the source on the mainframe in real time and returns results immediately.
Interviewer:
What are clients saying about the business value?
Gil:
Some use it to analyze system activity—when they hit a problem, they use an LLM to analyze audit logs or system logs and determine what happened. That reduces downtime, which has a clear dollar value. Others use it for productivity. If they don’t have to write new mainframe code to build an AI application, that’s faster time-to-market. It frees employees to do other work. And in a world where staffing budgets are constantly being cut, doing more with the same workforce is critical. AI is a race—leadership tells teams to show how they’ll use AI to become more efficient, serve customers better, and beat competitors. We plug into these initiatives.
Interviewer:
The z16 and z17 releases added AI acceleration on-chip. Are you leveraging that?
Gil:
One key differentiator for Geniez is that we run on the mainframe. So we leverage all the hardware advancements—zIIP engines, Linux environments, and built-in acceleration, security, performance, and scalability. We benefit from all of it.
Interviewer:
You mentioned you developed and deployed MCP on the mainframe. Why is that significant?
Gil:
MCP is a new and important technology in generative AI because it ties everything together—the LLMs, the data, the agents. Without it, you miss out on a lot of value. But it’s not just MCP. Many Python packages used in AI aren’t available on the mainframe. Many tools aren’t available yet. Instead of trying to port everything to z/OS, we connect those tools to the mainframe data. If a gen-AI leader releases something new tomorrow, it already works with the Geniez framework.
Interviewer:
What are you hearing from people actually using Geniez?
Gil:
Many use cases. One customer asked if they could check security vulnerabilities in their system using Geniez. Yes—meaning we process data from SMP/E, security logs, or other sources, feed it to an LLM, and provide analysis. That saves them time. Another said that even when they get graphs from system data, they don’t understand the meaning. Now they can ask an LLM to explain what’s happening. It saves time and helps them do their jobs.
Interviewer:
Where do you see AI going next?
Gil:
AI and the mainframe will inevitably intertwine. The mainframe runs companies’ most critical workloads. At the same time, companies are pushing to leverage AI to serve customers better and operate more efficiently. These worlds must connect. We aim to help customers prepare for the rapid advancements in generative AI so they can leverage them in their mainframe environments.
Interviewer:
What advice would you give companies that want to use AI with their mainframe but don’t know where to start?
Gil:
Keep an open mind. Educate yourself about generative AI—what it can do, even outside the mainframe. Just like you learned ChatGPT in your personal life, take a course, come to events like GSUK, learn what AI can do. Once people understand the security controls and how solutions like ours connect safely to the mainframe, they’re more willing to trust it and try it.
Interviewer:
Where can people learn more about Geniez or see the technology in action?
Gil:
We don’t just talk—we show. The best place is our YouTube channel, Geniez AI, where you can see multiple demos of the product in action and the value it brings to mainframe professionals.
Interviewer:
What’s next for Geniez AI?
Gil:
We’re very excited. We’re hearing great feedback from customers. We just received an investment from another VC. We’re bringing innovation into this market, we’re hiring, we’re growing, and soon you’ll see some very cool things coming from us.
Interviewer:
That sounds great. I’m looking forward to it.
Gil:
Thank you.