COBOL and Mainframe Computing Skills

With the advent of the cloud, artificial intelligence and advances in quantum computing, there appears (and it should be stressed that it’s only an ‘appearance’) to be less interest in mainframe computing than there once was. Over the past few years, especially with the explosion of cloud services, many have repeatedly predicted the death of mainframe systems. Yet they continue to thrive, remaining highly active away from the limelight. Mainframe computers remain essential to critical applications in multiple contexts: consider that the world’s main banks, as many as 45 of the 50 largest, as well as top insurance and telecom companies, continue to rely on mainframes. The idea that mainframe computing is somehow synonymous with obsolescence, a technology ready to be consigned to the archives of information technology history, is simply wrong. Rather, it is still used intensively, perhaps more than ever, and often outperforms its newer alternatives. To this day, the mainframe is still by far the most reliable, secure and scalable platform in the world. Almost all financial transactions worldwide pass through mainframes, and practically all the distributed architectures of modern computing (e.g. virtualization) derive from mainframe designs invented decades earlier. To understand why, it is useful to remember what a mainframe is and how it works.

Still, the number of mainframe manufacturers has waned over the decades. IBM is closely associated with mainframe technology, almost synonymous with it. But the fact that it has a number of competitors (including HCL Technologies, Unisys Corporation, Infosys Limited and Hitachi Vantara, among others), and that the market is expected to continue growing from the current $2.4 billion to $3.9 billion, certainly urges a shift in perception about the future and potential of mainframe technology. Modern mainframes remain powerful and reliable, but they are also more flexible and able to address a wide range of applications, thanks to their ability to accommodate dozens of server-grade CPUs, 40 TB of error-correcting code (ECC) RAM, and many petabytes of redundant flash-based storage. They are therefore able to process vast quantities of data while maintaining almost 100% uptime. And they support COBOL, C, C++, FORTRAN, PL/1 and even Java applications.

As for programming languages, many mainframe applications are decades old and rely on old but efficient COBOL. This has meant that many large companies, when migrating from a mainframe-centric to a hybrid approach, have progressively found themselves at a competitive disadvantage compared to younger and more agile competitors. The cost of migrating a large number of critical applications is exorbitant. For this very reason, rather than moving away from the ‘old technology,’ the complexity, risk and cost of migrating data, at both the equipment and personnel levels, have encouraged many large companies to continue using mainframes. It may seem anachronistic to use COBOL, CICS, VTAM, DB2, and VSAM, technologies associated with character-based terminal interfaces on a mainframe. Yet large institutions cannot break away from them. The reason is simple: in environments where raw data processing is central to the core business, these are still the right tools to handle large volumes efficiently and quickly. Of course, IBM and its competitors continue to update the platform, and developers can now also use Linux, Node.js, Python, Docker, and Kubernetes. But when it comes to ‘linguistic’ issues, the needs that mainframes address are still best met by the traditional programming languages, about which there is nothing obsolete.
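To make the point concrete, here is a minimal COBOL sketch (the program name and values are purely illustrative, not taken from any real system) showing the kind of exact fixed-point decimal arithmetic the language was designed for, using packed-decimal (COMP-3) fields of the sort that dominate financial workloads:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. INTEREST.
      * Hypothetical sketch: exact decimal arithmetic with packed
      * (COMP-3) fields -- no binary floating-point rounding drift.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-BALANCE      PIC S9(11)V99 COMP-3 VALUE 1500000.25.
       01  WS-RATE         PIC S9V9(4)   COMP-3 VALUE 0.0350.
       01  WS-INTEREST     PIC S9(11)V99 COMP-3.
       01  WS-EDITED       PIC $$$,$$$,$$$,$$9.99.
       PROCEDURE DIVISION.
           COMPUTE WS-INTEREST ROUNDED = WS-BALANCE * WS-RATE.
           ADD WS-INTEREST TO WS-BALANCE.
           MOVE WS-BALANCE TO WS-EDITED.
           DISPLAY "NEW BALANCE: " WS-EDITED.
           STOP RUN.
```

The PICTURE clauses declare exactly how many digits sit before and after the decimal point, so monetary values are computed and stored exactly; that precision, not nostalgia, is a large part of why the language endures.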

Mainframes process data in a manner quite different from a PC (or the web). The emphasis for the mainframe is the data itself, not its presentation. For example, to collect basic data points such as name, age, SIN, etc. in a web form, an electronic Tower of Babel of languages is needed: HTML, JavaScript, CSS, backgrounds, photos, and so forth. All of this is then processed by a server-side program that handles the data as it arrives from the client, such that to transfer a few dozen bytes of personal data, more than a few megabytes are needed to manage them. On a mainframe, by contrast, transferring data needs no descriptive scaffolding taking up huge amounts of code, because everything around it is optimized and managed by dedicated hardware and software subsystems. Thus, companies that rely on processing millions of transactions per second can only do this effectively by using mainframes. The fact that web servers are used to communicate with clients overshadows the fact that the crucial transactions, millions of them, are run by the mainframe. Moreover, in the PC or web world, software needs frequent updates and major overhauls every decade or so, and it is essential to ensure that new software remains compatible with multiple iterations of the previous ones to avoid losing customers, data and money. In contrast, mainframe software written 40 or 50 years ago can still be run without problems: backward compatibility is always guaranteed.
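The contrast can be sketched with a COBOL record layout (a hypothetical example; the field names are illustrative). Because every field has a fixed offset and length, the record itself is the wire format, with no markup or parsing layer in between:

```cobol
      * Hypothetical copybook: a fixed-layout customer record.
      * Each field has a known offset and length, so the record
      * travels as-is -- no HTML, JSON, or per-message parsing.
       01  CUSTOMER-REC.
           05  CUST-ID         PIC 9(9).
           05  CUST-NAME       PIC X(30).
           05  CUST-AGE        PIC 9(3)        COMP-3.
           05  CUST-SIN        PIC 9(9).
           05  CUST-BALANCE    PIC S9(9)V99    COMP-3.
```

A transaction monitor like CICS can hand such a record to a program and receive it back as a compact block of bytes, which is one reason the same few dozen bytes that cost megabytes of web machinery cost almost nothing on the mainframe side.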

The Problem of Skills

Thus, if there is any problem with mainframes, it is not one related to their flexibility or their ability to fulfill essential tasks reliably. It is a problem of skills. The skills problem derives almost directly from the popular perception that mainframes are outdated and, especially, that they require special and rare abilities to manage, limiting their attraction for young programmers and developers, who avoid specializing in the field. But there is a caveat to this trend, as emerges from a Kyndryl study. At Kyndryl, mainframe programming remains the foundation: 2,100 people in India remain enthusiastic about mainframes and address customers’ most demanding challenges. They cover different fields and experiences so as to acquire the skills to work in the cloud, but the work teams are built on a foundation of mainframe skills, and everyone who works on the cloud must also be able to function with mainframes in order to build transversal skills.

Kyndryl encourages close-to-retirement employees, who have spent most of their careers working on mainframes, to pass on their legacy to new generations who are eager to learn. Working this way, there isn’t a big skills problem, but it’s important that clients work this way too. And, as mentioned above, it has been possible to run Linux on mainframes for a while now. Using Linux makes the platform more energy efficient, and that has sparked a wave of interest from companies. Just this spring there was a case study: the Port of Barcelona used LinuxONE mainframes to consolidate its applications and achieve better environmental outcomes. The other aspect, obviously, is that of skills: Linux skills already exist elsewhere, and that makes mainframes easier to manage.

Mainframes are high-performance computers, able to process large volumes of data and provide reliable, scalable, and secure services in a variety of fields, whether in government or business, wherever reliability and security are vital. Yes, they are old in the sense that mainframes are one of the very first types of computers, the earliest examples having been built in the 1950s, and they are still widely used in critical industries (for that matter, jet engines are also old, yet they have evolved to become far more efficient and powerful and aren’t going away any time soon). Everything computer science has to offer today was first created on a mainframe; the mainframe is the very essence of computing, efficiency, robustness and scalability, based on reliable hardware that can grow and adapt to a business without interruption. It is no coincidence that stock exchanges, banks, insurance companies, telephone companies, airlines, energy companies, and governments have not given them up: mainframe machines can simply run continuously 24 hours a day without disruptions of any kind and handle enormous quantities of data traffic without fuss.

Legacy mainframe systems may seem completely anachronistic in 2024, but that is an entirely misplaced perception. It is clear that any company that wants to be ready to face the future will have to continue to use mainframes as an integral part of its information management infrastructure. Consequently, a very high level of attention and planning is required to ensure that new generations of programmers remain able to function within a mainframe environment, even while adapting the technology to address modern realities such as cyberattacks. More importantly, the persistent use of mainframes means that COBOL is still in fashion. The world’s banking, business and government systems are underpinned by COBOL, even though it is a programming language developed well over 50 years ago and few people currently know how to use it. COBOL has stood the test of time with great success, with billions of lines of code still in use today. The beauty of COBOL, just like the beauty of mainframes, lies in its stability and reliability, which allowed the code to be widely accepted and adopted. And, despite cloud computing, the mainframe market is alive and well, and as a result, so is COBOL.

4 thoughts on “COBOL and Mainframe Computing Skills might be Old, but they remain Essential”
  1. Good article. However, one thing that should be noted is that none of the companies listed in the second paragraph actually makes a true mainframe. The closest one is Unisys. They call it a mainframe but it’s actually an enterprise-class server that is based on Intel processors. As such, it does not contain the same internal structure. It does not use the mainframe’s EBCDIC character set without performance-expensive translation and does not use the mainframe’s native instruction set which means it cannot directly run binaries created on a mainframe without an abstraction layer (again, very expensive performance-wise).
    A few decades ago, IBM had actual competition in the mainframe hardware space with companies like Amdahl, Fujitsu, and Hitachi making plug-compatible versions of IBM mainframes. While this was excellent in theory, it was not sustainable in practice. IBM is formidable when it comes to making technology advances. Unfortunately, the competitors were not able to keep up. One by one, they dropped off until IBM was the only manufacturer of true mainframes.

  2. Those who never gained a real understanding of mainframe hardware architecture, and who never tried programming under mainframe operating systems and their specific compilers, will never understand what power has accumulated behind mainframe capacities and capabilities from the early 1960s until today, and how the transaction world relies on their processing power, still unsurpassed by other machines, not counting supercomputers dedicated to other R&D purposes.

  3. Alessandro, I agree with you that the perception that languages like COBOL are obsolete leads to the false belief that mainframes are outdated, yet there are excellent reasons why COBOL remains one of the best languages for mainframe tasks and competes easily with more recent languages that, while well suited to client-side tasks, are less suited to back-end data management.
    It is true that relatively few people know COBOL, but it is widely available, and tools like Micro Focus Enterprise Developer allow COBOL programs to be developed and debugged in Windows/Unix etc. environments with Visual Studio or Eclipse, just like Java, C++, etc. Tools like MANASYS Jazz that write COBOL for you (and JCL, JSON, C# etc.) provide an easy introduction to COBOL, major productivity gains, and powerful tools to assist with legacy modernization.
    If you or any reader want to discuss this further, please reply, or contact me directly. My contact details at
