What It Actually Solves (And What It Doesn't)
Every few months the industry rediscovers mainframes. This time the catalyst is AI – specifically, the idea that large language models can finally decode decades of COBOL and unlock the "exit strategy" that enterprises have been chasing since the 1990s.
It's a compelling story. And like most compelling stories about legacy modernization, it's about half right.
Let's give credit where it's due. The code comprehension problem is genuinely better now.
COBOL applications written in the 1970s and 1980s often have no living author, sparse documentation, and business logic so deeply embedded in the code that untangling it requires reading thousands of lines of procedure division logic. AI tools can now surface intent, generate documentation, map data flows, and explain what a program does faster than any human review process.
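To make that concrete, here is a minimal sketch of one mechanical piece of that comprehension work: extracting a PERFORM call graph from a COBOL source file. It is deliberately naive (no COPY expansion, no PERFORM THRU, no EXEC CICS/SQL handling), and the file name is hypothetical; the point is only to show the kind of structural map these tools build before layering explanation on top.

    # Minimal sketch: extract a PERFORM call graph from a COBOL source file.
    # Naive on purpose -- no COPY expansion, no PERFORM THRU, no GO TO,
    # no EXEC CICS/SQL handling. File name and layout are hypothetical.
    import re
    from collections import defaultdict

    # A paragraph name starts in Area A (column 8) and ends with a period.
    PARAGRAPH = re.compile(r"^.{7}([A-Z0-9][A-Z0-9-]*)\s*\.\s*$")
    PERFORM = re.compile(r"\bPERFORM\s+([A-Z0-9][A-Z0-9-]*)", re.IGNORECASE)
    NOT_TARGETS = {"UNTIL", "VARYING", "TIMES", "WITH", "TEST"}

    def perform_graph(path):
        graph = defaultdict(set)          # paragraph -> paragraphs it PERFORMs
        current = None
        for line in open(path, encoding="latin-1"):
            if len(line) > 6 and line[6] in ("*", "/"):  # comment indicator, col 7
                continue
            m = PARAGRAPH.match(line.rstrip())
            if m:
                current = m.group(1)
            elif current:
                graph[current].update(
                    t.upper() for t in PERFORM.findall(line)
                    if t.upper() not in NOT_TARGETS)
        return graph

    for para, targets in sorted(perform_graph("PAYCALC.cbl").items()):
        print(f"{para} -> {', '.join(sorted(targets))}")

The parsing is the easy part; the value the AI adds is turning a graph like this, plus the data definitions and data flows around it, into an explanation a human can act on.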
That comprehension gain matters because the assessment phase of modernization projects has historically been brutally expensive. Organizations would spend six to twelve months and millions of dollars just trying to understand what they had before they could decide what to do with it. AI compresses that timeline significantly.
Lower assessment cost means more organizations can get to a serious go/no-go decision without betting the farm on a multi-year program before they understand the scope.
That is real progress. It's just not the whole story.
Here is the part that most AI modernization pitches quietly skip: COBOL is not the only language running on the mainframe. It is not even the only language running inside a single COBOL application.
Behind every COBOL transaction is an ecosystem of supporting infrastructure that nobody puts in the marketing slides:
Assembler. The system exits, the performance-critical subroutines, the I/O routines, the SVC handlers, the cross-memory services. Assembler code sits at the lowest level of the mainframe stack and touches everything. It is harder to read than COBOL by an order of magnitude. Most AI tools trained on modern languages fundamentally struggle with it: the idioms are different, the macros expand in ways that require understanding the macro library, and the register conventions are implicit knowledge that lives in the heads of systems programmers who learned them decades ago.
PL/I. Older than many of the developers being asked to modernize it, PL/I is still running critical workloads at insurance companies, government agencies, and financial institutions. It has a different structure than COBOL, different data types, different I/O models. AI tools that were trained primarily on COBOL often produce misleading analysis when pointed at PL/I because they pattern-match against the wrong language conventions.
CLISTs and REXX. The automation and glue code that holds the operational side of the mainframe together. CLIST procedures and REXX execs control ISPF dialogs, automate operator tasks, manage dataset allocation, drive batch job submission, and implement site-specific workflows that nobody documented because "everyone knows how it works." Except the people who knew have retired. These are not application code – they are operational infrastructure. Modernizing the COBOL without understanding the REXX and CLIST layer that supports it is like renovating a house while ignoring the plumbing.
JCL. Every batch process, every started task, every system procedure is defined in JCL. The relationships between jobs – which datasets flow from one step to the next, which condition codes trigger which paths, which GDG generations feed which downstream processes – are encoded in JCL that was written over decades by dozens of different people with different conventions. AI can parse JCL syntax. Understanding the intent behind a complex JCL procedure with nested PROCs, symbolic parameters, and conditional execution requires contextual knowledge that syntax parsing does not provide. (A sketch of the dataset-flow piece of this problem appears just after this list.)
ISPF panels and skeletons. The user interface layer for online applications. Panel definitions, message members, skeleton JCL, file tailoring – these define how users interact with the system and how batch jobs are dynamically generated. They are tightly coupled to the CLIST/REXX layer and to the application logic.
Utilities. IDCAMS, IEBGENER, IEBCOPY, DFSORT, ICETOOL, SMP/E procedures – the standard IBM utilities that are called from JCL and controlled by utility-specific control statements. Each utility has its own syntax, its own conventions, and its own failure modes. A modernization assessment that inventories the COBOL programs but not the utility steps in the JCL is missing half the picture.
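To ground the JCL point from a few paragraphs back, here is a minimal sketch of the mechanical core of dataset-flow mapping: which steps create a dataset and which steps consume it. The assumptions are loud ones – in-stream JCL only, one DD per line, no PROC expansion, no symbolics, a made-up member name – and resolving exactly those assumptions is where the contextual knowledge the syntax parser lacks becomes unavoidable.

    # Minimal sketch: map which JCL steps create a dataset and which read it.
    # Assumes in-stream JCL, one DD per line, no PROC expansion, no symbolics,
    # no continuations -- resolving those is where real analysis gets hard.
    import re
    from collections import defaultdict

    STEP = re.compile(r"^//(\S+)\s+EXEC\b", re.IGNORECASE)
    DD = re.compile(r"^//\S*\s+DD\s.*\bDSN=([A-Z0-9.@#$()+\-]+)", re.IGNORECASE)
    NEW = re.compile(r"DISP=\(?NEW", re.IGNORECASE)

    def dataset_flow(path):
        producers = defaultdict(list)     # dataset -> steps that create it
        consumers = defaultdict(list)     # dataset -> steps that read it
        step = None
        for line in open(path):
            if line.startswith("//*"):    # JCL comment statement
                continue
            if m := STEP.match(line):
                step = m.group(1)
            elif step and (m := DD.search(line)):
                target = producers if NEW.search(line) else consumers
                target[m.group(1)].append(step)
        return producers, consumers

    producers, consumers = dataset_flow("DAILYRUN.jcl")
    for dsn in sorted(set(producers) & set(consumers)):
        print(f"{dsn}: {producers[dsn]} -> {consumers[dsn]}")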
The point is not that AI cannot help with these. Some tools are beginning to address JCL and REXX analysis. The point is that the mainstream narrative – "AI reads your COBOL and tells you what it does" – dramatically understates the scope of what actually needs to be understood before a modernization decision can be made responsibly.
A typical mainframe application at a large enterprise is not a collection of COBOL programs. It is a COBOL application supported by assembler subroutines, automated by REXX and CLISTs, defined by JCL procedures, presented through ISPF panels, dependent on utility processing, and connected to subsystems like CICS, IMS, DB2, and MQ that each have their own configuration, tuning, and operational knowledge.
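As a rough illustration of what inventorying that whole stack means in practice, here is a heuristic sketch that classifies offloaded source members by language. Every fingerprint in it is an illustrative guess, and a real inventory would work from the actual libraries, the catalog, and the scheduler rather than a directory of files – but it makes the scope point: COBOL is one bucket among many.

    # Heuristic sketch: classify offloaded source members by language so the
    # inventory covers the whole stack, not just the COBOL. The fingerprints
    # and directory layout are illustrative; a real inventory works from the
    # actual libraries and needs much stronger signals.
    import re
    import sys
    import pathlib
    from collections import Counter

    FINGERPRINTS = [                      # (label, pattern) tried in order
        ("JCL", re.compile(r"^//\S+\s+(JOB|EXEC|DD)\b", re.M)),
        ("REXX", re.compile(r"/\*\s*REXX", re.I)),
        ("CLIST", re.compile(r"^\s*PROC\s+\d", re.M)),
        ("COBOL", re.compile(r"IDENTIFICATION\s+DIVISION", re.I)),
        ("PL/I", re.compile(r"\bPROC(EDURE)?\s+OPTIONS\s*\(\s*MAIN", re.I)),
        ("Assembler", re.compile(r"^\S*\s+(CSECT|AMODE|RMODE)\b", re.M)),
        ("ISPF panel", re.compile(r"^\)(ATTR|BODY|INIT|PROC)\b", re.M)),
    ]

    def classify(text):
        for label, pattern in FINGERPRINTS:
            if pattern.search(text):
                return label
        return "unknown"

    counts = Counter()
    for member in pathlib.Path(sys.argv[1]).rglob("*"):
        if member.is_file():
            counts[classify(member.read_text(errors="replace"))] += 1
    print(dict(counts))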
Understanding the COBOL is step one. There are nine more steps.
Most failed modernization projects didn't fail because the code was unreadable. They failed for reasons AI doesn't touch:
The business rules are the COBOL. In many organizations, the mainframe application isn't just running the business – it is the definitive record of how the business works. Edge cases, regulatory accommodations, decades of patches for situations nobody remembered to document. Decoding the code tells you what the program does. It doesn't tell you whether the target platform will handle those edge cases correctly under production load.
The dependency map is always worse than you think. A typical mainframe application at a large financial institution isn't a standalone program. It's connected to dozens of upstream data feeds, downstream report consumers, batch job schedules, and interfaces that were built over thirty years by people who are no longer with the company. AI can help map some of this. It can't negotiate with the downstream system owners or absorb the integration testing cost.
It's a business transformation, not a technical project. The organizations that succeed at modernization do so because they treat it as organizational change that happens to involve technology. The ones that fail treat it as a technology project that will somehow not disturb the organization. No amount of AI capability changes that dynamic.
Here's the part of the conversation that doesn't get enough airtime: the mainframe infrastructure itself is genuinely exceptional.
When people talk about "exiting the mainframe," the implicit assumption is that the destination – cloud-native, distributed, microservices – is obviously better. That assumption deserves more scrutiny than it usually gets.
Modern mainframes handle transaction volumes that most distributed architectures struggle to match. The RAS characteristics – reliability, availability, serviceability – reflect decades of engineering specifically optimized for workloads where failure is not an option. Cost-per-transaction at scale, when calculated honestly including the cloud infrastructure, licensing, operational complexity, and engineering labor required to achieve equivalent throughput and availability, often favors the mainframe more than the exit advocates acknowledge.
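One way to force that honesty is to put every cost for both platforms over the same per-transaction denominator. The sketch below does nothing clever, and every number in it is an invented placeholder rather than a claim about real costs; its only job is to make explicit the terms – licensing, operations, and engineering labor, not just infrastructure – that exit business cases tend to leave out.

    # Placeholder arithmetic only: every figure below is an invented
    # illustration, not a benchmark. Substitute audited numbers for both
    # platforms before drawing any conclusion.
    def cost_per_transaction(infra, licensing, ops_labor, engineering,
                             annual_transactions):
        """All cost inputs are annual totals; result is cost per transaction."""
        return (infra + licensing + ops_labor + engineering) / annual_transactions

    txns = 10_000_000_000                 # 10B transactions/year, hypothetical

    mainframe = cost_per_transaction(
        infra=4_000_000, licensing=6_000_000,
        ops_labor=2_000_000, engineering=1_000_000,
        annual_transactions=txns)

    target = cost_per_transaction(
        infra=5_000_000, licensing=2_000_000,
        ops_labor=3_000_000, engineering=4_000_000,  # migration-era teams
        annual_transactions=txns)

    print(f"mainframe ${mainframe:.5f}/txn vs target ${target:.5f}/txn")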
This isn't an argument for keeping COBOL forever. It's an argument that the destination-platform comparison should be done rigorously before a migration program is greenlit – and in practice, it usually isn't. The decision to exit is often driven by talent availability concerns (we can't hire COBOL developers) or vendor pressure, not by a genuine engineering analysis of what the replacement platform will cost to operate at equivalent service levels.
AI makes it easier to read the legacy code. It doesn't make the target platform better, and it doesn't make the comparison easier to do honestly.
What I find more interesting than the exit conversation is the modernize-in-place conversation.
AI tools that wrap COBOL applications in modern APIs, generate documentation that reduces the knowledge-dependency risk, improve observability, and enable developers who don't know JCL to contribute to mainframe systems – these create real value without requiring a bet-the-company migration program.
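As a sketch of what that wrapping can look like, assuming the COBOL transaction has already been exposed over REST (for example via something like z/OS Connect), with an invented URL and invented field names:

    # Hypothetical sketch of the wrap-don't-rewrite pattern: a thin modern API
    # in front of an unchanged COBOL transaction. Assumes the transaction is
    # already REST-enabled (e.g. via something like z/OS Connect); the URL and
    # field names are invented for illustration.
    from flask import Flask, jsonify
    import requests

    app = Flask(__name__)
    GATEWAY_URL = "https://zos.example.com/services/getPolicy"  # hypothetical

    @app.route("/api/v1/policies/<policy_id>")
    def get_policy(policy_id):
        # Delegate to the existing transaction, then translate its
        # host-format reply into the JSON shape modern consumers expect.
        upstream = requests.post(GATEWAY_URL,
                                 json={"POLICY-ID": policy_id}, timeout=5)
        upstream.raise_for_status()
        record = upstream.json()
        return jsonify({
            "policyId": policy_id,
            "status": record.get("POLICY-STATUS"),
            "premium": record.get("ANNUAL-PREMIUM"),
        })

    if __name__ == "__main__":
        app.run(port=8080)

The transaction itself is untouched; the wrapper buys observability, a modern contract for consumers, and optionality about what happens underneath it later.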
And critically, the same tools need to extend beyond COBOL to the full stack: generating documentation for assembler subroutines, mapping REXX automation workflows, analysing JCL dependency chains, and capturing the operational knowledge embedded in CLISTs and ISPF skeletons before the people who wrote them leave.
For many organizations, the right answer isn't "exit as fast as possible" or "stay forever." It's "reduce the risk and cost of the current state while building genuine optionality for the future." AI is genuinely useful for that – if applied to the full scope of what's actually running, not just the COBOL.
The silver bullet narrative is seductive because it offers a clean ending to a problem that's been expensive and embarrassing for decades. The reality is more nuanced: AI is a powerful tool that makes one hard part of modernization meaningfully easier, while leaving the harder parts – organizational, architectural, commercial – exactly as hard as they were before.
That's still worth something. It's just not the end of the story.
IMUAI analyses S0C7 abends, CICS failures, and batch job issues using runtime evidence – not just source code. It works with the full stack: COBOL, assembler, JCL, REXX, and system-level diagnostics.
Learn about IMUAI.

IM3270 is a modern 3270 terminal emulator for Linux. Free 60-day trial. Download IM3270 free.