It’s no secret that many enterprises are already heavily invested in their mainframe environments. Legacy systems are associated with millions of dollars of sunk costs, and financially minded teams are keen to ensure those investments maximize returns. After all, getting value out of the mainframe is harder these days. Assets can be difficult to access thanks to connectivity issues, syntactic and semantic mismatches at invocation, and the infamous legacy skills gap. However, organizations that adopt an API model to integrate their mainframes with modern applications add years of functional life and value to their existing technology. Indeed, API-enabling your mainframe means doing more with less.
API-enabling your mainframe connects disparate systems and disparate groups
To start, an API approach to the mainframe offers a way to bridge modern and legacy IT systems. APIs can bridge gaps between people too. For one, they allow both old-world and new-world developers and programmers to work in their own environments using the coding languages they know best.
At the center of the API model for mainframes is the application programming interface itself. Fundamentally, it is one or more lines of code that supply parameters as laid out in a published interface specification. The API drives a desired function in the underlying system software: a distinct, technical capability that represents a small part of a larger business application. It is a way for one component to call an activity provided by another, frequently a front-end component driving a back-end component or system of record, like the mainframe.
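To make that concrete, here is a minimal sketch in Python. All names here are hypothetical; a stub dictionary stands in for the mainframe's system of record, but the shape is the same: the front-end component supplies the parameters the interface specifies and receives structured data back, without knowing anything about the back end's internals.

```python
def get_account_balance(account_id: str) -> dict:
    """Hypothetical API: the caller supplies parameters as the interface
    specifies; behind it, the back end does the real work."""
    # In practice this would invoke a mainframe transaction; a stub
    # dictionary stands in for the system of record here.
    mainframe_records = {"ACCT-1001": 2500.00}
    return {"account": account_id, "balance": mainframe_records.get(account_id)}

# The front-end component needs only the interface, not the implementation.
result = get_account_balance("ACCT-1001")
```

The value of the model is that the caller and the implementation can evolve independently, as long as both honor the interface.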
Whether accomplished by third-party developers or in-house, APIs power the rapid deployment of new ideas. In fact, they help companies deliver value faster in the form of real-time data and analytics.
For example, a large German automobile manufacturer recently integrated its internal mainframe systems with the web. They took the integration to another level by extending it to the handheld devices that dealers use in the showroom. When working with clients, dealers provide real-time information through the use of APIs. Customers and dealers alike have no idea that they’re interacting with legacy systems, and mainframe programmers don’t know that they’re talking to modern apps. API-enabling their mainframe meant that these two different groups of people could communicate seamlessly.
Starting an API-economy
In fact, API-enabling is the “how” in modern linkage techniques. The method is so fundamental in connecting various technologies that the API model leads to an “API economy”: an ecosystem of suppliers and developers who expose calls to their technology for other suppliers and developers to use.
Indeed, the API model is perfect for a company using their existing assets to be part of a broader, multi-channel solution. But the key for mainframe users is to place an abstraction layer between the mainframe and modern technologies that can make calls into and out of the mainframe itself. With the API effectively in place and calling the mainframe, the data produced from those calls can be used to build widgets, web pages, mobile and desktop applications, and IoT devices.
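The abstraction layer described above can be sketched as a thin adapter. This is an illustrative Python example with hypothetical names (the transaction name, record layout, and transport are all invented for the sketch): modern applications call the adapter's methods, and the adapter translates between modern data structures and the legacy fixed-width record the mainframe expects.

```python
class StubTransport:
    """Stand-in for the real mainframe link (MQ, a CICS bridge, etc.)."""
    def call(self, program: str, record: str) -> str:
        # Pretend the mainframe answered with a fixed-width customer record.
        return record + "JANE DOE".ljust(30)

class MainframeAdapter:
    """Abstraction layer: modern apps call these methods and never
    touch the mainframe directly."""
    def __init__(self, transport):
        self._transport = transport

    def lookup_customer(self, customer_id: str) -> dict:
        # Map the modern request onto the legacy fixed-width layout...
        raw = self._transport.call("CUSTINQ", customer_id.rjust(10, "0"))
        # ...then map the reply back into a structure web, mobile,
        # and IoT front ends expect.
        return {"id": raw[0:10].lstrip("0"), "name": raw[10:40].rstrip()}

adapter = MainframeAdapter(StubTransport())
customer = adapter.lookup_customer("1001")
```

Because the front ends see only the adapter, the legacy code behind it can stay unchanged while new channels are added on top.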
Recently, the world’s largest private educational testing and assessment organization needed a secure, modern way to process credit cards. The challenge came from a scheduling application living on the mainframe. The nonprofit used a no-code integration tool to create APIs for their legacy applications that called out to external APIs from the mainframe. With minimal changes to their legacy code, they were able to complete this task quickly. Users felt the impact immediately: credit card transactions processed securely in real time.
Where to start?
The answer is finding the right integration software. When a mainframe user recognizes the potential value of the API approach, they pick a modernization tool to build the APIs that sit between the mainframe and, say, that new app being launched.
The integration layer provides connectivity between requests and back-end systems, handling any format and mapping requirements. Back-end components deliver the services. The integration also securely authenticates callers and protects the system-of-record layer, satisfying systems-management requirements for proper governance.
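The format and mapping work mentioned above usually means translating between modern JSON-style payloads and the fixed-width records legacy programs consume. A minimal Python sketch, with an invented record layout (10-character account field followed by a 12-digit amount in cents), illustrates the round trip an integration tool performs on every call:

```python
def to_record(payload: dict) -> str:
    """Pack a modern payload into a fixed-width record (hypothetical layout:
    account left-justified in 10 chars, amount as 12 zero-padded digits of cents)."""
    return payload["account"].ljust(10) + f"{int(round(payload['amount'] * 100)):012d}"

def from_record(record: str) -> dict:
    """Unpack a fixed-width record back into a modern structure."""
    return {"account": record[:10].rstrip(), "amount": int(record[10:22]) / 100}

record = to_record({"account": "ACCT-42", "amount": 19.99})
```

In a commercial integration tool this mapping is typically generated from the legacy program's data definitions rather than hand-written, but the translation it performs is the same.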
To learn more, download the whitepaper.
Director of Product Evangelism
Don Spoerke is the Director of Product Evangelism at GT Software. He is a 25-year veteran in the enterprise modernization space. Don collaborates with an impressive list of FORTUNE companies to intelligently integrate legacy mainframe assets for new business application initiatives.