
The Mainframe is a Modern Platform

Updated: May 20, 2020

Mainframes continue to excel at traditional tasks such as high-speed transaction processing and running large batch workloads. They can also easily handle large amounts of data (multiple terabytes), such as that needed for analytical processing and machine learning. Indeed, the mainframe supports both traditional and modern applications and data management requirements, making it a key component of the IT infrastructure for most large enterprises.

However, the mainframe carries a long legacy of applications and uses built over a history that stretches back more than 50 years. No other computing platform has such a rich heritage. This means that some mainframe applications require modernization. This effort is often referred to as mainframe modernization, but a better term is mainframe application modernization, because the mainframe itself is already a modern platform.

Why is the mainframe modern? Well, like all technology and software, the mainframe and its ecosystem have been continuously updated and enhanced throughout their lifetime. If your image of a mainframe is a hulking, water-cooled machine with blinking lights and spinning tapes, you need to refresh that concept.

Today’s latest mainframe offering is the IBM z15, which is housed in a 19-inch rack, meaning it can be integrated into a standard server rack. The latest z15 chip comprises 9.2 billion transistors, 15.6 miles of wires, and 26.2 billion wiring connections, and IBM claims a single z15 server can process 1 trillion web transactions per day. Nor is there a more reliable computing platform. IBM used to talk about five 9s of availability (that is, 99.999% uptime), but with the new IBM z15, IBM can deliver seven 9s (that is, 99.99999%)! That works out to about 3.16 seconds of downtime per year, or only 60.48 milliseconds of downtime per week. Now that is impressive!
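You can check those downtime figures yourself: an availability percentage translates directly into a downtime budget for any period. Here is a quick sketch in Python (using a 365.25-day year, which is how the per-year figure above works out):

```python
# Downtime budget implied by an availability percentage.
SECONDS_PER_YEAR = int(365.25 * 24 * 60 * 60)  # 31,557,600
SECONDS_PER_WEEK = 7 * 24 * 60 * 60            # 604,800

def downtime_seconds(availability_pct, period_seconds):
    """Seconds of allowed downtime per period at the given availability."""
    return period_seconds * (1 - availability_pct / 100)

# Five 9s: a little over five minutes of downtime per year.
print(round(downtime_seconds(99.999, SECONDS_PER_YEAR), 1))    # 315.6 seconds
# Seven 9s: about 3.16 seconds per year, or 60.48 ms per week.
print(round(downtime_seconds(99.99999, SECONDS_PER_YEAR), 2))  # 3.16 seconds
print(round(downtime_seconds(99.99999, SECONDS_PER_WEEK) * 1000, 2))  # 60.48 ms
```

Two extra 9s shrink the annual downtime budget by a factor of 100, which is why the jump from five 9s to seven 9s is such a big deal.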

Furthermore, COBOL is not the only programming language for mainframe development. Today, many mainframe applications are being written in languages like Java and C, and many new mainframe applications expose their functionality through JSON and REST APIs. The z/OS Connect solution provides a common and consistent interface for mobile access to the mainframe environment.
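To make that concrete, here is what mainframe access looks like from the client side of such an interface: the caller sends ordinary JSON over HTTP, and z/OS Connect maps the request onto the underlying transaction. This is a minimal sketch in Python; the endpoint URL and field names are illustrative assumptions, not an actual z/OS Connect API:

```python
import json
import urllib.request

# Hypothetical endpoint exposed through z/OS Connect; the URL and the
# payload fields below are illustrative, not part of any real installation.
URL = "https://zosconnect.example.com/accounts/balance"

def build_request(account_id):
    """Package a balance inquiry as the JSON body a REST-enabled service expects."""
    body = json.dumps({"accountId": account_id}).encode("utf-8")
    return urllib.request.Request(
        URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The client never sees CICS or COBOL; it just builds and sends an HTTP request.
req = build_request("0012345678")
print(req.get_method())                   # POST
print(json.loads(req.data)["accountId"])  # 0012345678
```

The point is that nothing here is mainframe-specific: a mobile or web developer writes the same code they would write against any REST service.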

So, the platform is modern and it enables modern development. But it does have half a century of applications, written over that same period, that it must support. That the mainframe can still run old code is a good thing though, right? I mean, have you ever tried to run a Windows 3 app on Windows 10?

Nevertheless, many older applications need to be modernized, for example, to create a modern web or GUI interface instead of the ubiquitous green screen interface common for older mainframe applications.

Many different methods can be undertaken, depending on the requirements, skills, and timeline for each mainframe application modernization project. Indeed, according to Don Dejewski, Director of Mainframe Product Development for Ensono (an IT services provider with mainframe modernization expertise), “Some clients make the decision to replace applications, others are re-writing applications, and there are those that fall in-between these two.”

One example of an “in-between” project is where a client keeps the core of the application and provides access to its functionality via a mobile application. This is typically accomplished using APIs.

And, of course, there are many solutions available for refreshing the interface of online applications that use CICS screens or TSO/ISPF panels.

Gartner[1] has identified seven different approaches that can be taken to modernize legacy systems: encapsulation, rehosting, replatforming, refactoring, rearchitecting, rebuilding, and replacing. But few organizations are entirely replacing mainframe applications, opting instead for one of the other six approaches, because of the mainframe’s ability to provide high levels of availability and resiliency for mission-critical applications and workloads.

The mainframe will continue to deliver the things it is famous for: security, resiliency, high performance, and throughput. But we must adapt the way in which we use it, build applications for it, and manage it, making it a more agile platform for mission-critical data and apps. Larger organizations are integrating the mainframe and its processes with management and orchestration across a hybrid multicloud environment. At the same time, the mainframe is being incorporated into projects running modern workloads using technologies like machine learning, other AI techniques, and blockchain.

Today’s challenge for modern enterprises is to integrate, manage, and utilize a complex, heterogeneous system of different platforms and technologies. And the mainframe is a vital component, because you can’t just abandon existing mission-critical workloads; after all, your enterprise relies on them to run the business.

The mainframe is here to stay, and wise organizations are evaluating their mainframe application portfolios to determine how best to modernize those applications for the expectations of today’s users and customers.

Author Bio

Craig S. Mullins is president & principal consultant of Mullins Consulting, Inc. He has over three decades of experience in all facets of database systems development including creating and teaching database classes, systems analysis and design, data analysis, database administration, performance management, and data modeling. Craig has worked with Db2 on the mainframe since Version 1 and he has written three books on Db2 and database administration, the most recent being A Guide to Db2 Performance for Application Developers.


