On the surface, mainframe architecture seems relatively simple: A centrally located computer processes data through an input/output subsystem and stores its computations in memory. At the other end of the system, printers and terminals communicate with the mainframe through dedicated communication protocols.
For all of its apparent simplicity, mainframe architecture can be extraordinarily complex. Mainframes process thousands of transactions per second and require an extensive infrastructure to house and maintain them. The hundreds, or perhaps thousands, of terminals (think of a bank’s ATM network) require a message routing scheme that can quickly prioritize and route transaction requests.
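The prioritization scheme described above can be sketched, in greatly simplified form, as a priority queue that dispatches the most urgent transaction requests first. This is an illustrative model only, not an actual mainframe routing implementation; the class name, priority scale, and terminal IDs are assumptions for the example.

```python
import heapq

class TransactionRouter:
    """Toy model of prioritized transaction routing (illustrative only)."""

    def __init__(self):
        self._queue = []  # min-heap of (priority, sequence, request)
        self._seq = 0     # tie-breaker preserves arrival order at equal priority

    def submit(self, priority, terminal_id, payload):
        # Lower priority number = more urgent (hypothetical convention).
        heapq.heappush(self._queue, (priority, self._seq, (terminal_id, payload)))
        self._seq += 1

    def dispatch(self):
        # Return the next request to route, or None if the queue is empty.
        if not self._queue:
            return None
        _, _, request = heapq.heappop(self._queue)
        return request

router = TransactionRouter()
router.submit(2, "T-17", "balance inquiry")
router.submit(0, "T-04", "cash withdrawal")
router.submit(1, "T-09", "funds transfer")

print(router.dispatch())  # ('T-04', 'cash withdrawal') -- most urgent first
```

A real message routing subsystem would also handle routing tables, session management, and failure recovery, but the core idea of dequeuing by priority is the same.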
Balancing costs and benefits
Large corporations still use mainframes because they provide powerful business intelligence and support many high-bandwidth applications, such as mobile banking, online reservation systems, and telecommunications. Mainframes are also more secure, reliable and scalable than distributed networks. Unfortunately, the cost of maintaining a mainframe can be very high. There are three primary reasons for this:
- The annual software licensing fees can be a real financial burden, and the lack of flexible licensing creates compatibility issues for organizations that require nonproprietary add-ons and updates. Workarounds often require developing expensive custom applications to accommodate new requirements.
- Maintenance is expensive because the mainframe is often the nexus of a large-scale infrastructure that requires expert care and precise environmental conditions. Older mainframes are often housed in cabinets or rooms that maintain temperature and humidity at very exacting levels.
- Hardware and software complexity also requires hiring expensive technicians and programmers throughout the life of the system, a problem compounded by the potential shortage of skilled personnel to implement the solutions and maintain these powerful systems. Many qualified COBOL programmers have reached or will soon reach retirement age. Perhaps offshoring can address some of these concerns, but the lack of adequate educational resources could still create a shortage of qualified personnel.
Given the IT mandate to “do more with less” and the strategic imperative to become more customer-centric and agile through a digital transformation, CIOs are under pressure to migrate their legacy applications to open systems and the cloud. Making decades of customer buying habits, preferences, purchase patterns, and other information available to marketing, sales, manufacturing, planning and customer service can have a profound impact on loyalty and lifetime value. Getting there, however, is the challenge.
Open systems and cloud platforms
Open systems conform to well-documented standards and avoid the proprietary issues so prevalent with mainframes. Companies can evolve their capabilities quickly through add-on technology, with the freedom to leave development time and costs to another party. In most cases, organizations need only purchase, install, and periodically upgrade the add-on. Apart from a few configuration tasks, little or no programming is required. All of these factors can lead to significant savings and an agile response to paradigm shifts.
The move to open systems also simplifies the journey to the cloud, which adds even greater value to these applications and databases. A common legacy modernization approach often begins with a move to open systems housed locally, and once the migration has proved stable, it is moved to the cloud.
However, migration to an open system is not without its considerations. Application licenses, for example, are more expensive for mainframe environments, but they are often easier to administer. Coming from this environment, the purchase and installation of multiple third-party licenses for each member of the organization can seem daunting.
The move from mainframes to open systems platforms and the cloud requires careful planning. While the potential exists to reduce costs and build compatibility, its realization depends on making fact-based, rational decisions. The first step in doing this is conducting a thorough discovery and examination of every aspect of your legacy portfolio, including:
- Test scripts
Then, focus on each application’s purpose, importance, and requirements in order to rate each application based on its value to the business versus how it fits within your strategic technical architecture. Once rated, you can easily plot each application on a graph to provide a visual representation of each application’s state and assist in developing your plan of attack.
Using this information, you’ll be able to determine which applications are the best targets for modernization vs. elimination, as well as model modernization approaches to aid in selecting the right path for each application. Your analysis should assess the cost, risk and impact of each alternative until it identifies the right plan of action.
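The rating exercise above can be sketched as a simple scoring function: rate each application on business value and technical fit, then place it in one of four quadrants that suggests a disposition. The quadrant names, the 1–10 scale, the threshold, and the sample portfolio are illustrative assumptions, not a prescribed method.

```python
def disposition(business_value, technical_fit, threshold=5):
    """Map 1-10 ratings onto a simple four-quadrant plan of attack."""
    if business_value >= threshold and technical_fit >= threshold:
        return "maintain"    # high value, good fit: keep and invest
    if business_value >= threshold:
        return "modernize"   # high value, poor fit: prime migration target
    if technical_fit >= threshold:
        return "tolerate"    # low value, good fit: leave as-is for now
    return "eliminate"       # low value, poor fit: retire

# Hypothetical portfolio: app name -> (business value, technical fit)
portfolio = {
    "order-entry":   (9, 3),
    "report-writer": (2, 8),
    "hr-archive":    (1, 2),
    "billing":       (8, 9),
}

for app, (value, fit) in portfolio.items():
    print(f"{app}: {disposition(value, fit)}")
```

In practice the scoring would be weighted across many criteria (cost, risk, compliance, roadmap fit), but even this coarse two-axis view makes the modernization-versus-elimination candidates easy to see at a glance.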
Craig Marble is the senior director of Astadia’s legacy modernization practice. Craig has spent over twenty-five years in the information technology industry, most of which has been focused on legacy modernization projects. For more information about Astadia, a premier cloud consultancy for businesses expanding to the cloud, visit http://www.astadia.com and follow Astadia at @AstadiaInc, Facebook/AstadiaInc and LinkedIn/Astadia. For more on legacy modernization, visit www.astadia.com/insights/.