Why consider migrating to COBOL Version 6.3 and what challenges await you?
Did you know…
COBOL is a 60-year-old programming language that powers the global economy. IBM studies reveal that roughly 25% of customers migrating to COBOL version 6.3 encounter migration problems because their COBOL programs process invalid data at run time, data that was viewed as “valid” before the migration. One client in the Distribution Industry had to identify, by hand, well over 600 programs that required changes due to COBOL call interface changes introduced in version 6.3.
Why should you upgrade to Version 6.3?
IBM Hardware Server Design’s tight integration with IBM Compiler Development
IBM’s Enterprise COBOL for z/OS version 4 release 2 was first announced on August 25, 2009. Back then, highlights included XML document parsing and validation, compiler message severity customization, the ability for COBOL user-defined words to include an underscore, and support for Java 5 and Java 6. COBOL continues to serve several industries quite well, and COBOL is responsible for the efficient, reliable, secure and unseen day-to-day operations of the world’s economy. So what is causing companies of all sizes to consider migrating off version 4 and onto IBM’s latest Enterprise COBOL compiler, version 6.3?
Enterprise COBOL version 6.3 introduces a new code generator, and this provides an opportunity for you to produce better-optimized code to complement your mainframe hardware investment. Optimized code generation provides both MIPS and MSU savings. As a reminder, MIPS is a hardware capacity number and MSU is a software capacity number. IBM and third-party software providers price their solutions based on the MSUs consumed. To the extent you can generate optimized code that reduces your MSU consumption, that is a good thing!
So, one reason to migrate to this new compiler is to take advantage of potential MSU savings. Naturally there is a bit of recompilation work in front of you, as the Enterprise COBOL compiler option ARCH must be changed. The default value for ARCH is a function of your operating system release. For example, if you are running z/OS 2.1, the default ARCH setting would be 7. This value produces code for the IBM zEnterprise 196, a mainframe server announced in July 2010. This server has long since been withdrawn from the mainframe marketplace.
If you are curious about what that server looks like, consider the image that follows.
Figure 1 IBM zEnterprise 196 server, circa 2010
The latest ARCH level is 13, supporting IBM’s latest mainframe server, the IBM z15, when coupled with Enterprise COBOL version 6.3 running on z/OS 2.2 or higher.
The z15 server image follows.
Figure 2 IBM z15 server, current server platform
IBM Server Design and IBM Compiler Development work closely together. The fruits of their labor are exposed via the ARCH setting you specify when you compile your COBOL source programs. As stated earlier, if you are not specifying ARCH(n) as a compiler option, it will default to (7) for z/OS 2.1 clients; or (8) for z/OS 2.2 and above clients. Higher ARCH settings will enable the compiler to exploit features of the corresponding and all earlier hardware models in order to ensure your business logic achieves the best performance.
Did you know the average performance gain when moving from ARCH(8) to ARCH(13) is 23%? For those still running COBOL on old iron, you are leaving dollars, performance and scale on the table.
IBM Enterprise COBOL version 6.3 delivers advanced compiler support that allows you to specify ARCH(13) as a compiler option. This same COBOL version allows you to modernize your existing business-critical applications. Modernization enables the reuse of your proven business logic and allows you to deliver code enhancements more quickly, add modern GUIs to business-critical COBOL applications, or extend them to work with web, cloud or mobile infrastructures.
Migrating to COBOL version 6.3 allows you to build on your proven applications versus rewriting from scratch.
Risk Assessment as a Change Agent
As impressive as optimized code generation and associated MSU savings are, there is another reason why clients will migrate to Enterprise COBOL version 6.3. That reason centers on Risk Avoidance: clients cannot afford to be on an unsupported compiler. Given the authors’ experience working with clients on this very topic, Risk Avoidance is their primary reason to migrate.
The challenge clients across all industries have in this space is: “Where do we start?” Think about it for just a moment: even a small mainframe client in the Insurance Industry could easily have upwards of 15K COBOL version 4.2 source programs.
Compiler technology consists of Front-End and Back-End components. Front-End components center on parsing and syntax checking, which is not a big deal from a migration standpoint. The Back-End component introduces a new optimizer that changes the way machine code is generated. This is a big deal from a migration standpoint, and the last time IBM introduced a new Back-End was when VS COBOL II was introduced (circa 1985). Migrating from OS/VS COBOL to VS COBOL II was clearly a difficult migration.
Migration from Enterprise COBOL version 4.2 to version 6.3 is possible and not as difficult as that first migration decades ago. The decisions leading up to the program changes and testing are indeed far more critical to the success of your effort than anything else you can do to migrate to Enterprise COBOL v6.3.
What challenges await you?
There could be many, but most of them will not be new to you, some will be trivial if you’ve got reasonable control over your production source code, and some will fit right in with your current maintenance and enhancement activities. Let’s take a look:
Where do you start?
The impact of the Enterprise COBOL v6.3 compiler on your applications can be seen in two areas:
- Possible program changes that can be determined by the v6.3 compiler itself,
- Possible program changes that can only be determined at run time because they relate to data content that is handled differently in v6.3.
It is important to keep these two areas in mind when evaluating and planning for your migration project.
It helps to think about an Enterprise COBOL v6.3 migration as having three parts:
- Determine the scope and assess all in-scope programs for compile-time issues,
- Define a bounded pilot project to address run-time issues and build a migration project model for the remainder of the application suite,
- Conduct the main migration project according to the findings and model from the pilot.
Scope and Assess
The key business needs of your IT organization must be evaluated and prioritized in order to identify in-scope applications. Improving MSUs, decreasing batch run-time, streamlining your licensed software to eliminate redundant components, complying with regulatory requirements mandating that critical applications cannot operate on un-supported platforms or software – these are just a few of the issues your organization may be facing.
If optimizing MSUs is most important, then assessing the impact of a COBOL v6.3 migration would be focused on hot-spot programs, frequently run jobstreams, problematic batch window limitations and the suspects that run within them. Conversely, if the requirement is to ensure that no mission-critical applications are running on an unsupported compiler, then determining which apps are mission-critical, and which of them must be migrated, is a more straightforward task.
There is no “one size fits all” answer to defining the scope of the effort. It will depend on your organization’s goals. IBM has provided a solution for programs that can’t be re-compiled with v6. Their IBM Automatic Binary Optimizer for z/OS (ABO) offering allows the binaries from older compilers to be optimized to the fullest extent possible, so that performance gains can be realized even if the source code cannot be migrated.
Assessing your portfolio from this perspective is step 1. Compiling the in-scope programs to determine v6.3 compliance impact is next.
How do you effectively compile thousands of programs?
Or perhaps, should you compile your entire COBOL portfolio? If your impact assessment needs to be done across the board, then this challenge could be real. And it should be answered with automation. If your COBOL portfolio is large, even if your focus is limited to mission-critical applications, this could still involve hundreds or thousands of programs.
Processes can be built that will automatically submit compile jobs, with the outputs saved to datasets for further analysis. That analysis can include extraction of relevant data such as error messages into files or spreadsheets that can be sorted, queried and counted. It is even possible to automate some of the code corrections.
The extent of any automation efforts should be based on the number of programs in scope. And while building or buying automation to assist in the impact assessment may seem like overkill on a small body of code, the risk of errors introduced by manual processes is great.
The bottom line is that leveraging automation during the compilation phase is extremely valuable and should be given serious consideration.
What is the cost associated with Mass Compilation?
In addition to staff costs, the effects of submitting hundreds or thousands of compiles must be evaluated and techniques adopted to mitigate impact on mainframe throughput, production jobs and users.
Care must also be given to the actual monetary costs especially in chargeback environments.
Special job classes, dispatching priorities and schedules can be set up to allow the compiles to remain queued until periods of lighter activity, such as after nightly batch processes are complete, or on weekends or holidays. In today’s world of everything-as-a-service, organizations now exist that offer mainframe COBOL compilation in the cloud on a pay-as-you-go basis. This may also be an alternative, although the authors have no direct experience with this service or its relative cost.
How do you interpret the results?
The COBOL v6.3 compiler provides many options that flag potential migration issues. Rather than diving into a program and addressing each issue individually, a better approach is to collect all diagnostics from all compilations, and have them analyzed by a small group of COBOL programmers, enterprise architects and/or others who are responsible for the COBOL programming standards in your shop. This team should establish guidelines for how each type of issue should be handled. For example, most “Information” and many but not all “Warning” messages may be ignored. For these, along with the “Error” and “Severe” messages, the preferred approach should be documented for all programmers to follow when working to clean-compile the code. This up-front effort of perhaps 20 hours will save hundreds of hours of testing, debugging and re-testing later.
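One way to make those guidelines actionable is to capture the triage policy in a small lookup that both tooling and programmers share. The Python below is a hypothetical sketch: it assumes IBM’s message-severity suffixes (I, W, E, S, U) on message IDs such as IGYPS2121-S, and the specific actions are placeholders your standards team would define.

```python
# Hypothetical triage policy mapping Enterprise COBOL message severity
# codes to the handling rule agreed by the standards team. Severity
# letters follow IBM's convention: I(nformational), W(arning), E(rror),
# S(evere), U(nrecoverable). The action strings are placeholders.
TRIAGE_POLICY = {
    "I": "ignore",
    "W": "review",  # most may be ignored, but not all
    "E": "fix",
    "S": "fix",
    "U": "fix",
}

def triage(message_id: str) -> str:
    """Return the agreed action for a compiler message like 'IGYPS2121-S'."""
    severity = message_id.rsplit("-", 1)[-1]
    # Unknown severities fall through to human review rather than silence.
    return TRIAGE_POLICY.get(severity, "review")
```

A table like this keeps the “how should this message be handled” decision in one documented place rather than in each programmer’s head.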
Pilot Project
First, let us say that if the scope of your migration is small enough, a Pilot Project may not be necessary. How “small” is small? This is up to each organization to determine based on their deadlines, available resources and many other factors.
Assuming a pilot project is appropriate, the primary goal is to modify the programs determined from the assessment to require source code changes, and then test these programs to ensure that any run-time changes are identified, made and tested before the migrated programs go into production.
A secondary but very important goal of a pilot is to build a model for the remainder of the applications to be migrated. Understanding resource requirements and creating a sound timeline is easier and more likely to be accurate when it is based on actual experience.
Also, because of the nature of the potential run-time issues associated with the new compiler, it is impossible to know before starting the project how many such issues will exist. For this reason, the Pilot Project should contain enough programs to be representative of your application(s), but not so many as to bog the project down in bulk.
Although no hard and fast rules have been identified thus far, it would seem that a pilot of 10% of an application’s COBOL programs would be reasonable. Remember to consider characteristics such as representative complexity, criticality of the functions performed, interfaces with other programs not yet upgraded or written in other languages when identifying the pilot programs.
Main Migration Project
This effort should be based on the Pilot Project model, with any changes needed to reflect lessons learned and other findings.
Unlike new development and even normal application maintenance, a migration project allows you to leverage the fact that you have a working application as a reference, so you always know what the results should be. Your current application is a ‘baseline’ against which the migrated program results can be compared using automated tools and processes. This saves significant time, eliminates any judgment calls as to whether a modified program is correct and significantly reduces the chance of errors going undetected.
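A before-and-after comparison of that kind can be sketched with ordinary file tooling. The Python below is an illustrative sketch, not a production comparator: it assumes the v4.2 baseline outputs and the v6.3 outputs have already been downloaded into two directories with matching file names.

```python
import filecmp
from pathlib import Path

def compare_baseline(v42_dir: str, v63_dir: str) -> list[str]:
    """Compare every output file produced by the v4.2 baseline run against
    its v6.3 counterpart; return the names of files that are missing or
    differ. Directory names are illustrative; a real process would pull
    datasets down from the two test environments first."""
    mismatches = []
    for baseline in sorted(Path(v42_dir).iterdir()):
        candidate = Path(v63_dir) / baseline.name
        # shallow=False forces a byte-for-byte comparison, not just metadata.
        if not candidate.exists() or not filecmp.cmp(baseline, candidate, shallow=False):
            mismatches.append(baseline.name)
    return mismatches
```

An empty result means the migrated programs reproduced the baseline exactly; any listed file points the team directly at a run-time difference to investigate.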
The importance of testing migrated programs cannot be overstated, which brings us to our next topic.
The role of testing and the need for automation
What role will automation play?
If you go down the path of leveraging your source code library manager and then submitting compilations one COBOL Source File member at a time, it’s clearly going to take some time. Migrations such as these demand some level of automation. As shared earlier, the optimal approach is to mass compile. To accomplish this at scale and within a reasonable length of time, you must be able to perform Mass Compiles against the correct execution environment. Mass Compilation will also require the listings to be saved while the object code is discarded. After all, you do not want to impact production level libraries at this time.
It is possible to develop programs that will “read the listing”. This will allow you to capture the various messages by severity, carefully associating each COBOL Source File member with the messages from its compile, and then separately slicing and dicing the messages to better understand your migration exposure. The actual listings can be archived using z/OS platform services and recalled by your COBOL Knowledge Workers should the need arise.
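A “read the listing” utility can be sketched in a few lines. The Python below is an illustrative sketch: the IGYxxnnnn-s message format, with the severity letter after the hyphen, is IBM’s documented convention, but the surrounding listing layout varies by shop and options, so a real tool would need tuning against your own listings.

```python
import csv
import re

# Enterprise COBOL diagnostics appear in the listing as, for example:
#   IGYPS2121-S "WS-TOTAL" was not defined as a data-name.
# The severity letter after the hyphen is one of I, W, E, S, U.
MSG_PATTERN = re.compile(r"\b(IGY[A-Z]{2}\d{4})-([IWESU])\b")

def extract_messages(listing_text: str, member: str) -> list[dict]:
    """Scan one compiler listing and return rows suitable for a spreadsheet:
    source member, message ID and severity."""
    rows = []
    for line in listing_text.splitlines():
        for msg_id, severity in MSG_PATTERN.findall(line):
            rows.append({"member": member, "message": msg_id, "severity": severity})
    return rows

def write_report(rows: list[dict], path: str) -> None:
    """Dump extracted messages to a CSV that can be sorted, queried and counted."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["member", "message", "severity"])
        writer.writeheader()
        writer.writerows(rows)
```

Run over thousands of saved listings, output like this gives the assessment team a sortable inventory of exposure by member and by severity.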
What role will testing play?
During your Assessment Phase, there will be no need to develop a separate test environment. The authors would submit that it is possible to leverage programmatic analysis and capture several migration challenges, even “parameter/argument size mismatch”. However, there is one significant v4.2 to v6.3 issue that can only be found through testing. That is the presence of “invalid data in numeric USAGE DISPLAY data items” within your business logic.
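To make the invalid-data issue concrete: a numeric USAGE DISPLAY item is zoned decimal, one EBCDIC byte per digit, with the sign carried in the zone of the last byte. The Python below is a simplified, hypothetical model of the kind of class test COBOL’s IF FIELD IS NUMERIC performs on an unsigned-or-trailing-sign EBCDIC zoned field; it is a teaching sketch, not IBM’s actual validation logic.

```python
def is_valid_zoned(field: bytes) -> bool:
    """Rough model of a zoned-decimal validity check (EBCDIC).

    Digits 0-9 are X'F0'-X'F9'; the final byte's zone may instead carry a
    sign (X'C' positive, X'D' negative, X'F' unsigned). Anything else,
    such as EBCDIC spaces (X'40'), is invalid data."""
    if not field:
        return False
    for i, byte in enumerate(field):
        digit = byte & 0x0F
        zone = byte >> 4
        if digit > 9:
            return False
        if i == len(field) - 1:
            if zone not in (0x0C, 0x0D, 0x0F):
                return False
        elif zone != 0x0F:
            return False
    return True
```

Fields that fail a check like this, often blanks or low-values left behind by file feeds or earlier programs, may have been tolerated by v4.2-generated code but can produce different results under v6 code generation, which is why only run-time testing surfaces them.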
During your Pilot to Production phase, you will upgrade an agreed-upon number of COBOL v4.2 programs based on the findings of the prior Assessment phase. You will then test those programs for functional equivalence against their v4.2 baseline. During this phase, your targeted programs will be subjected to:
- v4.2 to v6.3 clean compilation
- Side-by-side program testing across the required execution environments
- Error identification, remediation, retest, verification and sign-off
- Final v6.3 move to production with on-call production support.
As part of program selection, it is important to identify programs that form a logical unit of work that can be tested in a manner that matches the normal flow of the business process. Doing so will guarantee a sound v4.2 to v6.3 test-comparison process. Now is a good time to identify those Corporate COBOL projects planned for the next calendar year. The COBOL source programs targeted for those projects could easily seed your Pilot to Production phase. In addition, internal teams can be asked to budget additional time to support this additional migration and test activity. By leveraging these planned projects, you will limit the impact of a 4.2 to 6.3 upgrade on senior knowledge workers at your location as well as reduce cost and improve your risk posture accordingly.
What should your “test environment” look like?
During your Pilot to Production phase, testing matters. In support of this phase, you must stand up two distinct test environments. The odds are high you already have the first test environment in place today: an existing, planned-for COBOL v4.2 Batch and Online test environment with data, job streams and the ability to capture and review the test output files.
You must also stand up a second test environment, and the odds are high this one currently does not exist. That test environment will include the following: a new COBOL v6.3 Batch and Online test environment using the same data and job streams as your v4.2 environment. What differs within this second environment is that it points to the set of libraries that includes the v6.3 executables, along with to-be-developed automation that will compare test runs to the fullest extent possible.
In addition, you must provide the ability to “replay your test job streams”. That implies you must have the ability to reload the input data files, cleanly recreate the associated output files and execute the same stream again, pending successful comparison between the v4.2 and v6.3 executables.
In closing, IBM’s Enterprise COBOL version 6.3 is a giant technological step forward in delivering performance advances, interoperability with other languages and platforms, and ensuring the relevance of the COBOL language for decades to come. If you have any of the more than 240 billion lines of COBOL that currently run the global economy, then migrating to v6.3 should be given serious consideration and priority.
It is not as simple as installing service against the z/OS VSAM component or applying patches to your database. Testing with data that mimics live production must be done and the results thoroughly verified. But this is what your IT team does every day, so the challenges and technologies are known and the learning curve is low.
The scope of your project should be based on the specific requirements and goals of your IT organization – there is no single “right” way to define the scope of the effort. Across the board, application by application, hot spots only – these are all valid. Which you choose should be based on the best fit for your enterprise.
- A pilot project is appropriate in all but the smallest of in-scope portfolios.
- Testing, with automation in a “before and after” model, is critical.
And finally, keep in mind the future benefits of interoperability and the knowledge that your critical applications are running on not merely a supported version of the COBOL compiler but one that makes your apps ready for the future.
Future articles centered on actual client migration experiences are forthcoming, please stay tuned!