Monolithic applications are defined as single-tiered software in which the user interface and data access code are combined into a single application for a single platform. Monoliths can impact your ability to create APIs, deliver capability quickly, and even perform routine application maintenance. Refactoring is the antidote to monolithic software: it can improve team agility and autonomy, and it paves the way for API creation. Learn how DevOps for the Enterprise software can help you refactor, from discovery of your z/OS assets and impact analysis to the modularization task itself, including editing, compiling, testing, and debugging.
13. How ADDI and ADFz help you in this Refactoring Process
“The GPS for your Mainframe Developers”
14. Increase Agility by Simplifying and Standardizing Work Item Estimation
• Use standardized metrics (such as Maintainability Index or Halstead Complexity) to make more realistic estimations
• Call graphs, usage reports, and other reports help plan application modernization and inform application portfolio decisions
• Dead-code reports point out possible dead code to remove
Accelerate Transformation
15. Navigate easily inside large and complex programs
• Program Flow provides a quick overview of the parts of a specific program
• Find code lines accessing specific resources or interacting with other programs
16. Navigate easily inside large and complex programs
• Flow Charts show a detailed view of the inner workings of programs
17. Accelerate Refactoring
• Reduce guesswork about side effects while changing applications
• Navigate the code more effectively
• Understand unknown code faster
• Adopt a more structured approach to changes, using ADDI as guidance
Good morning, I am Paul Pilotto and I am a Solution Architect focusing on ADDI and DevOps on System z.
I have 30+ years of experience in the Enterprise Modernization space and had the opportunity to work with tons of mainframe customers from all over the world.
I will focus today on what Monolithic Applications are, what their bottlenecks are and how ADDI can help to break them down.
And I am Bill Alexander, an IBMer working in the Architecture and Development team for Application Delivery Foundation for z Systems (ADFz). I will discuss some new tooling we recently announced that will help you refactor those monoliths into modular components.
The agenda of today’s webcast is to define what we understand to be a monolithic application and how, in many cases, these applications were created.
Once we understand the complexity of a monolithic application, Bill and I will walk you through the different reasons and scenarios why refactoring becomes a necessity, and we will show you some approaches for how our tooling can help with that.
What are the different business drivers that motivate companies to refactor?
First there is business agility: the speed at which our marketing and business sides expect IT to change their solutions and respond to disruptive business needs. Microservices, the very opposite of monolithic applications, are today’s answer for responding faster to these needs, because often the core business processes don’t have to change but do have to integrate with these new digital business needs.
By accelerating the digital transformation, you can provide more flexible and reusable capabilities to meet these needs.
Another important business driver is to get new people, the millennials or Generation Y, up to speed with our core business and give them the opportunity to learn what we have faster by providing them a more understandable architecture.
It also enables you to eventually rewrite into modern languages in an incremental, component-by-component approach, reducing the risk and the cost of an eventual rewrite.
Before client-server or browser-based architectures became the trend in our day-to-day IT thinking, monolithic applications were commonly used and accepted, not only in the mainframe world but on all platforms.
The need to separate presentation logic from business logic, and data access logic from business logic, was not there. It was much easier for a developer, whether viewed as a logical unit of work or for ease of deployment, to handle all the logic that belonged to one screen in one program.
Over the years these monolithic programs were changed over and over again because of new business needs, new business products, acquisitions and mergers, business regulations, Year 2000, and so on.
Often the original developer retired, moved to another job within the company, or left the company altogether, and these monolithic applications just grew and grew in size and functionality.
Successor developers were so afraid of making mistakes or breaking the spaghetti code that they preferred to copy and paste portions of the logic, make their changes in the copied source, and leave the original code inside the program, where it became dead code.
This resulted in difficult-to-maintain masterpieces that a successor developer, whether a millennial, an outsourcer, or even the original author, was afraid to touch, because the program functionally did what it had to do but had become a monster to maintain.
Another problem came along when the tester community popped up: they had to write testing scripts for these little monsters and had to understand how the code worked in order to guarantee an acceptable code coverage percentage. And let’s not even think about automating the testing of these code icebergs; it is doable, but the test case will be a monolith as well.
There is absolutely no agility in monolithic applications: every little code change involves a compile, full testing, and packaging of the fully functioning monolith, all of which costs a lot of CPU consumption, just because the impact of the change is hard to predict.
And let’s not forget the pain they cause when new business needs pop up, for example opening some of the logic as a public API or integrating the business-critical functionality with a new mobile solution. Or even, when there is no business need, if IT wants to move forward with a hybrid DevOps solution: these monoliths are a leftover from when teams deployed infrequently, and they remain a pain if you don’t do anything about them.
When we want to move forward into the API economy, integrate mainframe development and deployment into our continuous DevOps process, or automate our testing process while guaranteeing code coverage percentages during testing, and above all when we want agility in the development process by parallelizing development, deployment, and testing, we can conclude that REFACTORING is the solution to these monolithic program problems.
There are several business drivers for refactoring; one of them is that we want to move forward into an API economy.
Use Application Discovery and Delivery Intelligence to understand the monolithic application, and Application Delivery Foundation to refactor it. That makes it possible to use z/OS Connect to build a public, or on-premise, API, making our valuable business-critical logic available to our partners or other in-house projects.
A first scenario in this API journey is to break big applications into smaller ones.
As said earlier, ADDI can easily identify code that is somewhat loosely coupled. In ADFz we provide refactoring wizards that turn this loosely coupled code into an independent callable program that is easier to unit test and to turn into a RESTful API by running it through the z/OS Connect wizards, passing it the interfaces, such as a CICS COMMAREA for example. Application Delivery Intelligence will optimize the testing effort by recommending which and how many test cases have to be executed when code is changed.
Another important business driver is the demand for creating and governing business rules. ADDI will capture business information, including business terms such as “a contract”, “a claim”, “a financial transaction”, etc.
On the current roadmap, and in the near future, ADDI will identify candidate business rules.
Again, ADFz will provide the refactoring wizards to turn those candidate business rules into manageable and executable logic through, for example, IBM’s ODM.
And last but not least, we had better refactor our monolithic applications because we want to integrate our valuable legacy applications into our hybrid DevOps process: smaller units are more agile and easier to maintain and manage, and thus also easier to run through test automation tools. Code coverage percentages can be guaranteed and followed up, and code review becomes much more agile.
This is the point in the refactoring process where ADDI and ADFz come in: they are basically the “GPS for your mainframe source code”.
AD takes you through your applications without getting lost in the complexity of monolithic applications.
IDz will help with its new refactoring features.
And Delivery Intelligence will help keep you on track when it comes to agility in testing, making testing recommendations without going into all the testing details.
However, you still need to know where you want to go! These tools are not the “self-driving car” equivalent!
In AD we offer different reporting facilities that use industry-standard metrics and algorithms. These metrics have been proven to make it easier to produce realistic estimates for a change.
We can produce:
A McCabe Complexity Report, which presents cyclomatic complexity as the number of linearly independent paths through a program module.
A Halstead Complexity Report, which can be used to compare the complexity of two programs or two applications.
A Heuristic Complexity Report, which determines the complexity of an application by assigning values to each type of statement.
And a Maintainability Index Report, a measurement report intended to track maintainability and indicate when it becomes cheaper, or less risky, to rewrite code instead of changing it.
Call graphs and all kinds of usage reports make the discovery process easier for the person who has to make refactoring decisions.
And dead-code reports identify for removal the code that has been copied and pasted over time by developers who, afraid to touch a working program with a small change, copied the original code and made their change in the copied statements instead.
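As a concrete illustration of the first of these metrics, cyclomatic complexity can be computed directly from a program's control-flow graph. The sketch below is a minimal, hypothetical Python example with an invented graph; it is not ADDI output.

```python
# McCabe's cyclomatic complexity from a control-flow graph:
# M = E - N + 2P, where E is the number of edges, N the number of
# nodes, and P the number of connected components (one per module).

def cyclomatic_complexity(edges, num_nodes, num_components=1):
    """M = E - N + 2P: the number of linearly independent paths."""
    return len(edges) - num_nodes + 2 * num_components

# Hypothetical control-flow graph of a small module with one branch
# and one loop: nodes 0..5, edges encode the transfers of control.
edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 1), (4, 5)]
print(cyclomatic_complexity(edges, num_nodes=6))  # 3
```

A straight-line program (one edge, two nodes) yields M = 1, the minimum; each extra decision or loop adds an independent path.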
Once we have decided, by running the reports, which programs are the typical monoliths we want to refactor, we can discover the programs individually by looking at them with graphs in the AD Analyze client, as demonstrated yesterday by Omer.
The Program Flow graph presents the internal structure of a program, including all referenced files, SQL tables, screens, and other programs called. It visualizes the flow of PERFORMs, GO TOs, CALLs, PL/I procedures, etc.
By analyzing the Program Flow Chart we better understand the structure of the program at the statement and decision-tree level. This gives the programmer a better understanding of the relationships within the logic of a monolithic program.
All these features help accelerate the refactoring process because they reduce the guesswork about the side effects that might result from a change.
AD has lots of features for navigating through a monolithic program more effectively, giving the analyst a better and faster understanding of the unknown.
AD offers a more structured approach to preparing the change thanks to all of its visualization features.
Data captured at runtime, data captured during testing, and static analysis data are combined to produce an application composition map. This diagram is annotated with visual indicators highlighting programs that violate the code complexity, maintainability, and code coverage thresholds previously defined by your organization. This allows you to rapidly assess and plan changes to the problematic programs and to assess the high-level impact of making those changes. This graphical analysis lets you size the changes and the overall maintenance effort, thereby reducing the risk in your DevOps process.
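The threshold idea behind that composition map can be sketched in a few lines. The program names, metric values, and thresholds below are invented for illustration; they are not the ADDI/ADI data model.

```python
# Flag programs that violate organizational quality thresholds, in the
# spirit of the annotated application composition map described above.
# All names and numbers here are hypothetical.
THRESHOLDS = {"complexity_max": 50, "maintainability_min": 20, "coverage_min": 70.0}

def violations(program):
    """Return the list of thresholds this program violates."""
    v = []
    if program["complexity"] > THRESHOLDS["complexity_max"]:
        v.append("complexity")
    if program["maintainability"] < THRESHOLDS["maintainability_min"]:
        v.append("maintainability")
    if program["coverage"] < THRESHOLDS["coverage_min"]:
        v.append("coverage")
    return v

programs = [
    {"name": "PAYROLL1", "complexity": 120, "maintainability": 12, "coverage": 35.0},
    {"name": "ACCT0042", "complexity": 18, "maintainability": 74, "coverage": 88.0},
]
for p in programs:
    print(p["name"], violations(p))  # PAYROLL1 violates all three; ACCT0042 none
```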
On Sept 8, 2017 besides the release of ADDI v5.0.3, we also released v14.1 of IBM Developer for z Systems.
Besides the Refactoring tools that I will discuss in more detail on the next slide, this new release of IDz also has:
- Additional integration features when Application Discovery is also installed, including in-context analysis for CICS transactions, database tables, and MVS datasets
- Support for open DevOps toolchain products such as SonarQube, through on-the-fly analysis of COBOL or PL/I source code when the SonarLint plugins are added to IDz
- Day 1 support for the new COBOL and PL/I compilers
- And many Requests For Enhancement submitted by our user community
If you would like to learn more about all the features in this new release I would recommend reading the blog located at the URL you see at the bottom.
After gaining a complete understanding of the program logic by using the impact analysis reports and visualization tools in Application Discovery, a developer can use the new refactoring capabilities for COBOL source included in IBM Developer for z Systems version 14.1. Once the desired logic is selected, the available options include:
Creating a new program
Creating a new copybook
Or extracting the logic into a new paragraph
By creating a new program from the chosen existing logic, the functionality provided by this logic is rapidly made available for use in new ways. The original monolith will still be able to invoke the new program, thus preserving existing functionality, but the new program can also be exposed as a service. It does not matter whether you plan to restrict the use of this new service to internal applications only, or whether you intend to expose it as an external API. Creating individual functional services allows organizations to quickly adapt to changing business requirements.
Too often, business logic that could provide a company with a competitive advantage if it were reusable is instead locked inside a monolithic application. Attempting to duplicate this logic on another platform, or in a different language, can lead to errors and redundancy. Instead, refactoring the logic into a reusable module can create new business opportunities, as well as continue to support the existing application.
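The "create a new program from existing logic" refactoring can be sketched in a language-neutral way; the example below uses Python for brevity, with an invented interest routine standing in for the COBOL business logic that IDz actually refactors.

```python
# A sketch of extract-to-program: business logic formerly inlined in a
# monolithic routine becomes a reusable, independently callable unit.
# All names and the interest formula are hypothetical.

def compute_interest(balance, annual_rate, days):
    """Extracted, reusable business logic (formerly inlined)."""
    return round(balance * annual_rate * days / 365, 2)

def monthly_statement(balance, annual_rate):
    # The original "monolith" still invokes the extracted logic,
    # preserving its existing behavior.
    interest = compute_interest(balance, annual_rate, days=30)
    return {"balance": balance, "interest": interest}

# A new consumer (e.g. a service layer) can now reuse the same logic
# directly, without going through the monolith.
print(compute_interest(1000.0, 0.05, 30))  # 4.11
```

The key property is the one described above: the monolith keeps working unchanged, while the extracted unit gains new consumers.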
IBM z/OS Connect Enterprise Edition provides a single, common way of invoking services for modern Mobile, Web and Cloud applications.
The z/OS Connect API Editor can be used to define RESTful APIs, and map the API requests to traditional backend interfaces such as COBOL copybooks. It also allows for the assignment of static values or the removal of unwanted fields in order to simplify the API for the consumer.
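To make that mapping idea concrete, here is a toy Python sketch that drops an unwanted field, injects a static value, and lays the request out as a fixed-width, copybook-style record. Field names and widths are invented; the real mapping is defined declaratively in the z/OS Connect API Editor, not hand-coded like this.

```python
# A toy illustration of API-request mapping: remove fields the backend
# does not need, assign a static value hidden from the consumer, and
# format the result as a fixed-width record in the spirit of a COBOL
# copybook. Layout, names, and values are all hypothetical.
COPYBOOK_LAYOUT = [("ACCT-ID", 10), ("TXN-TYPE", 4), ("AMOUNT", 12)]
STATIC_VALUES = {"TXN-TYPE": "BAL"}   # assigned statically, not exposed in the API
DROPPED_FIELDS = {"client_locale"}    # unwanted in the backend interface

def map_request(api_request):
    payload = {k: v for k, v in api_request.items() if k not in DROPPED_FIELDS}
    fields = {"ACCT-ID": payload["accountId"], "AMOUNT": payload.get("amount", "0")}
    fields.update(STATIC_VALUES)
    # Lay the fields out as one fixed-width record, left-justified.
    return "".join(str(fields[name]).ljust(width) for name, width in COPYBOOK_LAYOUT)

req = {"accountId": "0012345678", "amount": "150.00", "client_locale": "en_US"}
print(repr(map_request(req)))
```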
Traditionally, monolithic programs contain many different function paths, each requiring different input data in order to produce its expected results. This causes lengthy manual testing schedules even for small changes. It also makes the creation of automated test cases difficult, if not impossible.
Refactoring program logic into individual components, each providing discrete functionality, makes it easy to create automated regression test cases, as the input and expected output become well known. This often reduces overall testing time and improves the testability and the amount of code covered during testing.
Automated test cases can uncover defects early in the software development life cycle as you make modifications to your programs. The earlier these defects can be found the less costly they are to fix.
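A regression test for such a refactored component can then be as simple as asserting known outputs for known inputs. The sketch below uses Python's unittest with a hypothetical extracted interest routine; a real test against the COBOL component would use zUnit or one of the tools mentioned below.

```python
# A minimal automated regression test for a refactored component whose
# input and expected output are well known. The routine under test is
# hypothetical, standing in for an extracted COBOL program.
import unittest

def compute_interest(balance, annual_rate, days):
    """Hypothetical extracted business routine under test."""
    return round(balance * annual_rate * days / 365, 2)

class ComputeInterestRegression(unittest.TestCase):
    def test_typical_month(self):
        self.assertEqual(compute_interest(1000.0, 0.05, 30), 4.11)

    def test_zero_balance(self):
        self.assertEqual(compute_interest(0.0, 0.05, 30), 0.0)

if __name__ == "__main__":
    # exit=False keeps the runner from terminating the process,
    # so this file can be executed inline as well as via a test runner.
    unittest.main(argv=["compute_interest_regression"], exit=False, verbosity=2)
```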
There are several DevOps tools available to assist with test case creation and automation. Some of these offerings include:
zUnit, which is a feature built directly into the developer's IDE when using IBM Developer for z Systems
Rational Test Workbench which not only provides API testing, but also service virtualization
XaTester, which is a unit-testing-focused offering from IBM Business Partner Xact Consulting
and MF-Test which is an automated testing offering from IBM Business Partner MOST Technologies
Regardless of the tools you choose to use, creating and running automated test cases becomes easier after a monolithic application has been refactored into smaller components. Test automation improves the confidence that software changes can be made without causing regressions. This confidence allows for more frequent software delivery, accelerating the time to value for application changes.
On the last slide we discussed multiple tools that can assist with defining and running test cases. Another key benefit to refactoring is the ease with which code coverage results can be obtained and examined. ADFz provides a code coverage collector which can be used in conjunction with any of the tools I mentioned previously, or even in cases where you are conducting manual tests.
The code coverage results collected during test case execution can then be combined with the cognitive capabilities of Application Delivery Intelligence to produce trend information. ADI has the intelligence to guide the tester in deciding which test cases to execute based on the lines of code that were changed. This visualization clearly identifies gaps in your test cases, which can prevent you from leaving the testing phase without properly testing the code changes that were just delivered. The guidance from ADI can also identify potentially redundant test cases, allowing the same programs to be tested with a code coverage score close or equal to that of previous test runs without the need to execute all the test cases. This knowledge can lead to a reduction in overall testing time while keeping the same high code quality that you desire.
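The test-selection guidance described above can be approximated with simple set intersection: keep only the test cases whose previously covered lines overlap the lines changed in the new delivery. The coverage map below is invented; ADI builds the real one from collected code coverage results.

```python
# A simplified sketch of change-based test selection: given which lines
# each test case covered in the last run, select only the test cases
# that touch the changed lines. Test names and line numbers are invented.

def select_tests(coverage_map, changed_lines):
    """Return test cases whose covered lines intersect the changed lines."""
    return sorted(
        test for test, lines in coverage_map.items()
        if lines & changed_lines
    )

coverage_map = {
    "TC01": {10, 11, 12, 40},
    "TC02": {10, 11, 12, 41},   # nearly redundant with TC01
    "TC03": {200, 201, 202},
}
print(select_tests(coverage_map, changed_lines={41, 202}))  # ['TC02', 'TC03']
```

The near-identical coverage of TC01 and TC02 also hints at how redundant test cases can be spotted from the same data.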
Besides the opportunity to create APIs, or to improve testing, another benefit provided by refactoring is increased agility. Separating a monolith into smaller components allows multiple developers the opportunity to work concurrently. Parallel development can occur for each module that requires changing, rather than linear development where programmers are waiting for the single monolithic program to become available for change.
Deployment flexibility can also be achieved, as oftentimes only the component program that was modified needs to be deployed. Of course, this depends on the nature of the change, as sometimes the monolithic program that invokes the component program may also need to be deployed, but this will not always be the case.
In addition, as technologies change over time, these loosely coupled components provide the flexibility for implementation changes without disruption to their consumers. For example, if an application was using a VSAM file previously and the data access logic has now been separated out into its own module, it becomes easier to replace the data store in the future if desired.
So far I have mostly been talking about creating component programs from a monolithic program. However, even simply creating copybooks for common code, or extracting logic into paragraphs to improve program flow and readability, can reduce the complexity of a program. Using these Refactor options in IDz can make the code easier for humans to read, and sometimes these techniques can also improve industry-standard measurements.
Maintainability Index is a calculated measurement based primarily on Cyclomatic Complexity (which was introduced by Thomas J. McCabe, Sr back in 1976) and Halstead Volume (which was introduced by Maurice Howard Halstead in 1977). Application Delivery Intelligence (ADI) tracks trends in various metrics, including the Maintainability Index, Delivered Bugs, Lines of Code, and others. This information allows teams to quickly understand just how complex, or easy to maintain, their application code actually is.
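For reference, one published form of the Maintainability Index (the normalized variant popularized by Visual Studio, derived from the 1992 Oman and Hagemeister index) can be computed as follows; ADI's exact formula may differ, and the input figures below are hypothetical.

```python
# Normalized Maintainability Index from Halstead Volume, cyclomatic
# complexity, and lines of code:
#   MI = max(0, (171 - 5.2*ln(V) - 0.23*CC - 16.2*ln(LOC)) * 100 / 171)
# Higher is easier to maintain; scores near 0 suggest rewriting may be
# cheaper or less risky than changing the code.
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, lines_of_code):
    raw = (171
           - 5.2 * math.log(halstead_volume)
           - 0.23 * cyclomatic_complexity
           - 16.2 * math.log(lines_of_code))
    return max(0.0, raw * 100 / 171)  # normalized to the 0..100 range

# Hypothetical figures: a small extracted component vs. a large monolith.
print(round(maintainability_index(150.0, 4, 60), 1))
print(maintainability_index(8000.0, 120, 5000))  # 0.0 for this monolith
```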
In Summary, Refactoring is the process of improving the structure of software without changing its external behavior. And refactoring monolithic software into smaller functional components can improve the extensibility of a function by making it reusable either inside or outside of the current application.
Refactoring can also assist in adopting DevOps practices like Continuous Testing by making it easier to create automated test cases. These automated tests when combined with code coverage results can provide Continuous Feedback giving developers confidence in the changes they have delivered.
Having a high level of code quality and a separation of functional components can also allow development teams to deploy more rapidly. This differs from the more traditional infrequent deployments we have seen in the past with monolithic software.
And of course, any developer who has taken over support for a pile of spaghetti code will agree that well-structured code is much easier to understand, maintain, and evolve as both functional and nonfunctional requirements change.
Whether your goal is creating APIs, improving testing, or improving maintainability, IBM tools like ADFz, ADDI, and z/OS Connect EE can assist.