Your mainframe does the heavy lifting for your business, supporting essential computing transactions every day. However, mainframe data does not integrate easily with the cloud platforms that drive data-driven, real-time, analytics-focused business processes. Integrating data from this critical technology often results in high costs and downtime. So, what can you do?
View this on-demand webinar to learn how Precisely Connect can harness the power of Apache Kafka to eliminate data silos and make cloud-based, event-driven data architectures a reality. Start your cloud transformation journey today, knowing you don’t need to leave essential transaction data behind!
During this webinar, you will learn more about:
• Where to begin your cloud transformation journey using mainframe data and Apache Kafka
• What you need to move mainframe data to the cloud while reducing costs, modernizing architectures, and using the staff you have today
• How Precisely Connect customers are using change data capture and Apache Kafka to deliver real-time insights to the cloud
2. Agenda
• Beginning your cloud transformation journey
• Connecting the mainframe to the cloud
• Ingredients for success: mainframe to the cloud
• Stark Denmark, a story of transformation with mainframe, Apache Kafka, and the cloud
5. Connecting mainframe to the cloud
• Bring rich transaction data to the cloud
• Improve cloud analytics and insights
• Speed delivery of information
• Scale with next-generation initiatives
7. Ingredient 1: Log-based data capture
Did you know?
• Connect CDC can leverage published Log or Journal standards to identify and capture the change before copying to the Share Queue.
• The Connect CDC Queue ensures that data integrity is maintained and zero data loss occurs in the event of a dropped connection during file transmission.
[Diagram: Source DBMS → Log/Journal → Change selector → Queue → Retrieve/Transform/Send → Apply → Target DBMS, with steps 1–4 annotated as follows]
1. Use of transaction logs or triggers eliminates the need for invasive actions on the DBMS
2. Selective extracts from the logs and a defined queue space ensure data integrity
3. Transformation can in many cases be done off box to reduce impact to production
4. The apply process returns an acknowledgement to the queue to complete a pseudo two-phase commit (see the sketch below)
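To make step 4 concrete, here is a minimal sketch of the apply-and-acknowledge pattern using the Confluent Kafka Python client, with manual offset commits standing in for the acknowledgement back to the queue. The broker address, topic name, and apply_to_target() helper are hypothetical stand-ins for illustration, not Connect CDC’s actual components.

```python
# Sketch only: apply each change event, then acknowledge (commit the offset).
# If the process dies mid-apply, the uncommitted event is redelivered, not lost.
from confluent_kafka import Consumer, KafkaException

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed broker address
    "group.id": "cdc-apply",                 # hypothetical consumer group
    "enable.auto.commit": False,             # acknowledge only after a successful apply
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["cdc.orders"])           # hypothetical change-event topic

def apply_to_target(event: bytes) -> None:
    """Placeholder for writing the change to the target DBMS."""
    print(f"applied: {event!r}")

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            raise KafkaException(msg.error())
        apply_to_target(msg.value())
        # Committing the offset only after the apply succeeds is the
        # acknowledgement back to the queue that completes the pseudo
        # two-phase commit described in step 4.
        consumer.commit(msg)
finally:
    consumer.close()
```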
8. Ingredient 2: Real-time data pipelines
• Stream real-time application data from relational databases, such as Db2 for IBM i, to mission-critical business applications and analytics platforms (see the sketch after this list)
• Other source system examples: mainframes, EDWs
• Business application examples: fraud detection, hotel reservations, mobile banking
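As a rough illustration of what such a pipeline looks like on the Kafka side, here is a minimal producer sketch in Python that publishes a row-change event keyed by primary key. The topic, broker address, and event payload are assumptions for illustration; in practice Connect captures and publishes change events for you.

```python
# Illustrative only: publish a change event keyed by primary key, so all
# changes for one row land in the same partition and stay in commit order.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker

def delivery_report(err, msg):
    """Log whether the broker accepted the event."""
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")

# Hypothetical change event from a relational source such as Db2 for IBM i.
event = {"op": "UPDATE", "table": "ACCOUNTS", "key": "42", "balance": 1875.10}

producer.produce(
    "cdc.accounts",              # hypothetical topic name
    key=event["key"],
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()                 # block until the broker acknowledges delivery
```

A downstream fraud-detection or mobile-banking application would consume this topic to react to account activity as it happens.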
9. Ingredient 3: Flexible replication options
• One Way
• Two Way
• Cascade
• Bi-Directional
• Distribute
• Consolidate
10. Ingredient 4: Precisely Connect
[Diagram: sources flowing through Precisely Connect to targets; Connect CDC is cloud platform enabled for Ingest and Stream]
Mainframe sources: IMS, Db2 for z/OS, VSAM
Other sources: Db2 (IBM i, z, LUW), Sybase, Oracle, Informix, PostgreSQL, MySQL, MS SQL Server, Flat Files (delimited)
Targets: RDBMS, EDWs, Data Streams, Strategic Projects (real-time analytics, AI and machine learning)
11. Scale to meet the needs of your business
Self-service data integration through a browser-based interface
• Design, deploy, and monitor real-time change replication from a variety of traditional systems (mainframe, IMS, RDBMS, EDW) to next-generation distributed streaming platforms like Apache Kafka
• Enable the construction of real-time data pipelines
Resilient data delivery
• Fault tolerant: resilient to network, source, target, and application server outages
• Protects against loss of data if a connection is temporarily lost
• Keeps track of exactly where data transfer left off and automatically restarts at that exact point, with no manual intervention
• No missing or duplicate data!
Maintain metadata integrity
• Integrates with the Kafka schema registry (see the sketch below)
• On-demand, metadata-driven pulls of data from a variety of database systems to next-generation data stores like the cloud and cluster
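To show what schema registry integration looks like from a consumer’s point of view, here is a minimal sketch using Confluent’s Python client with Avro deserialization (requires the confluent-kafka[avro] package). The registry URL, topic name, and consumer group are assumptions for illustration; the schemas themselves would come from whatever the replication tool registers.

```python
# Sketch only: each Avro-encoded record carries a schema ID, and the
# deserializer fetches the writer's schema from the registry to decode it.
from confluent_kafka import DeserializingConsumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer

registry = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed URL
avro_deserializer = AvroDeserializer(registry)  # writer schema fetched by ID

consumer = DeserializingConsumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "schema-demo",               # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "value.deserializer": avro_deserializer,
})
consumer.subscribe(["cdc.accounts"])         # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        # msg.value() is already a decoded dict: the deserializer consulted
        # the registry for the writer's schema, preserving metadata integrity
        # from producer to consumer.
        print(msg.value())
finally:
    consumer.close()
```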
12. Customer Story
• Connect leverages transactional CICS events (committed changes) within the mainframe, replicating the changes to Latitude’s Confluent-based Kafka event bus
• Connect performs one-way, resilient replication to Confluent Kafka, maintaining transactional integrity and ensuring proper data delivery
• Simplified data transformation, COBOL copybook mapping, REDEFINE handling, and more, so that data is intelligible in Kafka
• Improved customer engagement and clearer insight into client behavior
About
Latitude is an Australian financial services company headquartered in Melbourne, Victoria. Its core business is consumer finance through a variety of services, including unsecured personal loans, credit cards, car loans, personal insurance, and interest-free retail finance. It is the biggest non-bank lender of consumer credit in Australia.
Problem
Latitude sought to modernize its transaction monitoring capabilities in order to improve client engagement. This goal required gaining real-time insight into client behavior, so that it could provide alerting, notifications, offers, and reminders as events happened. Unlocking this data from mainframe VSAM was a critical step to achieving success.
Solution
Precisely Connect
Confluent Kafka