1. @yourtwitterhandle | developer.confluent.io
What are the best practices to debug client applications
(producers/consumers in general but also Kafka Streams
applications)?
Starting soon…
3. Tech Talk Q3 - Qlik
Unleashing the Potential of Qlik and Confluent for
Real-time Data Integration
Confluent Cloud Free Trial
New signups receive $400
to spend during their first 30 days.
4. Our Partner Technical Sales Enablement offering
Scheduled sessions
Join us for these live sessions where our experts will guide you
through sessions of different levels and will be available to answer
your questions. Some examples of sessions are below:
● Confluent 101: for new starters
● Workshops
● Path to production series
On-demand
Learn the basics with a guided experience, at your own pace, with our
on-demand learning paths. You will also find an ever-growing repository
of more advanced presentations to dig deeper. Some examples are below:
● Confluent 101
● Confluent Use Cases
● Positioning Confluent Value
● Confluent Cloud Networking
● … and many more
Ask the Expert / Workshops
For selected partners, we’ll offer additional support, including:
● Technical Sales workshops
● JIT coaching on spotlight opportunities
● Building a CoE inside partners by getting people with similar
interests together
● Solution discovery
● Tech Talks
● Q&A
6. Goal
Partners Tech Talks are webinars where subject matter experts from a partner talk about a
specific use case or project. The goal of Tech Talks is to provide best practices and
application insights, along with inspiration, and help you stay up to date about innovations
in the Confluent ecosystem.
Mainframes continue to power business-critical applications:
● 100% of the world’s top 10 insurers
● 72% of the top 25 retailers
● 70% of the Fortune 500
● 92% of the world’s top 100 banks
*Skillsoft Report from Oct 2019
12. But they present a number of challenges
1. High, unpredictable costs
Mainframe data is expensive to access for modern, real-time applications
via traditional methods (i.e. directly polling from an MQ). More requests
to the mainframe lead to higher costs.
2. Legacy code
Much mainframe code is written in COBOL, a now-rare programming language.
This means updating or making changes to mainframe applications is
expensive and time-consuming.
3. Complex business logic
Many business-critical mainframe apps have been written with complex
business logic developed over decades. Making changes to these apps is
complicated and risky.
[Diagram: mainframe feeding an on-prem ETL app via batch jobs & APIs, then on to cloud applications, a cloud data warehouse, and a database]
13. Get the most from your mainframes with Confluent
● Bring real-time access to mainframes: capture and continuously stream
mainframe data in real time to power new applications with minimal latency.
● Accelerate application development times: equip your developers to build
state-of-the-art, cloud-native applications with instant access to
ready-to-use mainframe data.
● Increase the ROI of your IBM zSystem: redirect requests away from
mainframes and achieve a significant reduction in MIPS and CHINIT
consumption costs.
● Future-proof your architecture: pave an incremental, risk-free path
towards mainframe migration, and avoid disrupting existing
mission-critical applications.
14. Bring real-time access to mainframes
Capture and continuously stream mainframe data in real time. Break down
data silos and enable the use of mainframe data for real-time
applications, without disruption to existing workloads.
[Diagram: mainframe, on-premises database, and cloud data warehouse feeding a fraud prevention engine, in-session web or app personalization, real-time analytics, customer service enablement, and inventory management]
16. Mainframe “Crash” Course
zIIP
zIIP processors always function at the full speed of the processor and
"do not count" in software pricing calculations for eligible workloads
(specifically Java). MQ/CDC workloads are zIIP eligible: move qualified
workloads via the Confluent MQ Connector run locally in zIIP space.
17. Unlocking Mainframe Data via MQ
[Diagram: z/OS with CICS, IMS, VSAM, and legacy apps feeding the MQ Connector running on zIIP]
● Publish to Confluent to improve data reliability, accessibility, and
access to cloud services
● No changes to the existing mainframe applications
● Greatly reduce the MQ-related Channel Initiator (CHINIT) cost of moving
data between the mainframe and cloud
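As a sketch, pointing Confluent at a mainframe queue with the IBM MQ Source connector might look like the following Kafka Connect configuration. All host names, queue names, channel names, and topic names here are illustrative placeholders, not values from this deck:

```json
{
  "name": "mainframe-mq-source",
  "config": {
    "connector.class": "io.confluent.connect.ibm.mq.IbmMQSourceConnector",
    "tasks.max": "1",
    "kafka.topic": "mainframe.orders",
    "mq.hostname": "zos.example.internal",
    "mq.port": "1414",
    "mq.queue.manager": "QM1",
    "mq.channel": "DEV.APP.SVRCONN",
    "jms.destination.name": "ORDERS.QUEUE",
    "jms.destination.type": "queue",
    "confluent.topic.bootstrap.servers": "localhost:9092"
  }
}
```

Submitted to the Connect REST API, a configuration along these lines drains messages from the MQ queue into a Kafka topic without touching the mainframe applications that produce them.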
18. IBM MQ Source / Sink on z/OS Premium Connectors
Allow customers to cost-effectively, quickly & reliably move data between
mainframes & Confluent:
● Reduce compute and networking requirements that can add costs and
complexity, so that customers can cost-effectively run their Connect
workloads on z/OS
● Reduce data infrastructure TCO by significantly bringing down compute
(MIPS) and networking costs on mainframes
● Enhance data accessibility, portability, and interoperability by
integrating mainframes with Confluent and unlocking its use for other
apps & data systems
● Improve speed, latency, and concurrency by moving from network transfer
to in-memory transfer
19. Unlocking Mainframe Data via DB2 & CDC
[Diagram: z/OS with CICS, IMS, VSAM, and legacy apps feeding a CDC Connector running on zIIP]
● Publish to Confluent to improve data reliability, accessibility, and
access to cloud services
● No changes to the existing mainframe applications
● Many different CDC tools: IBM IIDR, Oracle GoldenGate, Informatica,
Qlik, tcVision, etc.
22. Original Implementation Scope for current SAP ERP
[Diagram: SAP ERP surrounded by its systems of record: CRM, PLM, SCM, MES, SRM, MDM]
Message-oriented middleware: event-driven data movement with ephemeral
message persistence
23. Role of SAP ERP in the Digital Enterprise
[Diagram: systems of record (CRM, PLM, SCM, MES, SRM, MDM), connected via message-oriented middleware (event-driven data movement with ephemeral message persistence), extended by systems of differentiation and systems of innovation: digital products, IIoT, connected smart products, direct-2-consumer, customer 360, operational intelligence, ML/AI, omnichannel]
24. Digitalization in an existing IT Landscape
Systems of Record } running the business
25. Digitalization in an existing IT Landscape
Systems of Differentiation on top of Systems of Record } running the business
26. Digitalization in an existing IT Landscape
Systems of Record } running the business
Systems of Differentiation and Systems of Innovation } influencing the business
27. Data Sharing Challenges for Digitalization with Bimodal IT
Mode 1 (reliability): Systems of Record
Mode 2 (agility): Systems of Differentiation and Systems of Innovation
28. Data Sharing Challenges for Digitalization with Bimodal IT
Mode 1 (reliability): Systems of Record
Mode 2 (agility): Systems of Differentiation and Systems of Innovation
Findability, Accessibility, Interoperability, Reusability
30. Data in Motion Integration Approach
● Turning the database inside out: data in motion, data replication,
materialized views
● Sociotechnical: data as a product, data ownership & responsibility
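As a toy illustration of the materialized-view idea above, state can be rebuilt by replaying a change stream. The event envelope below (op, key, after fields) is a deliberate simplification for this sketch, not the format of any specific CDC tool named in this deck:

```python
# Minimal sketch: maintain an in-memory materialized view by replaying
# a stream of CDC-style change events ("turning the database inside out").

def apply_change(view: dict, event: dict) -> dict:
    """Apply a single change event to the materialized view."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        view[key] = event["after"]     # upsert the latest row image
    elif op == "delete":
        view.pop(key, None)            # tombstone removes the row
    return view

# Hypothetical change stream for a customer-balance table.
changes = [
    {"op": "insert", "key": 1, "after": {"customer": "ACME", "balance": 100}},
    {"op": "update", "key": 1, "after": {"customer": "ACME", "balance": 250}},
    {"op": "insert", "key": 2, "after": {"customer": "Globex", "balance": 50}},
    {"op": "delete", "key": 2, "after": None},
]

view: dict = {}
for event in changes:
    apply_change(view, event)

print(view)  # {1: {'customer': 'ACME', 'balance': 250}}
```

Replaying the same stream always reproduces the same view, which is what lets downstream consumers own their own read-optimized copy of the data.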
32. But getting to a cloud data warehouse can be a complex, multi-year process
1. Batch ETL/ELT: batch-based pipelines use batch ingestion, batch
processing, and batch delivery, which result in low-fidelity,
inconsistent, and stale data.
2. Centralized data teams: bottlenecks from a centralized, domain-agnostic
data team hinder self-service data access and innovation.
3. Immature governance & observability: a patchwork of point-to-point
pipelines has high overhead and lacks observability, data lineage, and
data and schema error management.
4. Infra-heavy data processing: traditional pipelines require intensive,
unpredictable computing and storage with high data volumes and an
increasing variety of workloads.
5. Monolithic design: rigid “black box” pipelines are difficult to change
or port across environments, increasing pipeline sprawl and technical debt.
[Diagram: on-prem SAP and CRM feeding a legacy data warehouse and a DB/data lake via batch jobs, APIs, and ETL/ELT, then on to Google BigQuery in the cloud]
33. Unleash real-time, analytics-ready data in BigQuery with Confluent
streaming data pipelines
1. Connect: break down data silos and stream hybrid, multicloud data from
any source to Google BigQuery using 120+ pre-built connectors.
2. Process: stream process data in flight with ksqlDB and use our fully
managed service to lower your cloud data warehouse costs and overall data
pipeline total cost of ownership.
3. Govern: Stream Governance ensures compliance and data quality for
BigQuery, allowing teams to focus on building real-time analytics.
[Diagram: on-prem SAP and data warehouse plus cloud SaaS app and data lake, connected through real-time connections & streams into Google BigQuery — Connect with 120+ pre-built connectors, Process with ksqlDB to join, enrich, aggregate, Govern to reduce risk and ensure data quality]
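The "process" step might look like the following ksqlDB sketch, which declares a stream over a Kafka topic and continuously aggregates it before the result is sunk to BigQuery. The topic, stream, and column names are illustrative placeholders, not part of any real pipeline shown here:

```sql
-- Declare a stream over an existing Kafka topic (hypothetical schema).
CREATE STREAM orders (
  order_id VARCHAR KEY,
  customer_id VARCHAR,
  amount DOUBLE,
  region VARCHAR
) WITH (KAFKA_TOPIC = 'sap.orders', VALUE_FORMAT = 'JSON');

-- Continuously aggregate revenue per region; the table's backing topic
-- can then be delivered to BigQuery with the managed sink connector.
CREATE TABLE revenue_by_region AS
  SELECT region, SUM(amount) AS total_revenue
  FROM orders
  GROUP BY region
  EMIT CHANGES;
```

Because the aggregation runs continuously in flight, only the compact, analytics-ready table reaches the warehouse instead of every raw event.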
34. 4 Use Cases for Data in Motion with SAP® ERP
1. SAP® data ingest for continuous intelligence
2. Fuel digital channels with SAP® master data
3. SAP® participating in business workflows through event collaboration
4. Tracing production with IIoT data to SAP®-managed customer orders
35. Modern, hybrid data streaming powers business-critical continuous intelligence
[Diagram: sources (legacy data stores such as SAP ERP, Netezza, Teradata, Oracle, and mainframes; databases; sensor & behavioral data streams) flow through Kafka Connect & connectors into an event streaming platform built on Kafka, on premises or any cloud; Kafka Streams & ksqlDB provide real-time stream processing and transformations; Kafka Connect & connectors deliver to sinks such as data science and BI workspaces]
36. There is no silver bullet for SAP integration
“The following will explore different integration options between Kafka
and SAP and their trade-offs. The main focus is on SAP ERP (old ECC and
new S4/Hana), but the overview is more generic, including integration
capabilities with other components and products.”
— Kai Waehner
See Kai Waehner’s blog for details.
Copyright 2020, Confluent, Inc. All rights reserved. This document may not be reproduced in any manner without the express written permission of Confluent, Inc.
37. Partnering with the ecosystem to deliver results faster
[Diagram: cloud, system integrator, and technology partners surrounding Confluent Professional Services]
…most of our partners have an SAP practice
38. Qlik and Confluent: Automated Real-Time Data Delivery
[Diagram: sources (applications, mainframe, database, cloud storage) feed Qlik Replicate, which performs full load and log-based CDC with in-memory target schema creation, batch-to-CDC transition, heterogeneous data type mapping, filtering, DDL change propagation, and transformations, publishing into Confluent Cloud (managed Kafka) or Confluent Platform; ksqlDB, machine learning, Schema Registry, Confluent REST Proxy, Confluent Control Center, and Kafka Connect sit alongside, with BI and visualization apps as consumers]
40. Confluent EMEA
Sales Best Practice
Robert Zenkert
Principal Solution Architect, CoE EMEA
Christoph Möhrlein
Senior Evangelist, Qlik Data Integration
July 2023
41. Who We Are
● 38,000+ customers in 100+ countries; 2,000+ employees
● Market momentum: double-digit growth, double-digit EBITA, ~$800m revenue
● Global ecosystem of 1,700 partners: Accenture, Deloitte, Cognizant,
Microsoft, AWS, Google, Databricks, Snowflake, Confluent
● Industry leader: Gartner Magic Quadrant for 12 years in a row
42. Modern Analytics Data Pipeline
Our approach: real-time, up-to-date, trusted information transformed into
informed action.
● Ingest & store: real-time updates from multiple systems (RDBMS, SaaS
apps, files, mainframe, SAP) into the data warehouse
● Organize & synthesize via catalog
● Insights & outputs: descriptive, prescriptive & predictive analytics;
collaboration; alerts & automated actions
● Informed action: embed into processes and applications
Free it. Find it. Understand it. Action it.
[Layers: data integration, data management, analytics, AI/ML, data literacy]
43. Qlik Cloud
Qlik’s Platform for Active Intelligence
● Data Services: hybrid data delivery, application automation, data
transformation, data warehouse automation
● Analytics Services: augmented analytics, visualization & dashboards,
embedded analytics, alerting & action
● Foundational Services: catalog & lineage, artificial intelligence,
associative engine, orchestration, governance & security, collaboration,
developer & API
● Universal connectivity: hybrid cloud (data warehouse, data lake,
stream), SaaS (RDBMS, apps, mainframe, files), on-premises
45. Flexible Data Integration (DI) Deployment Options
Free it. Find it. Understand it. Action it.
● Qlik Data Integration — generate: CDC streaming, data warehouse
automation, data lake creation; prepare: deliver, refine & merge
● Qlik Catalog — find it: shop, publish
● Qlik Data Analytics — understand it, action it: conversational
analytics, mobile analytics, interactive dashboards, self-service
analytics, reporting & alerting, embedded analytics, other BI tools,
advanced analytics & data science
46. Qlik and Confluent: Automated Real-Time Data Delivery
[Diagram: as on slide 38 — sources (applications, mainframe, database, cloud storage) feed Qlik Replicate (full load and log-based CDC, with in-memory target schema creation, batch-to-CDC transition, heterogeneous data type mapping, filtering, DDL change propagation, and transformations) into Confluent Cloud (managed Kafka) or self-managed Kafka, alongside ksqlDB, machine learning, Schema Registry, Confluent REST Proxy, Confluent Control Center, and Kafka Connect, with BI and visualization apps as consumers]
47. Example: CDC from DB via Kafka to Qlik
[Diagram: Qlik Replicate performs full load and CDC from a source database into Confluent Cloud (managed Kafka) or Confluent Platform, where ksqlDB sorts, filters, and transforms the streams, feeding machine learning]
51. Deliver Any Source To Kafka
● One solution for all sources
● Easy-to-use GUI
● Log-based change data capture
● Supports on-premises and cloud
● Transform data in flight
● Filter at both table and column level
● Proven solution that supports production loads
[Source logos: Amazon RDS, Azure SQL Managed Instance, Amazon Aurora, Google Cloud SQL, DB2 for zOS, DB2 for iSeries, DB2 LUW]
52. What does your SAP data do for you?
Your business runs on SAP:
● Processes orders and revenue
● Controls and moves inventory
● Pays vendors and employees
● Maintains company financials / GL
SAP data is the lifeblood of your business.
53. Your SAP data could do MORE, but…
● Difficult to comprehend: proprietary data formats.
● Hard to understand: thousands of tables with intricate relationships.
● Limited access: complex licensing that can be time-consuming and costly.
● Not designed for analytics: built for transactional performance, not
real-time data interactions.
Your data has a story to tell.
54. Make SAP data accessible and understandable
Qlik Data Integration delivers real-time data to the cloud. Over 200
enterprises use Qlik for SAP data movement:
● Real-time data replication
- Simplified mapping of the complex SAP data model
- Decode the proprietary source structures
- All core and industry-specific SAP modules
- Integrate real-time with all major targets
● Automate the data warehouse and data lake lifecycle
- Easily deliver SAP data to data lakes, cloud, et al.
● Move external data into SAP HANA
[Diagram: SAP feeding an SAP HANA database]
55. Qlik Data Integration
SAP and Near Real-time Data Warehouse Automation
● Easy to find and free the right SAP data
● Automated data warehouse build process
● Model-driven logical SAP data warehouse
● Wizard-driven star schema / data mart creation
56. Real-time Integration Architecture for SAP
Capture methods:
● Log based *
● Standard / custom extractors
● Trigger based * (only HANA)
Handles INSERTs, UPDATEs, DELETEs, and DDLs, with transformation,
filtering, metadata transformations, and runtime parameters via
in-memory processes (Replicate task, replication server, Enterprise
Manager).
SAP sources include: SAP ODP API (DSO/ADSO, cube, MultiProvider), SAP
HANA information views (attribute, analytic, and calculation views), SAP
BW data sources (extractors), SAP HANA CDS views, SAP HANA tables
through SAP SLT, and SAP BW objects.
● Rapid installation
● Rapid implementation
● High level of automation
● High reliability
● Central monitoring
58. Joint Success Stories
Integrate and modernize the most valuable and complex enterprise data
VISION
● Modernize data architecture to improve customer engagement and enable
faster, more agile development.
OBJECTIVES
● Create a data architecture that integrates both core and new
applications.
● Provide up-to-date, high-quality data to the right people, at the
right time, via multiple channels in real time.
● Improve effectiveness of IT service delivery with adoption of an agile
development methodology.
SOLUTION
● Qlik Replicate and Confluent with Microsoft
OBJECTIVES
● Medifast produces, distributes, and sells weight loss and
health-related products through websites and multi-level marketing.
● To fuel explosive revenue growth (40%) in the wellness industry during
the COVID-19 pandemic, Medifast decided to embrace the agility of the
cloud.
● Our modern and automated cloud solutions and partnership with AWS and
Confluent played a key role in the selection.
● The solution deployed was Qlik Replicate into Confluent on AWS.