Bring Your SAP and Enterprise
Data to Hadoop, Apache Kafka
and the Cloud
Lessons from Fortune 100 Companies about Data Ingestion
John Hol – Attunity
Mike Hollobon – IBT
How effective is your organization at
leveraging data and analytics to power
your business models?
Analytics have become mission-critical not just to business decisions but to long-term planning and reporting to the markets.
The Stakes Are High
Winners will:
• Respond rapidly to changing business threats and opportunities
• Arm their business teams with accurate & timely data
• Leverage data for competitive advantage
"40 percent of today's Fortune 500 companies on the S&P 500 will no longer exist in 10 years."
– Study by the John M. Olin School of Business at Washington University
"Since 2000, 52 percent of the names on the Fortune 500 list are gone, as a result of mergers, acquisitions or bankruptcies."
– Fortune Magazine, 2000 to 2015
Top CIO Priorities (Sources: Gartner CIO Surveys 2013-2015)
Technology Investment Areas        2016  2015  2014
BI/Analytics                        1     1     1
Cloud                               2     3     5
Mobile                              3     5     3
Digitization/Digital Marketing      4     6     7
Infrastructure and Data Center      5     2     2
ERP                                 6     4     4
Security                            7     7     7
Industry-Specific Applications      8    10     9
CRM                                 9     9    10
Networking/Voice-Data Comm's       10     8     6
Big Data adoption | Migration & Analytics
Trends: We Are Moving from IT to Business Technology (BT)
[Diagram: progression from MIS to IT to BT, with axis labels spanning limited data → lots of data and batch → real-time, plus automation, self-service and digital business]
© 2016 Forrester Research, Inc.
Why is Data Management so Important?
IBT – A Little About Us
Attunity's Authorised Partner for Australia
We sell, implement and support all Attunity products
• Established 1997 to provide PeopleSoft systems integration and services
• ERP, BI and data management services & solutions – cloud and on-premise
• Authorised Attunity Partner in Australia since 2014
Data Integration for Analytics and IoT
Attunity Product Suite
Data Integration & Big Data Management Software
• Replicate – Universal Data Availability: move data to any platform
• Compose – Data Warehouse Automation: automate SQL/ETL/EDW
• Visibility – Data Usage Profiling & Analytics: optimize performance and cost
On premises / cloud: Hadoop | Files | RDBMS | EDW | SAP | Mainframe
Examples – Fortune 100 Customers
Building Data Lakes for Analytics and IoT
• Global Bank: corporate data lake initiative for real-time compliance; reduce fraudulent activity and reputational damage
• Global Auto Maker: IoT data lake for new analytics; predictive maintenance
• Global Food Processing: IoT data lake integrated with SAP data; improve food production efficiencies and fleet maintenance
• Global Telco: Hortonworks data lake for real-time access to SAP & PeopleSoft data; centralized for real-time reporting, lower costs, valuable insights
• Airline Manufacturer: data lake for design analytics with real-time data; expects 3x improvement in manufacturing
Examples – Australian Customers
• International Healthcare: real-time CDC solution to its data lake on Azure and Event Hubs; excellent performance for both latency and throughput
• Iconic Motoring Org: robust solution to capture defined changes and replicate to AWS; modernise, easy to maintain, scalable and secure
• State Energy Provider: leverage an HP Vertica cluster DW for operational reporting; cope with large row counts, e.g. a single table with 4.3 billion rows
• Toll Road Operator: urgent need to integrate data into an Oracle target; improve mission-critical reporting
• Sandstone University: real-time data migration for timely analytics with no impact; modernise and take advantage of the latest technology
'Be Prepared' – build an architecture so you can:
• Analyze Everything: 100s to 1000s of data sources; business and machine data
• Analyze Anywhere: on-premises or in the cloud; in DB, DW, Hadoop, in-memory, etc.
• Analyze in Real Time: capture new/changing data; process/stream in motion
Attunity Replicate
[Architecture diagram: sources (Hadoop, Files, RDBMS, Data Warehouse, Mainframe – cloud or on-prem) flow into Replicate, which transfers them – batch or incremental CDC capture, filter and transform, over in-memory or file-channel transport – to targets (Hadoop, Files, RDBMS, Data Warehouse, Kafka – cloud or on-prem), under a management and automation layer]
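To picture the moving parts named in the diagram, here is a purely hypothetical task definition – not Attunity's actual configuration format – naming a source endpoint, a target endpoint, an initial bulk load, continuous CDC, and filter/transform rules:

```python
# Hypothetical sketch only: field names are invented for illustration,
# not Attunity Replicate's real task schema.
replication_task = {
    "source": {"type": "oracle", "host": "erp-db", "port": 1521},
    "target": {"type": "kafka", "brokers": ["broker1:9092"]},
    "full_load": True,            # initial batch transfer of existing rows
    "cdc": True,                  # then stream incremental changes
    "tables": ["SAPSR3.VBAK", "SAPSR3.VBAP"],
    "filter": "MANDT = '100'",    # row-level filter applied during capture
    "transform": {"rename": {"VBELN": "sales_doc"}},
}
print(replication_task["source"]["type"], "->", replication_task["target"]["type"])
```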
Attunity Replicate – Highlights & Differentiators
1. Easy – configurable, powered by automation (no development or scripting)
2. CDC – efficient, low latency, no downtime (a toy sketch of the apply concept follows this list)
3. Zero footprint – no agent on the source database server
4. Heterogeneous – supports many data sources & targets
5. High performance – integrates with native APIs and in-memory processing
6. Security – secured data handling and transfer
7. High scale – simplified massive ingestion from thousands of sources
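To make the CDC claim concrete, here is a toy sketch – ours, not Attunity's implementation – of replaying captured change events in commit order against a target keyed by primary key:

```python
# Toy CDC "apply" logic: change events captured from a source log are
# replayed, in order, against a target table held here as a dict.
target = {}  # primary key -> row

def apply_change(event: dict) -> None:
    """Apply one INSERT/UPDATE/DELETE event to the target."""
    op, pk, row = event["op"], event["pk"], event.get("row")
    if op == "INSERT":
        target[pk] = row
    elif op == "UPDATE":
        target[pk] = {**target.get(pk, {}), **row}  # merge changed columns
    elif op == "DELETE":
        target.pop(pk, None)

# Replaying in commit order keeps the target consistent with the source,
# with nothing installed on the source server itself.
for ev in [{"op": "INSERT", "pk": 1, "row": {"name": "Ada"}},
           {"op": "UPDATE", "pk": 1, "row": {"name": "Ada L."}},
           {"op": "DELETE", "pk": 1}]:
    apply_change(ev)
print(target)  # {} – all changes applied
```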
Universal Integration with Attunity Replicate
One Pane of Glass – Any Major Platform

Sources:
• RDBMS: Oracle, SQL Server, DB2 iSeries, DB2 z/OS, DB2 LUW, MySQL, Sybase ASE, Informix
• Data Warehouse: Exadata, Teradata, Netezza, Vertica, Actian Vector, Actian Matrix
• Hadoop: Hortonworks, Cloudera, MapR, Pivotal
• Legacy: IMS/DB, SQL M/P, Enscribe, RMS, VSAM
• Cloud: AWS RDS, Salesforce
• SAP: SAP on Oracle, SQL Server or DB2*; SAP HANA

Targets:
• RDBMS: Oracle, SQL Server, DB2 LUW, MySQL, PostgreSQL, Sybase ASE, Informix
• Data Warehouse: Exadata, Teradata, Netezza, Vertica, Pivotal, Actian Vector, Actian Matrix, Sybase IQ
• Hadoop: Hortonworks, Cloudera, MapR, Pivotal
• NoSQL: MongoDB
• Cloud: AWS RDS/Redshift/EC2/EMR, Google Cloud SQL, Google Cloud Dataproc, Azure SQL Data Warehouse, Azure SQL Database
• Messaging: Kafka, MS Event Hubs, MapR-Streams
Needs & Challenges in SAP Analytic Environments
• Real-time access to SAP data for reporting and analytics
• Flexibility to access and analyse SAP data across different BI tools and environments such as cloud, other DWs and Hadoop
• Remove lock-in and dependency on SAP BW
• Reduce complex and costly projects to integrate SAP data
• Decode complex SAP data so business users can integrate it into a common data model
Attunity Replicate for SAP
Extending Attunity's Replication Leadership with SAP Integration
[Diagram: core and industry-specific SAP modules feed Attunity Replicate via bulk load and CDC, delivering to RDBMS | EDW | Hadoop | Kafka, on premises or in the cloud]
Use Cases
1. Data lakes: live SAP data ingest for Hadoop data lakes / Kafka
2. Cloud analytics: live SAP data ingest/migration for cloud analytics
3. ODS: create an SAP operational data store (ODS) for real-time BI
4. Real-time data warehousing with SAP application data
Intuitive User Experience
Configurable, Pre-defined Automation
• Intuitive, easy-to-use web-based interface
• Simple to configure and manage replication tasks
• Single interface for any supported source and target
• Easy management with alerts and notifications
Manage Enterprise Replication at Scale
Thousands of endpoints; hundreds of tasks
• Automatic discovery
• Centralized monitoring and design
• Real-time event notification
[Diagram: the Enterprise Manager is a single point of control over multiple Replicate Servers]
Replicate for SAP – App-Level Replication
• Business-object-level metadata
• Real-time CDC aligned with business objects
• Unpacks pooled and clustered tables
Enterprise-Class CDC for SAP
In-Memory and File-Optimised Data Transport
Flexible and optimized CDC options:
• Transactions applied in real time and in order (transactional CDC)
• Changes applied in optimised batches (batch CDC)
• Integration with data warehouse native loaders to ingest and merge
• Message-encoded streaming of changes for the Kafka message broker (a consumer sketch follows)
[Diagram: change records R1, R2, … delivered via transactional apply, optimised batches, SQL ingest-merge into a data warehouse, and encoded messages]
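Where changes are message-encoded for Kafka, any standard consumer can read them. A minimal sketch using the kafka-python client; the topic name and the JSON change envelope are hypothetical stand-ins, not Replicate's actual message format:

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "sap.ecc.changes",                    # hypothetical topic name
    bootstrap_servers=["localhost:9092"],
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for msg in consumer:
    event = msg.value
    # Assumed envelope for illustration, e.g.:
    # {"table": "VBAK", "op": "UPDATE", "before": {...}, "after": {...}}
    print(event["table"], event["op"], event.get("after"))
```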
Key Enablers for SAP Environments
• SAP-certified agent to decode complex SAP data structures, with metadata, for replication
• Automated SAP data selection via the UI from pool, clustered or indexed tables
• Data transformation during replication for all SAP data sets
• Support for SAP ERP (ECC), CRM, and custom SAP modules
• SAP data integrity maintained during replication and during CDC
• Minimal performance impact on the SAP system through its support for RFC calls
Replicate for SAP – Architecture
[Architecture diagram:
1. The Replicate for SAP UI navigates and selects SAP business objects, extracting relationships for pool and cluster tables into a persistent store.
2. Replicate for SAP RFC calls drive automated ABAP mapping and change data capture for pool and cluster tables.
Source: on-premises RDBMS (Oracle, DB2, etc.) – redo/archive logs or journal file, plus transparent tables.
Transport: batch and incremental CDC capture, filter and transform, in-memory or file channel.
Targets: Hadoop, RDBMS, Data Warehouse, Kafka – in the cloud.]
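To illustrate the RFC path from outside the product, a minimal sketch using SAP's open-source pyrfc connector and the standard RFC_READ_TABLE function module; the connection parameters are placeholders, and this shows what an RFC call looks like generally, not Replicate's internal agent:

```python
from pyrfc import Connection  # requires the SAP NW RFC SDK + pyrfc

# Placeholder connection parameters for a hypothetical SAP system.
conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="RFC_USER", passwd="secret")

# RFC_READ_TABLE reads rows server-side; here, five rows from the
# company-code table T001, pipe-delimited.
result = conn.call("RFC_READ_TABLE",
                   QUERY_TABLE="T001",
                   DELIMITER="|",
                   ROWCOUNT=5)
for row in result["DATA"]:
    print(row["WA"])  # each WA field holds one delimited row
```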
SAP Platforms Supported by Attunity Replicate
Supported SAP versions:
• Primarily SAP ECC 6.0 + all EhP levels
• Can also support ECC 5.0, 4.7 Enterprise and 4.6C
Supported DBs for SAP: as listed under Sources above
SAP Applications Supported by Attunity Replicate
• ERP / ECC (Enterprise Resource Planning / ERP Central Component)
• CRM (Customer Relationship Management)
• SRM (Supplier Relationship Management)
• GTS (Global Trade Services)
• MDG (Master Data Governance)
Key Value of Attunity Replicate for SAP
Rapid Data Access for More Users and Systems
• Realise new analytics value in the native target schema, e.g. Hive, Kafka or the cloud
• SAP data access via BI reports running directly against the replicated data, with logical naming
• End-to-end automation from SAP data selection to loading data with transformations, via an intuitive "Click to Load" interface
• Faster performance with less overhead on the SAP system by limiting RFC calls
• Real-time change data capture (CDC) applied to SAP data sets
Integrating SAP Data into the Hortonworks Connected Data Platform
Attunity Replicate for HDP and HDF – Data Integration & Ingest
Accelerate time-to-insights by delivering solutions faster, with fresher data, from many sources:
• Automated data ingest
• Incremental data ingest (CDC)
• Broad support for many sources
Sources: OLTP, ERP and CRM systems; documents and emails; web logs and click streams; social networks; machine-generated sensor data; geolocation data
Data Ingest for Hadoop Data Lakes
[Diagram: sources (Hadoop, Files, RDBMS, Data Warehouse, Mainframe – cloud or on-prem) flow through Attunity Replicate – batch plus incremental CDC, backed by a persistent store – into the data lake, cloud or on-prem]
Attunity Compose for Hive
Your Fastest Way to Analytics-Ready Data Lakes
• Automate your data pipeline – automate ingest, schema creation and CDC for faster time to value
• Universal data ingestion – ingest data into your data lake on premises, in the cloud, or hybrid
• Easy data structuring and transformation – automatically generate schema and structures in Hive for the ODS and HDS with no manual coding
• Continuous updates – use CDC for real-time analytics; parallel threading & time-based partitioning give less overhead and more confidence
• Ensure data consistency – an ACID MERGE operation processes insertions, updates and deletions, ensuring integrity and avoiding user impact (a sketch follows this list)
• Slowly changing dimensions – supports Type 2 slowly changing dimensions; with timestamps, easily perform trend/time-oriented analysis
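To illustrate the ACID MERGE idea – a hedged sketch, not Compose's generated code – the following applies staged change rows to a transactional Hive table via pyhive; the database, table and column names are hypothetical, and the target must be created as a transactional (ACID) Hive table:

```python
from pyhive import hive  # pip install 'pyhive[hive]'

# Hypothetical endpoint and names; ods.customers must be an ACID table.
conn = hive.Connection(host="hive-server", port=10000)
cur = conn.cursor()
cur.execute("""
    MERGE INTO ods.customers AS t
    USING staging.customer_changes AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED AND s.op = 'D' THEN DELETE
    WHEN MATCHED THEN UPDATE SET name = s.name, city = s.city
    WHEN NOT MATCHED AND s.op <> 'D' THEN
        INSERT VALUES (s.customer_id, s.name, s.city)
""")
```

A Type 2 variant would, instead of updating in place, expire the current row (set its end timestamp) and insert a new one, preserving history for trend and time-oriented analysis.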
Attunity Is Trusted and Used by…
• 44 active customers out of the Fortune 100
• 7 of the top 10 manufacturers
• 6 of the top 10 health care and pharmaceutical companies
• 5 of the top 10 financial services organizations
• 4 of the top 10 automotive companies
Trusted Technology and Partner
• Trusted by Microsoft with 3 OEMs, bundled inside SQL Server
• Trusted by Amazon (AWS) with technology licensing for its cloud migration service
• Trusted by IBM and Oracle with respective OEMs of Attunity technology
• Trusted by Teradata as a reseller for the data warehouse and Hadoop market
• Trusted by HP as a reseller for the data warehouse and analytics market
• Trusted by Accenture, Capgemini and Cognizant as SI partners
• Trusted by Hortonworks, Cloudera and MapR for Hadoop solutions
• Trusted by over 2000 customers in over 65 countries
Thanks!
attunity.com | www.ibt.com.au/
Más contenido relacionado

La actualidad más candente

Cloudy with a chance of Hadoop - real world considerations
Cloudy with a chance of Hadoop - real world considerationsCloudy with a chance of Hadoop - real world considerations
Cloudy with a chance of Hadoop - real world considerationsDataWorks Summit
 
Hadoop crash course workshop at Hadoop Summit
Hadoop crash course workshop at Hadoop SummitHadoop crash course workshop at Hadoop Summit
Hadoop crash course workshop at Hadoop SummitDataWorks Summit
 
Build Big Data Enterprise Solutions Faster on Azure HDInsight
Build Big Data Enterprise Solutions Faster on Azure HDInsightBuild Big Data Enterprise Solutions Faster on Azure HDInsight
Build Big Data Enterprise Solutions Faster on Azure HDInsightDataWorks Summit/Hadoop Summit
 
Big Data Simplified - Is all about Ab'strakSHeN
Big Data Simplified - Is all about Ab'strakSHeNBig Data Simplified - Is all about Ab'strakSHeN
Big Data Simplified - Is all about Ab'strakSHeNDataWorks Summit
 
Big Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo Clinic
Big Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo ClinicBig Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo Clinic
Big Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo ClinicDataWorks Summit
 
Insights into Real-world Data Management Challenges
Insights into Real-world Data Management ChallengesInsights into Real-world Data Management Challenges
Insights into Real-world Data Management ChallengesDataWorks Summit
 
The DAP - Where YARN, HBase, Kafka and Spark go to Production
The DAP - Where YARN, HBase, Kafka and Spark go to ProductionThe DAP - Where YARN, HBase, Kafka and Spark go to Production
The DAP - Where YARN, HBase, Kafka and Spark go to ProductionDataWorks Summit/Hadoop Summit
 
Internet of things Crash Course Workshop
Internet of things Crash Course WorkshopInternet of things Crash Course Workshop
Internet of things Crash Course WorkshopDataWorks Summit
 
Sharing metadata across the data lake and streams
Sharing metadata across the data lake and streamsSharing metadata across the data lake and streams
Sharing metadata across the data lake and streamsDataWorks Summit
 
Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...
Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...
Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...DataWorks Summit
 
YARN: the Key to overcoming the challenges of broad-based Hadoop Adoption
YARN: the Key to overcoming the challenges of broad-based Hadoop AdoptionYARN: the Key to overcoming the challenges of broad-based Hadoop Adoption
YARN: the Key to overcoming the challenges of broad-based Hadoop AdoptionDataWorks Summit
 
Luo june27 1150am_room230_a_v2
Luo june27 1150am_room230_a_v2Luo june27 1150am_room230_a_v2
Luo june27 1150am_room230_a_v2DataWorks Summit
 
Addressing Enterprise Customer Pain Points with a Data Driven Architecture
Addressing Enterprise Customer Pain Points with a Data Driven ArchitectureAddressing Enterprise Customer Pain Points with a Data Driven Architecture
Addressing Enterprise Customer Pain Points with a Data Driven ArchitectureDataWorks Summit
 
Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...
Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...
Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...DataWorks Summit/Hadoop Summit
 
Analytics Modernization: Configuring SAS® Grid Manager for Hadoop
Analytics Modernization: Configuring SAS® Grid Manager for HadoopAnalytics Modernization: Configuring SAS® Grid Manager for Hadoop
Analytics Modernization: Configuring SAS® Grid Manager for HadoopHortonworks
 
High Performance Spatial-Temporal Trajectory Analysis with Spark
High Performance Spatial-Temporal Trajectory Analysis with Spark High Performance Spatial-Temporal Trajectory Analysis with Spark
High Performance Spatial-Temporal Trajectory Analysis with Spark DataWorks Summit/Hadoop Summit
 

La actualidad más candente (20)

Cloudy with a chance of Hadoop - real world considerations
Cloudy with a chance of Hadoop - real world considerationsCloudy with a chance of Hadoop - real world considerations
Cloudy with a chance of Hadoop - real world considerations
 
Hadoop crash course workshop at Hadoop Summit
Hadoop crash course workshop at Hadoop SummitHadoop crash course workshop at Hadoop Summit
Hadoop crash course workshop at Hadoop Summit
 
Build Big Data Enterprise Solutions Faster on Azure HDInsight
Build Big Data Enterprise Solutions Faster on Azure HDInsightBuild Big Data Enterprise Solutions Faster on Azure HDInsight
Build Big Data Enterprise Solutions Faster on Azure HDInsight
 
Big Data Simplified - Is all about Ab'strakSHeN
Big Data Simplified - Is all about Ab'strakSHeNBig Data Simplified - Is all about Ab'strakSHeN
Big Data Simplified - Is all about Ab'strakSHeN
 
Filling the Data Lake
Filling the Data LakeFilling the Data Lake
Filling the Data Lake
 
Big Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo Clinic
Big Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo ClinicBig Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo Clinic
Big Data Platform Processes Daily Healthcare Data for Clinic Use at Mayo Clinic
 
Insights into Real-world Data Management Challenges
Insights into Real-world Data Management ChallengesInsights into Real-world Data Management Challenges
Insights into Real-world Data Management Challenges
 
The DAP - Where YARN, HBase, Kafka and Spark go to Production
The DAP - Where YARN, HBase, Kafka and Spark go to ProductionThe DAP - Where YARN, HBase, Kafka and Spark go to Production
The DAP - Where YARN, HBase, Kafka and Spark go to Production
 
Internet of things Crash Course Workshop
Internet of things Crash Course WorkshopInternet of things Crash Course Workshop
Internet of things Crash Course Workshop
 
Keys for Success from Streams to Queries
Keys for Success from Streams to QueriesKeys for Success from Streams to Queries
Keys for Success from Streams to Queries
 
Splice machine-bloor-webinar-data-lakes
Splice machine-bloor-webinar-data-lakesSplice machine-bloor-webinar-data-lakes
Splice machine-bloor-webinar-data-lakes
 
Sharing metadata across the data lake and streams
Sharing metadata across the data lake and streamsSharing metadata across the data lake and streams
Sharing metadata across the data lake and streams
 
Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...
Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...
Coexistence and Migration of Vendor HPC based infrastructure to Hadoop Ecosys...
 
YARN: the Key to overcoming the challenges of broad-based Hadoop Adoption
YARN: the Key to overcoming the challenges of broad-based Hadoop AdoptionYARN: the Key to overcoming the challenges of broad-based Hadoop Adoption
YARN: the Key to overcoming the challenges of broad-based Hadoop Adoption
 
Deep Learning using Spark and DL4J for fun and profit
Deep Learning using Spark and DL4J for fun and profitDeep Learning using Spark and DL4J for fun and profit
Deep Learning using Spark and DL4J for fun and profit
 
Luo june27 1150am_room230_a_v2
Luo june27 1150am_room230_a_v2Luo june27 1150am_room230_a_v2
Luo june27 1150am_room230_a_v2
 
Addressing Enterprise Customer Pain Points with a Data Driven Architecture
Addressing Enterprise Customer Pain Points with a Data Driven ArchitectureAddressing Enterprise Customer Pain Points with a Data Driven Architecture
Addressing Enterprise Customer Pain Points with a Data Driven Architecture
 
Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...
Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...
Modernizing Business Processes with Big Data: Real-World Use Cases for Produc...
 
Analytics Modernization: Configuring SAS® Grid Manager for Hadoop
Analytics Modernization: Configuring SAS® Grid Manager for HadoopAnalytics Modernization: Configuring SAS® Grid Manager for Hadoop
Analytics Modernization: Configuring SAS® Grid Manager for Hadoop
 
High Performance Spatial-Temporal Trajectory Analysis with Spark
High Performance Spatial-Temporal Trajectory Analysis with Spark High Performance Spatial-Temporal Trajectory Analysis with Spark
High Performance Spatial-Temporal Trajectory Analysis with Spark
 

Similar a Bring Your SAP and Enterprise Data to Hadoop, Apache Kafka and the Cloud

Accelerating Big Data Analytics
Accelerating Big Data AnalyticsAccelerating Big Data Analytics
Accelerating Big Data AnalyticsAttunity
 
Using the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid Warehouse
Using the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid WarehouseUsing the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid Warehouse
Using the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid WarehouseRizaldy Ignacio
 
Track B-1 建構新世代的智慧數據平台
Track B-1 建構新世代的智慧數據平台Track B-1 建構新世代的智慧數據平台
Track B-1 建構新世代的智慧數據平台Etu Solution
 
Trafodion overview
Trafodion overviewTrafodion overview
Trafodion overviewRohit Jain
 
Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...
Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...
Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...MapR Technologies
 
Skillwise Big Data part 2
Skillwise Big Data part 2Skillwise Big Data part 2
Skillwise Big Data part 2Skillwise Group
 
Feature Store as a Data Foundation for Machine Learning
Feature Store as a Data Foundation for Machine LearningFeature Store as a Data Foundation for Machine Learning
Feature Store as a Data Foundation for Machine LearningProvectus
 
Pivotal deep dive_on_pivotal_hd_world_class_hdfs_platform
Pivotal deep dive_on_pivotal_hd_world_class_hdfs_platformPivotal deep dive_on_pivotal_hd_world_class_hdfs_platform
Pivotal deep dive_on_pivotal_hd_world_class_hdfs_platformEMC
 
Modernizing Your Data Warehouse using APS
Modernizing Your Data Warehouse using APSModernizing Your Data Warehouse using APS
Modernizing Your Data Warehouse using APSStéphane Fréchette
 
Actian Analytics Platform - Hadoop SQL Edition
Actian Analytics Platform - Hadoop SQL EditionActian Analytics Platform - Hadoop SQL Edition
Actian Analytics Platform - Hadoop SQL EditionAlessandro Salvatico
 
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...Hortonworks
 
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016StampedeCon
 
Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...
Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...
Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...Pentaho
 
Which Change Data Capture Strategy is Right for You?
Which Change Data Capture Strategy is Right for You?Which Change Data Capture Strategy is Right for You?
Which Change Data Capture Strategy is Right for You?Precisely
 
MDS ap_OEM Product Portfolio Intorduction to the DT & Analytics
MDS ap_OEM Product Portfolio Intorduction to the DT & AnalyticsMDS ap_OEM Product Portfolio Intorduction to the DT & Analytics
MDS ap_OEM Product Portfolio Intorduction to the DT & AnalyticsMDS ap
 
Ibm integrated analytics system
Ibm integrated analytics systemIbm integrated analytics system
Ibm integrated analytics systemModusOptimum
 
Whither the Hadoop Developer Experience, June Hadoop Meetup, Nitin Motgi
Whither the Hadoop Developer Experience, June Hadoop Meetup, Nitin MotgiWhither the Hadoop Developer Experience, June Hadoop Meetup, Nitin Motgi
Whither the Hadoop Developer Experience, June Hadoop Meetup, Nitin MotgiFelicia Haggarty
 
Hitachi Data Systems Hadoop Solution
Hitachi Data Systems Hadoop SolutionHitachi Data Systems Hadoop Solution
Hitachi Data Systems Hadoop SolutionHitachi Vantara
 

Similar a Bring Your SAP and Enterprise Data to Hadoop, Apache Kafka and the Cloud (20)

Accelerating Big Data Analytics
Accelerating Big Data AnalyticsAccelerating Big Data Analytics
Accelerating Big Data Analytics
 
Using the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid Warehouse
Using the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid WarehouseUsing the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid Warehouse
Using the Power of Big SQL 3.0 to Build a Big Data-Ready Hybrid Warehouse
 
Track B-1 建構新世代的智慧數據平台
Track B-1 建構新世代的智慧數據平台Track B-1 建構新世代的智慧數據平台
Track B-1 建構新世代的智慧數據平台
 
Trafodion overview
Trafodion overviewTrafodion overview
Trafodion overview
 
Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...
Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...
Hadoop in 2015: Keys to Achieving Operational Excellence for the Real-Time En...
 
Skilwise Big data
Skilwise Big dataSkilwise Big data
Skilwise Big data
 
Skillwise Big Data part 2
Skillwise Big Data part 2Skillwise Big Data part 2
Skillwise Big Data part 2
 
Feature Store as a Data Foundation for Machine Learning
Feature Store as a Data Foundation for Machine LearningFeature Store as a Data Foundation for Machine Learning
Feature Store as a Data Foundation for Machine Learning
 
Pivotal deep dive_on_pivotal_hd_world_class_hdfs_platform
Pivotal deep dive_on_pivotal_hd_world_class_hdfs_platformPivotal deep dive_on_pivotal_hd_world_class_hdfs_platform
Pivotal deep dive_on_pivotal_hd_world_class_hdfs_platform
 
Modernizing Your Data Warehouse using APS
Modernizing Your Data Warehouse using APSModernizing Your Data Warehouse using APS
Modernizing Your Data Warehouse using APS
 
Actian Analytics Platform - Hadoop SQL Edition
Actian Analytics Platform - Hadoop SQL EditionActian Analytics Platform - Hadoop SQL Edition
Actian Analytics Platform - Hadoop SQL Edition
 
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...
 
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016
 
Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...
Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...
Big Data Integration Webinar: Reducing Implementation Efforts of Hadoop, NoSQ...
 
Which Change Data Capture Strategy is Right for You?
Which Change Data Capture Strategy is Right for You?Which Change Data Capture Strategy is Right for You?
Which Change Data Capture Strategy is Right for You?
 
MDS ap_OEM Product Portfolio Intorduction to the DT & Analytics
MDS ap_OEM Product Portfolio Intorduction to the DT & AnalyticsMDS ap_OEM Product Portfolio Intorduction to the DT & Analytics
MDS ap_OEM Product Portfolio Intorduction to the DT & Analytics
 
Ibm integrated analytics system
Ibm integrated analytics systemIbm integrated analytics system
Ibm integrated analytics system
 
Hortonworks.bdb
Hortonworks.bdbHortonworks.bdb
Hortonworks.bdb
 
Whither the Hadoop Developer Experience, June Hadoop Meetup, Nitin Motgi
Whither the Hadoop Developer Experience, June Hadoop Meetup, Nitin MotgiWhither the Hadoop Developer Experience, June Hadoop Meetup, Nitin Motgi
Whither the Hadoop Developer Experience, June Hadoop Meetup, Nitin Motgi
 
Hitachi Data Systems Hadoop Solution
Hitachi Data Systems Hadoop SolutionHitachi Data Systems Hadoop Solution
Hitachi Data Systems Hadoop Solution
 

Más de DataWorks Summit

Floating on a RAFT: HBase Durability with Apache Ratis
Floating on a RAFT: HBase Durability with Apache RatisFloating on a RAFT: HBase Durability with Apache Ratis
Floating on a RAFT: HBase Durability with Apache RatisDataWorks Summit
 
Tracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFi
Tracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFiTracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFi
Tracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFiDataWorks Summit
 
HBase Tales From the Trenches - Short stories about most common HBase operati...
HBase Tales From the Trenches - Short stories about most common HBase operati...HBase Tales From the Trenches - Short stories about most common HBase operati...
HBase Tales From the Trenches - Short stories about most common HBase operati...DataWorks Summit
 
Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...
Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...
Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...DataWorks Summit
 
Managing the Dewey Decimal System
Managing the Dewey Decimal SystemManaging the Dewey Decimal System
Managing the Dewey Decimal SystemDataWorks Summit
 
Practical NoSQL: Accumulo's dirlist Example
Practical NoSQL: Accumulo's dirlist ExamplePractical NoSQL: Accumulo's dirlist Example
Practical NoSQL: Accumulo's dirlist ExampleDataWorks Summit
 
HBase Global Indexing to support large-scale data ingestion at Uber
HBase Global Indexing to support large-scale data ingestion at UberHBase Global Indexing to support large-scale data ingestion at Uber
HBase Global Indexing to support large-scale data ingestion at UberDataWorks Summit
 
Scaling Cloud-Scale Translytics Workloads with Omid and Phoenix
Scaling Cloud-Scale Translytics Workloads with Omid and PhoenixScaling Cloud-Scale Translytics Workloads with Omid and Phoenix
Scaling Cloud-Scale Translytics Workloads with Omid and PhoenixDataWorks Summit
 
Building the High Speed Cybersecurity Data Pipeline Using Apache NiFi
Building the High Speed Cybersecurity Data Pipeline Using Apache NiFiBuilding the High Speed Cybersecurity Data Pipeline Using Apache NiFi
Building the High Speed Cybersecurity Data Pipeline Using Apache NiFiDataWorks Summit
 
Supporting Apache HBase : Troubleshooting and Supportability Improvements
Supporting Apache HBase : Troubleshooting and Supportability ImprovementsSupporting Apache HBase : Troubleshooting and Supportability Improvements
Supporting Apache HBase : Troubleshooting and Supportability ImprovementsDataWorks Summit
 
Security Framework for Multitenant Architecture
Security Framework for Multitenant ArchitectureSecurity Framework for Multitenant Architecture
Security Framework for Multitenant ArchitectureDataWorks Summit
 
Presto: Optimizing Performance of SQL-on-Anything Engine
Presto: Optimizing Performance of SQL-on-Anything EnginePresto: Optimizing Performance of SQL-on-Anything Engine
Presto: Optimizing Performance of SQL-on-Anything EngineDataWorks Summit
 
Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...
Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...
Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...DataWorks Summit
 
Extending Twitter's Data Platform to Google Cloud
Extending Twitter's Data Platform to Google CloudExtending Twitter's Data Platform to Google Cloud
Extending Twitter's Data Platform to Google CloudDataWorks Summit
 
Event-Driven Messaging and Actions using Apache Flink and Apache NiFi
Event-Driven Messaging and Actions using Apache Flink and Apache NiFiEvent-Driven Messaging and Actions using Apache Flink and Apache NiFi
Event-Driven Messaging and Actions using Apache Flink and Apache NiFiDataWorks Summit
 
Securing Data in Hybrid on-premise and Cloud Environments using Apache Ranger
Securing Data in Hybrid on-premise and Cloud Environments using Apache RangerSecuring Data in Hybrid on-premise and Cloud Environments using Apache Ranger
Securing Data in Hybrid on-premise and Cloud Environments using Apache RangerDataWorks Summit
 
Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...
Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...
Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...DataWorks Summit
 
Computer Vision: Coming to a Store Near You
Computer Vision: Coming to a Store Near YouComputer Vision: Coming to a Store Near You
Computer Vision: Coming to a Store Near YouDataWorks Summit
 
Big Data Genomics: Clustering Billions of DNA Sequences with Apache Spark
Big Data Genomics: Clustering Billions of DNA Sequences with Apache SparkBig Data Genomics: Clustering Billions of DNA Sequences with Apache Spark
Big Data Genomics: Clustering Billions of DNA Sequences with Apache SparkDataWorks Summit
 

Más de DataWorks Summit (20)

Data Science Crash Course
Data Science Crash CourseData Science Crash Course
Data Science Crash Course
 
Floating on a RAFT: HBase Durability with Apache Ratis
Floating on a RAFT: HBase Durability with Apache RatisFloating on a RAFT: HBase Durability with Apache Ratis
Floating on a RAFT: HBase Durability with Apache Ratis
 
Tracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFi
Tracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFiTracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFi
Tracking Crime as It Occurs with Apache Phoenix, Apache HBase and Apache NiFi
 
HBase Tales From the Trenches - Short stories about most common HBase operati...
HBase Tales From the Trenches - Short stories about most common HBase operati...HBase Tales From the Trenches - Short stories about most common HBase operati...
HBase Tales From the Trenches - Short stories about most common HBase operati...
 
Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...
Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...
Optimizing Geospatial Operations with Server-side Programming in HBase and Ac...
 
Managing the Dewey Decimal System
Managing the Dewey Decimal SystemManaging the Dewey Decimal System
Managing the Dewey Decimal System
 
Practical NoSQL: Accumulo's dirlist Example
Practical NoSQL: Accumulo's dirlist ExamplePractical NoSQL: Accumulo's dirlist Example
Practical NoSQL: Accumulo's dirlist Example
 
HBase Global Indexing to support large-scale data ingestion at Uber
HBase Global Indexing to support large-scale data ingestion at UberHBase Global Indexing to support large-scale data ingestion at Uber
HBase Global Indexing to support large-scale data ingestion at Uber
 
Scaling Cloud-Scale Translytics Workloads with Omid and Phoenix
Scaling Cloud-Scale Translytics Workloads with Omid and PhoenixScaling Cloud-Scale Translytics Workloads with Omid and Phoenix
Scaling Cloud-Scale Translytics Workloads with Omid and Phoenix
 
Building the High Speed Cybersecurity Data Pipeline Using Apache NiFi
Building the High Speed Cybersecurity Data Pipeline Using Apache NiFiBuilding the High Speed Cybersecurity Data Pipeline Using Apache NiFi
Building the High Speed Cybersecurity Data Pipeline Using Apache NiFi
 
Supporting Apache HBase : Troubleshooting and Supportability Improvements
Supporting Apache HBase : Troubleshooting and Supportability ImprovementsSupporting Apache HBase : Troubleshooting and Supportability Improvements
Supporting Apache HBase : Troubleshooting and Supportability Improvements
 
Security Framework for Multitenant Architecture
Security Framework for Multitenant ArchitectureSecurity Framework for Multitenant Architecture
Security Framework for Multitenant Architecture
 
Presto: Optimizing Performance of SQL-on-Anything Engine
Presto: Optimizing Performance of SQL-on-Anything EnginePresto: Optimizing Performance of SQL-on-Anything Engine
Presto: Optimizing Performance of SQL-on-Anything Engine
 
Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...
Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...
Introducing MlFlow: An Open Source Platform for the Machine Learning Lifecycl...
 
Extending Twitter's Data Platform to Google Cloud
Extending Twitter's Data Platform to Google CloudExtending Twitter's Data Platform to Google Cloud
Extending Twitter's Data Platform to Google Cloud
 
Event-Driven Messaging and Actions using Apache Flink and Apache NiFi
Event-Driven Messaging and Actions using Apache Flink and Apache NiFiEvent-Driven Messaging and Actions using Apache Flink and Apache NiFi
Event-Driven Messaging and Actions using Apache Flink and Apache NiFi
 
Securing Data in Hybrid on-premise and Cloud Environments using Apache Ranger
Securing Data in Hybrid on-premise and Cloud Environments using Apache RangerSecuring Data in Hybrid on-premise and Cloud Environments using Apache Ranger
Securing Data in Hybrid on-premise and Cloud Environments using Apache Ranger
 
Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...
Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...
Big Data Meets NVM: Accelerating Big Data Processing with Non-Volatile Memory...
 
Computer Vision: Coming to a Store Near You
Computer Vision: Coming to a Store Near YouComputer Vision: Coming to a Store Near You
Computer Vision: Coming to a Store Near You
 
Big Data Genomics: Clustering Billions of DNA Sequences with Apache Spark
Big Data Genomics: Clustering Billions of DNA Sequences with Apache SparkBig Data Genomics: Clustering Billions of DNA Sequences with Apache Spark
Big Data Genomics: Clustering Billions of DNA Sequences with Apache Spark
 

Último

CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024The Digital Insurer
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostLeverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostZilliz
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsMiki Katsuragi
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 

Último (20)

CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostLeverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering Tips
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 

Bring Your SAP and Enterprise Data to Hadoop, Apache Kafka and the Cloud

  • 1. Bring Your SAP and Enterprise Data to Hadoop, Apache Kafka and the Cloud Lessons from Fortune 100 Companies about Data Ingestion John Hol – Attunity Mike Hollobon – IBT
  • 2. How effective is your organization at leveraging data and analytics to power your business models? Analytics have become mission critical to not just business decisions but long term planning and reporting to the markets.
  • 3. Winners will: • Respond rapidly to changing business threats and opportunities • Arm their business teams with accurate & timely data • Leverage data for competitive advantage The Stakes Are High 40 percent of today's Fortune 500 companies on the S&P 500 will no longer exist in 10 years. - Study by John M. Olin School of Business at Washington University Since 2000, 52 percent of the names on the Fortune 500 list are gone, either as a result of mergers, acquisitions or bankruptcies. - Fortune Magazine 2000 to 2015
  • 4. Technology Investment Areas 2016 2015 2014 BI/Analytics 1 1 1 Cloud 2 3 5 Mobile 3 5 3 Digitization/Digital Marketing 4 6 7 Infrastructure and Data Center 5 2 2 ERP 6 4 4 Security 7 7 7 Industry Specific Applications 8 10 9 CRM 9 9 10 Networking/Voice-Data Comm’s 10 8 6 Top CIO Priorities: Sources: Gartner CIO Surveys 2013-2015 Big Data adoption Migration & Analytics
  • 5. We are moving from IT to Business Technology (BT) Trends MIS IT BT Self-service Real-time Automation Limited data Digital business © 2016 Forrester Research, Inc. Lots of data Batch
  • 6. Why is Data Management so Important?
  • 7. IBT – A Little About Us Attunity’s Authorised Partner for Australia We sell, implement and support all Attunity products  Established 1997 to provide PeopleSoft Systems Integration and Services  ERP, BI and Data Management Services & Solutions – cloud and on-premise  Authorised Attunity Partner in Australia since 2014
  • 8. Data Integration for Analytics and IoT
  • 9. Attunity Product Suite Replicate Compose Visibility Universal Data Availability Data Warehouse Automation Data Usage Profiling & Analytics Move data to any platform Automate SQL/ETL/EDW Optimize performance and cost On Premises / Cloud Hadoop FilesRDBMS EDW SAP Mainframe Data Integration & Big Data Management Software
  • 10. • Corporate Data Lake Initiative for Real-time Compliance • Reduce fraudulent activity, reputational damage Global Bank • IoT Data Lake for New Analytics • Predictive Maintenance Global Auto Maker • IoT Data Lake integrated with SAP data • Improve food production efficiencies and fleet maintenance Global Food Processing • Hortonworks Data Lake for real-time access to SAP & Psoft data • Centralized for real time reporting, lower costs, valuable insights Global Telco • Data Lake for Design Analytics with Real Time Data • Expects 3x improvement in manufacturingAirline Manufacturer Examples – Fortune 100 Customers Building Data Lakes for Analytics and IoT
  • 11. • Real time CDC solution to its data lake on Azure and Event Hubs • Provide excellent performance for both latency and throughput International Healthcare • Robust solution to capture defined changes and replicate to AWS • Modernise, easy to maintain, scalable and secure Iconic Motoring Org • Leverage HP Vertica cluster DW for operational reporting • Cope with large row counts i.e. single table had 4.3 billion rows State Energy Provider • Urgently need to integrate data into a Oracle target • Improve mission critical reporting Toll Road Operator • Real time data migration for timely analytics yet with no impact • Modernise and take advantage of the latest technologySandstone University Examples – Australian Customers
  • 12. • 100s to 1000s of Data Sources • Business and Machine data Analyze Everything • On-premise or in the Cloud • In DB, DW, Hadoop, In-Memory, etc. Analyze Anywhere • Capture new/changing data • Process/stream in motion Analyze in Real- time • Capture new/changing data • Process/stream in motion Analyze in Real- time ‘Be Prepared’ – build architecture so you can:
  • 13. Transfer TransformFilterBatch CDC Incremental In-Memory File Channel Batch Attunity Replicate Hadoop Files RDBMS Data Warehouse Mainframe Cloud On-prem Cloud On-prem Hadoop Files RDBMS Data Warehouse Kafka Management Automation
  • 14. 1. Easy – configurable, powered by automation (no dev or scripting) 2. CDC – Efficient. Low Latency. No Downtime 3. Zero footprint – no agent on source database server 4. Heterogeneous – supports many data sources & targets 5. High Performance – Integrate with native APIs and in-memory processing 6. Security - secured data handling and transfer 7. High scale - simplified massive ingestion from thousands of sources Attunity Replicate – Highlights & Differentiators
  • 15. Universal Integration with Attunity Replicate One Pane of Glass – Any Major Platform RDBMS Oracle SQL Server DB2 iSeries DB2 z/OS DB2 LUW MySQL Sybase ASE Informix Data Warehouse Exadata Teradata Netezza Vertica Actian Vector Actian Matrix Hortonworks Cloudera MapR Pivotal Hadoop IMS/DB SQL M/P Enscribe RMS VSAM Legacy AWS RDS Salesforce Cloud RDBMS Oracle SQL Server DB2 LUW MySQL PostgreSQL Sybase ASE Informix Data Warehouse Exadata Teradata Netezza Vertica Pivotal Actian Vector Actian Matrix Sybase IQ Hortonworks Cloudera MapR Pivotal Hadoop MongoDB NoSQL AWS RDS/Redshift/EC2/ EMR Google Cloud SQL Google Cloud Dataproc Azure SQL Data Warehouse Azure SQL Database Cloud Kafka MS Event Hubs MapR-Streams Messaging Targets Sources SAP SAP on Oracle SQL DB2* SAP HANA
  • 16. Needs & Challenges in SAP Analytic Environments Real-time access to SAP data for reporting and analytics Flexibility to access and analyse SAP data across different BI tools and environments like Cloud, other DW & Hadoop Remove lock-in and dependency on SAP BW Reduce complex and costly projects to integrate SAP data Decode Complex SAP data for business users for common data model integration
  • 17. Attunity Replicate for SAP Extending Attunity Replication Leadership and SAP Integration RDBMS | EDW | Hadoop | Kafka On Premises | Cloud Core and Industry-Specific SAP Modules Attunity Replicate RDBMS | EDW | Hadoop On Premises or Cloud Bulk Load CDC
  • 18. Use Cases 1. Data Lakes: Live SAP Data Ingest for Hadoop Data Lakes / Kafka 2. Cloud Analytics: Live SAP Data Ingest/Migration for Cloud Analytics 3. ODS: Create an SAP ODS (operational data store) for Real-time BI 4. Real-time data warehousing: with SAP application data
  • 19. Configurable, Pre-defined Automation • Intuitive and easy to use web-based interface • Simple to configure and manage replication tasks • Single interface for any supported source to target • Easy management with alerts and notifications Intuitive User Experience
  • 20. Manage Enterprise Replication At Scale Thousands of End Points; Hundreds of Tasks Automatic discovery Centralized monitoring and design Real-time event notification Replicate Server Replicate Server Replicate Server Replicate Server Enterprise Manager: Single point of control
  • 21. Replicate for SAP – App-Level Replication • Business object level metadata • Real-time CDC aligned with business object • Unpacks pooled and clustered tables
  • 22. In Memory and File Optimised Data Transport Enterprise-class CDC for SAP Flexible and optimized CDC options • Transactions applied in real-time and in order • Changes applied in optimised batches • Integration with data warehouse native loaders to ingest and merge • Message encoded streaming of changes (for Kafka message broker) R1 R1 R2 R1 R2 R1 R2Batch CDC Data Warehouse Ingest-Merge SQL n 2 1 SQL SQL Transactional CDC Message Encoded CDC
  • 23. SAP certified agent to decode complex SAP data structures with metadata for replication Automated SAP Data selection via UI from pool, clustered or indexed tables Enable data transformation during replication for all SAP data sets Support for SAP ERP (ECC), CRM, and custom SAP modules Maintaining SAP data integrity during replication and during CDC Minimal performance impact on SAP system by adding support for RFC calls Key Enablers for SAP Environments
  • 24. Replicatefor SAP TransformFilter Batch CDC Incremental In-Memory File Channel Batch Architecture Persistent Store Extract relationships for Pool and Cluster Tables Navigate and select SAP business objects Automated ABAP Mapping and Change-Data-Capture for Pool and Cluster tables 1. Replicate for SAP UI 2. Replicate for SAP RFC Calls RDBMS (Oracle, DB2, etc.) Redo/ Archive logs or Journal File ---------------- Transparent Tables On Premises Hadoop RDBMS Data WarehouseKafka Cloud
  • 25. SAP Platforms Supported by Attunity Replicate Supported SAP Versions Supported DBs for SAP • Primarily SAP ECC 6.0 + all EhP levels • Can also support ECC 5.0, 4.7 Enterprise and 4.6C
  • 26. SAP Applications Supported by Attunity Replicate
    • ERP / ECC (Enterprise Resource Planning / ERP Central Component)
    • CRM (Customer Relationship Management)
    • SRM (Supplier Relationship Management)
    • GTS (Global Trade Services)
    • MDG (Master Data Governance)
  • 27. Key Value of Attunity Replicate for SAP – Rapid Data Access for More Users and Systems
    • Realise new analytics value in the native target schema, e.g. Hive, Kafka or the cloud
    • SAP data access via BI reports running directly against the replicated data, with logical naming
    • End-to-end automation, from SAP data selection to loading data with transformations, via an intuitive "Click to Load" interface
    • Faster performance with less overhead on the SAP system by limiting RFC calls
    • Real-time change data capture (CDC) applied to SAP data sets
  • 28. Integrating SAP Data into the Hortonworks Connected Data Platform
    Data integration and ingest with Attunity Replicate for HDP and HDF: accelerate time-to-insight by delivering solutions faster, with fresher data, from many sources
    • Automated data ingest
    • Incremental data ingest (CDC)
    • Broad support for many sources: OLTP, ERP and CRM systems; documents and emails; web logs and click streams; social networks; machine-generated sensor data; geolocation data
  • 29. Data Ingest for Hadoop Data Lakes
    Attunity Replicate moves data from RDBMS, data warehouse, mainframe and file sources, on premises or in the cloud, into Hadoop using batch and incremental (CDC) loads over an in-memory or persistent store channel
  • 30. Attunity Compose for Hive – Your Fastest Way to Analytics-Ready Data Lakes
    Automate your data pipeline: ingest, schema creation and CDC for faster value
    • Universal data ingestion: ingest data into your data lake on premises, in the cloud, or hybrid
    • Easy data structuring and transformation: automatically generate schema and structures in Hive for the ODS and HDS with no manual coding
    • Continuous updates: use CDC for real-time analytics, with parallel threading and time-based partitioning for less overhead and more confidence
    • Data consistency: ACID MERGE operations process inserts, updates and deletes in a single pass, ensuring integrity and avoiding user impact
    • Slowly changing dimensions: supports Type 2 slowly changing dimensions; with timestamps you can easily perform trend and time-oriented analysis
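    As a rough illustration of the ACID MERGE pattern that Compose automates, here is a minimal sketch of applying a batch of staged change records to a Hive ODS table. The database, table and column names are hypothetical, and it assumes Hive 2.2+ with transactional (ACID) tables enabled; Compose generates this kind of logic for you rather than requiring it to be hand-written.

        from pyhive import hive  # pip install pyhive

        conn = hive.Connection(host="hive-server", port=10000, username="etl")
        cursor = conn.cursor()

        # Apply staged inserts, updates and deletes in one ACID MERGE pass
        cursor.execute("""
            MERGE INTO ods.orders AS t
            USING staging.order_changes AS s
            ON t.order_id = s.order_id
            WHEN MATCHED AND s.op = 'D' THEN DELETE
            WHEN MATCHED THEN UPDATE SET amount = s.amount, status = s.status
            WHEN NOT MATCHED THEN INSERT VALUES (s.order_id, s.amount, s.status)
        """)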
  • 31. Attunity Is Trusted and Used by…
    • 44 active customers out of the Fortune 100
    • 7 of the top 10 manufacturers
    • 6 of the top 10 health care and pharmaceutical companies
    • 5 of the top 10 financial services organizations
    • 4 of the top 10 automotive companies
  • 32. Trusted Technology and Partner
    • Trusted by Microsoft with 3 OEMs, bundled inside SQL Server
    • Trusted by Amazon (AWS) with technology licensing for its cloud migration service
    • Trusted by IBM and Oracle with respective OEMs of Attunity technology
    • Trusted by Teradata as a reseller for the data warehouse and Hadoop market
    • Trusted by HP as a reseller for the data warehouse and analytics market
    • Trusted by Accenture, Capgemini and Cognizant as SI partners
    • Trusted by Hortonworks, Cloudera and MapR for Hadoop solutions
    • Trusted by over 2,000 customers in over 65 countries

Editor's notes

  1. Thanks John. Good morning everyone, and thanks for choosing to listen to our presentation. As John said, I am the business development director for IBT, or in other words, SALES, so it's a sobering thought that I am probably the least intelligent person in the room. So hopefully you'll cut me some slack when we get to the techie part. It's OK, I'm a father of 4, so I'm used to being told that on a daily basis.
  2. No doubt you've heard this before, but I find it incredible to know that my 7-year-old has already been exposed to as much information, if not more, than my grandparents were throughout the whole of their lives. In fact, more than 90% of all the data there has ever been has been produced in the last few years. But only 0.5% of it is ever analysed. And the value of that data, so I'm told, decreases by almost 90% within the first 5 minutes!
  3. So the stakes are high and it’s vitally important to manage that data as effectively as possible. The need is real. Companies that can’t keep pace will no longer be able to compete. They will not see or anticipate the disruption in their industry. They MUST respond rapidly, they MUST make fact-based decisions, and they MUST leverage their data assets to remain viable. The stakes are too high not to.
  4. So it’s no wonder that CIO’s rank analytics and cloud adoption as their 2 foremost priorities. Now this survey was taken back in 2015 but it still rings true today, particularly in Australia.
  5. But more than that, the trend for self-service and automation is turning IT into BT. And I'm not talking about British Telecom, for those of you who've been to the UK, for that would be truly awful. Business Technology, meaning business users want access to data, not just the IT departments. They want the agility and the freedom to mine data for themselves. Find links, try things, fail and try again.
  6. Because for the first time real value is being extracted from it. The author of this report has been surveying Fortune 1000 companies since 2012 and, for the first time, nearly 50% reported achieving measurable results from their big data investments, with 80.7% of executives characterizing their big data investments as “successful.” And this is why we are here today: to talk about how we can help you achieve similar results, using the power of today's technology, Hadoop and Attunity and others, so that your business can keep ahead of its competition.
  7. So who is IBT? Well we’re Attunity’s authorised partner for Australia. We sell, implement and support all Attunity products. So we are the face you see and the people you deal with for support and assistance.
  8. But we are here to talk about Attunity. The No. 1 independent provider of data integration in the market. With 2,500 customers globally, including 44 of the Fortune 100, recognised as a challenger on Gartner's Magic Quadrant, and a partner of choice for most of the major software vendors, data integration and data management is their business … and only their business.
  9. The flagship products are: Attunity Replicate, the product we'll mostly be talking about today, which moves your data from almost any source to almost any target, in fact the largest set of supported endpoints on the market, without any need to code. As it is so simple to set up, you see an almost immediate return on investment. Attunity Compose, which enables you to automate the process of creating data warehouses, data marts, and Hadoop and now Hive data lakes with no manual ETL coding, saving time, ensuring compliance and lessening the burden on your budgets. Attunity Visibility, which provides insight into how data is used by the business and how it impacts your systems, enabling you to make intelligent and informed data management decisions, and saving some customers millions of dollars.
  10. These are some examples of what customers are using the technology for. As we are here to talk about SAP and Hadoop, the two that jump out at me are: The international food processor that needed to merge data from multiple sources, including SAP, into a centralized data lake for analytics, to improve efficiency and reduce costs. Their challenge was to decode that data from the complex underlying structures. They chose Attunity Replicate to continuously load data from the sales order ERP module into a Hadoop data lake. Business analysts were then able to select and add custom Z tables, including indexed columns. Attunity Replicate accelerated data integration, saving significant time and labour costs. The global telco (Verizon) found getting real-time data for financial reporting time-consuming and labour-intensive. Using a multi-tier process to pull data from two SAP systems and one PeopleSoft system and integrating that into its Hadoop data lake was extremely challenging. They had to wait a whole day to see data changes, which was not sustainable. With Replicate, all divisional and corporate financial reporting teams now have access to ERP data in one place. As data changes, it is automatically uploaded to the data lake in less than an hour rather than overnight, so that the most up-to-date data is available for daily reports. The solution also eliminated the middle-tier database, reducing maintenance time and cost, and easing the strain on valuable developers. They now gain new insights and make more strategic decisions for different parts of the business, accelerating key business initiatives.
  11. And here in Australia, we are a little behind the curve when it comes to Hadoop, but it is finding its way into our customer sites. The healthcare company needed a real-time CDC solution for its customer transformation program, to build its data lake in SQL Server on Azure and simultaneously stream to Event Hubs. And the state-owned energy business needed to replicate and capture changes from its source systems into Vertica. Speed was essential as some tables were huge: 4.3 billion rows in one case (that's since grown to over 6 billion), which was taking their incumbent solution 3 days. Replicate did it in 5 hours. So the idea is that we are streaming real-time data so that you can analyse it downstream NOW, not tomorrow.
  12. So with this in mind, to stay competitive, enterprises must be prepared to: analyze data from hundreds if not thousands of sources; analyze data anywhere, on-premises or in the cloud, from databases, data warehouses, Hadoop systems, in-memory structures, etc.; and analyze data in real time to capture new and changing data and process it as a stream. Your most valuable data resides in your operational systems. It's what's running your organisation. It's great that you can pull data from social media and other external sources, but if you can't get at the transactional data that is running the business, what are you enriching it with? It's great you can see who loves a good cat meme, but if it isn't linked to the data that is driving the business, where is the real dollar value? So for true insight, you need all the data from all your sources. As an example, a global auto manufacturer has taken this real-time integration approach for 4,500 applications, from SAP, DB2 and Oracle, streaming it into its Hortonworks data lake. Using Replicate's CDC process they maintain true real-time analytics with less overhead. The ability to deliver this is where Attunity helps organisations today.
  13. So how do we do it? How do we help you analyse everything, anywhere, in real time? As an example, what is the most efficient way of getting data out of, say, an Oracle system? It is off-platform, meaning it is stand-alone with nothing installed on the source. You read the change logs in memory and push the changes into the target. In other words, you turn the database into a stream. This is critical for building new analytics. If you're living in a batch world with end-of-day processing and you want to see a bank balance, say, at midnight when it's quiet, you take an ETL snapshot and all you're seeing is a point in time; you don't get to see the full transactional flow as it accrues in real time out of the operational systems. By turning the database into a stream, your analytics team can build out a fuller picture for risk, anti-money laundering, credit card fraud and so on, and then build microservices out of Kafka, as an example, to deliver changes to the business. And you still have your historical data in Hadoop if someone has a question. This is what Attunity Replicate does. It provides high-speed data replication, for batch and full loads, whilst at the same time running CDC, or change data capture. It supports the widest set of endpoints on the market, and optimizes the data transfer using parallel loading and in-memory streaming. The end-to-end process is completely automated with no manual coding, and there are no agents needed on your source systems, meaning lower IT costs, less maintenance and a faster return on investment. And you can access the information through its web-based user interface to configure, control and monitor your replication tasks, whether on-premises, across a WAN, in the cloud and back again if need be. It provides an immutable store of all the transactional history, so you can go back and replay the changes on your analytical platform.
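  For the technically minded, the pattern being described, reading a source's change log and pushing the ordered changes to a target, reduces to a loop like the toy sketch below. Everything here is illustrative: read_new_log_entries and apply_to_target are hypothetical stand-ins for what Replicate automates without source-side agents, not real Attunity APIs.

      import time

      def read_new_log_entries(position):
          """Hypothetical: return redo/archive log records created after `position`."""
          return [], position  # placeholder implementation

      def apply_to_target(changes):
          """Hypothetical: apply the ordered change records to the target platform."""
          for change in changes:
              print("applying", change)

      position = 0
      while True:
          # Tail the transaction log rather than querying the source tables,
          # so the production system carries no extra load
          changes, position = read_new_log_entries(position)
          apply_to_target(changes)
          time.sleep(1)  # in practice this runs continuously, in memory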
  14. So its highlights are that it is EASY. It's written for DBAs, not developers, so no scripting is involved. There's no footprint on your production systems, so nothing to maintain. Imagine having to maintain thousands of agents on all your sources; how much time and cost would that involve? It supports over 45 different endpoints, including SAP, Kafka and all the versions of Hadoop. And it's fast, secure (using AES-256 encryption) and scalable, up to thousands of applications and sources.
  15. From legacy systems to the major data warehouses, the cloud and more
  16. But let’s look more deeply into SAP. It’s not your typical database system is it. It brings more challenges than most with its pooled and clustered tables, its stored blobs and clobs that have no meaning at the database layer. It only means anything if you talk at the ABAP level of the SAP structure to understand what the data is. So accessing and unpacking that data is critical. Replicate does a query based read initially then takes the data from the transactions logs to stream changes in real time. The Flexibility is the choice you have. You can merge that transactional data with all your other sources to find new insights in the cloud, Hadoop and other external targets. And you don’t have to be dependent on SAP modules. The data in BW may already have gone through the ETL process, so you might want data straight from the operational system, your CRM say, and stream that into a Kafka platform or directly into Hadoop. Then you can build out your analytics. You can then move it all back into HANA if necessary. That said, we do also work with SAP to enable users to move external data to HANA. And its because of this we are Gold Certified and you won’t face a hefty legal bill by using an external tool. So you’re not locked into the SAP world and dependent on BW only. It’s because Attunity is seen as being technology agnostic, that they work with all the major software providers, that Gartner lists them as the leading independent vendor for data integration. Attunity decodes SAP by using a subset of its SAP certified add-on product called Gold Client, a tool that has been part of the SAP ecosphere for 20 years. It creates QA and test / dev environments so that you can replicate between SAP instances and its this tool that gives us the ability to demystify the data. And its EASY. The whole process is automated, you don’t even need to know how to code in ABAP. Imagine that, take the whole mundane ETL process, that’s fraught with fat fingers and human error, that takes time and eats into your budget, and replace it with an end-to-end solution that is up and working in as little as 2 hours! That development budget can be put to better use finding insights in the analytical layer. And you’d have happier staff. And maybe a CFO Attunity can do this because it’s all we do. Attunity focuses only on making data available – whatever the system - and after 20 years they’ve become pretty good at it. Its something I tell my kids to do. Focus and Practice. If only I had when I was younger. I could have been a contender
  17. So Replicate for SAP moves application data, in bulk or in real time, for big data analytics: all your documents, transactions and business data, from all core and industry-specific SAP modules. The topics, your table structures and the schema definition are all automated and predefined, so you can integrate with all the major targets (databases, data warehouses and Hadoop, on premises or in the cloud), move external data into SAP HANA, and feed streaming services like Apache Kafka and others such as Event Hubs or MapR Streams. Why is this important? Because Attunity's CDC engine is a natural fit. If you plan on building applications and business processes out of Kafka and using it as the core ingest system for multiple business units, Attunity streams the data directly into it, all automated, so that it is just another endpoint. It just works.
  18. And these are the use cases we are seeing globally: building data lakes within Hadoop and using streaming tools like Kafka; building analytics platforms in the cloud; and setting up operational data stores outside of SAP to merge data from other ERP sources, so that corporate financial reporting and BI teams have access to all ERP data in one place, in near real time. More value is generated by delivering real-time insights and analytics that combine various data sets with the financials, and new insights can be discovered by bringing additional data sets to the data lake.
  19. You can control it all via Replicate's web-based interface. It's point-and-click, it alerts you to issues, and it makes it easy to configure and manage your replication tasks. And if, like the car builder I mentioned earlier, you have hundreds of tasks across multiple data centres running at the same time …
  20. … then the Enterprise Manager module that sits on top of it lets you centralize design, management and control, all from one pane of glass. It requires only network connections to the Replicate servers to automatically discover the environment, and does not require any additional agents to be installed on the database servers.
  21. Traditionally the Replicate product looks at the database level, but with SAP you need to talk directly to the ABAP level. You want the business objects, so you can see at the SAP layer what it is you want to replicate. Then we decode those pool and cluster tables to give you exactly what you want in the target. So the business object level is being met, but in the target you get valuable information (product ID, customer, discount, price and so on) rather than the structure that, at the database level, is just a blob with little meaning if you transferred it straight to the target.
  22. Not all CDC engines are the same. Some systems use flags and triggers and need agents on the source, all of which puts a lot of pressure on those production systems, which can impact them negatively and cause instability. Attunity CDC has no agents to install; it is log-based only, and the most optimized way to unlock your production systems and make that data available to the business.
  23. So the enabler I have to stress is this: Attunity is SAP certified. You are not breaking any of your SAP license agreements. Attunity is seen as a good partner of SAP, as their products help to move data around and into the SAP platforms. The boffins at SAP are not daft; they know they can't turn their back on things like Hadoop and Kafka. They need to show their value with HANA and real-time engines whilst understanding the benefits of the Hadoop platforms around that. So we help the flow of moving data to where it needs to be AS WELL AS moving that data to the SAP systems.
  24. So here is the slightly altered architecture when it comes to SAP. Again, there are no agents at the database level, but we do need the Gold Client subset add-on within SAP to talk to the ABAP engine, so that we can understand the data. The Replicate server makes API calls to the Gold Client add-on to connect to the SAP system, as well as connecting at the database layer. Replicate reduces the impact on the source by only making RFC calls when the data cannot be read directly at the database layer. From then on, Replicate works as normal, so that you get meaningful data on the target side. The data has the structure around it, so you can build analytics quickly and efficiently rather than having to go back and unlock what all that data means.
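  For a feel of what an RFC call into SAP looks like at the ABAP layer, here is a minimal sketch using the open-source pyrfc library and SAP's standard RFC_READ_TABLE function. The connection details are placeholders, and this is not the Gold Client interface Replicate itself uses; it just shows the kind of round trip involved.

      from pyrfc import Connection  # pip install pyrfc (requires the SAP NW RFC SDK)

      # Placeholder connection details for an SAP application server
      conn = Connection(
          ashost="sap-host", sysnr="00", client="100",
          user="rfc_user", passwd="secret",
      )

      # Read a few rows from the sales order header table VBAK via a standard RFC
      result = conn.call(
          "RFC_READ_TABLE",
          QUERY_TABLE="VBAK",
          DELIMITER="|",
          ROWCOUNT=5,
      )
      for row in result["DATA"]:
          print(row["WA"])  # each row is returned as a single delimited string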
  25. Replicate for SAP supports ECC 6, but it can also support older versions. Anything before 4.6C and we have a challenge on our hands but hopefully we won’t be seeing too many of those cases.
  26. Because Gold Client has been playing in the SAP space for a long time, Attunity has over 150 customers using the product for moving data within the SAP environment, so we do have insight into specific customisations for the core modules within SAP. We don't want to take data from BW, because that data has probably already gone through its clean-up process, and that would be like taking data out of the data warehouse. We want to move the operational data and merge it with other sources on newer, external platforms for even more insight.
  27. So the KEY values, then, for opening flexible access to real-time SAP data are: better decision support with real-time BI analysis of SAP data; agility and flexibility to use SAP data in new and valuable ways; lower IT costs and reduced IT/dev dependency for access to SAP data; and faster time to value by realizing new analytics value from SAP data.
  28. So why Hortonworks in particular? This is an older slide, as they've since added cloud to the hexagon, but it's the HDP and HDF layer that is so important: the actionable insights, both historical and perishable, that make the Hortonworks and Attunity partnership so important. It's how Attunity streams data in real time so you can get more than just historical insight; you get the perishable insight as well. If you just want to know what you've sold this year compared to last year, great, pop it into HDP. That's something you can do with BW, SQL Server, Teradata, etc. But what if you have a piece of equipment that is found in all your stores, depots and warehouses, perhaps globally, and a fault has been detected? You can pick that up in real time, and because you have a unique code or part number you can run predictive analytics across it and alert the whole business to the problem. You can flag it immediately, so that all parts of your business know about this perishable insight and can do something about it before it becomes a huge issue. So with HDF you are driving change to the business using your most valuable data.
  29. So Attunity is an enabling platform to bring all your critical data to the newer platforms available today. Attunity Replicate supports all versions of Hadoop, both as a source and a target, and works with all major software vendors
  30. And before I move on, I just wanted to mention a recently released product that works specifically with Hortonworks 2.6 and utilises its ACID MERGE operation. Attunity Compose, the data warehouse automation tool I mentioned earlier, is now available for Hive. So for the first time you can easily set up your historical data store, as well as an operational data store, with no need to code. This means that you've got the historical data, including various views, but now you also have an operational store that you can draw real-time insight from without having to go back to the source. It really is the fastest way to build an analytics-ready data lake.
  31. So to conclude, Attunity is trusted by almost half of the Fortune 100 companies, many of whom have been customers for years ….
  32. And Attunity is trusted by all the major software vendors. Microsoft, Oracle and IBM OEM Attunity technology within their own products; SQL Server 2012 onwards, for instance, uses Attunity technology. And Attunity has close partnerships with data warehouse vendors like Teradata and HP, as well as all the Hadoop distributions. So we may be small in comparison, but we carry some clout.
  33. So that’s it, thanks again for your time. If you’d like to know more come swing by the stand, or better still let’s go for a beer, god knows I need one. Oh and I am available for children’s parties.