Agenda
9:00am - ONLINE CONFERENCE
• Introductory keynote
• How can banking and financial institutions leverage Denodo Data Virtualization?
• Success stories in the financial sector
10:00am – ONLINE HANDS-ON SESSION
• Use cases & Successful implementations
• Performance and optimization
• Governance and security
• Live demo with Denodo
• Next steps
• Q&A
11:00am - CONCLUSIONS & END OF SESSION
ONLINE CONFERENCE
❖ 26 brands
❖ 172M turnover
❖ 600 client references
❖ 1,575 experts around the world
AAA BeNeLux
❖ 5 brands
❖ 20M turnover
❖ 60 client references
❖ 180 experts
The AAA ecosystem federates consulting companies and expertise across the world.
Dynafin / Satisco Data Competence Center history
A long-lasting story of successful achievements
A changing World
The financial sector faces some major challenges,
impacting all domains and levels of companies:
STRATEGY · COMMERCIAL · TECHNOLOGY · FINANCE · REGULATION · ORGANISATION · TALENT · PROCESSES · DATA MOBILITY · DATA SECURITY · DATA MANAGEMENT · DATA MONITORING
Dynafin / Satisco Data Competence Center history
Banking IT Integration
services
Partner in Financial
services
❖ Impact analyses
❖ Data centric strategy
❖ Data modelling
❖ Data governance strategy
❖ Change management
❖ Project management
❖ Testing
❖ Architecture design
❖ Technical analyses
❖ Integration strategy review
❖ Integration (re)development
❖ IT Testing
❖ DevOps
❖ Data Virtualization strategy
❖ Data Virtualization modelling
❖ Solution implementation follow-up
❖ Data governance strategy follow-up
❖ Data security strategies
❖ Change Management
❖ Denodo expertise
❖ Denodo Product Evolution
❖ Denodo Product Support
❖ Denodo Product Training
❖ Denodo User Meetings
❖ Denodo Use Cases Sharing
Digital Influencer and
data enabler
Solution provider
A multidisciplinary team to guide you in your business transformations
DATA-CENTRIC OFFER CREATION
Dynafin / Satisco Data Competence Center history
HOW CAN BANKING AND FINANCIAL INSTITUTIONS
LEVERAGE DENODO DATA VIRTUALIZATION?
WEBINAR 29 OCTOBER
Speakers
Aly Wane Diene
Senior Solution
Consultant
Alain Kunnen
Chairman &
Associate
Vincent Boucheron
Data Influencer &
Managing Partner
A BIT OF CONTEXT…
• More and more data to deal with (IoT or not)
• More and more heterogeneous types of data (cf. the rise of unstructured data)
• Mobility (data at your fingertips wherever you are)
• Security & regulations & sustainability & reputation
• Real-time closure (versus the day after)
• Move to cloud(s) in hybrid mode (or not) to benefit from scalability and agility
• Digital tools booming (e.g., AI, RPA, smart sensors, 5G, etc.)
• Move to modular applications (cf. containers)
• A stormy IT ecosystem
MAJOR MARKET TRENDS
Business Trends · Technology Trends
BE PREPARED TO SURF
ON THE NEW BIG WAVE
© DIAMS. All rights reserved.
DATA AND PERSONAS OF EXISTING ECOSYSTEMS
Sales · HR · Apps/API · Executive · Marketing · Data Science · AI/ML
SIMPLIFIED PICTURE
90% OF DEMANDS REQUIRE NEAR REAL TIME
DATA SECURITY AND GOVERNANCE?
75% OF STORED DATA IS NEVER USED
2020-2050 IT challenges are no longer a matter of
user digital awareness or technical limitations…
It's a matter of data federation
MAJOR MARKET TRENDS
DATA VIRTUALIZATION 2.0
Timeline
1991-1992: First version of Skipper-BO
1992: First version of Crystal Reports embedding a DV layer
1994: Microsoft releases Access 2 with a DV layer (links) and the FoxPro query optimizer
2003: BO purchases Crystal Decisions
2008: SAP purchases Business Objects
2019: e.g., MS SQL Server includes a DV layer to mix SQL and Hortonworks; Thomas Siebel launches the C3.ai suite, leveraging an embedded DV layer to analyze data…
• "Data Virtualization securely connects and federates any type of data, wherever it is located."
• The need to federate data is not new… and data virtualization is not new either…
• Data virtualization is often hidden inside software packages
DATA VIRTUALIZATION
DEFINITION
Data Virtualization
▪ Provides easy access to disparate data (sources, formats)
▪ Hides the technical aspects of storage (location, format)
▪ Without physically moving the data
Comparison
"MS Access exposes the data but lacks BO Universe functionality (and vice versa)"
"Denodo is the new BO version we would have had if BO had not been acquired by SAP"
DATA VIRTUALIZATION – HOW IT WORKS?
CONNECT, COMBINE & CONSUME
Sales · HR · Executive · Marketing · Apps/API · Data Science · AI/ML
Connect → Combine → Consume
COMBINE & INTEGRATE INTO BUSINESS
DATA VIEWS
Core modules · Performance · Data Governance · Security
• The new generation of data virtualization tools disrupts thanks to extra functionalities that unleash new business and technical use cases:
• Core modules (Connect-Combine-Consume)
• Performance (query plan optimization, load balancing, cloud distribution and delocalization)
• Data governance modules (data catalog, physical/logical model synchronization, data dictionaries / synonym interface)
• Security features (auditability, lineage, protection of underlying data sources)
Not all data virtualization products cover all of these axes…
DATA VIRTUALIZATION
DATA VIRTUALIZATION 2.0
DATA VIRTUALIZATION 2.0 -> THE PROMISE
DV REQUIRES GOVERNANCE AND A MINIMUM LEVEL OF DATA MATURITY
• Data virtualization is not a toy for IT geeks!
• It's a business tool to support and secure operations involving client or company data.
• Remember that many companies expected BO Universes to magically solve quality issues despite low governance…
• The data virtualization tool should mirror the data governance organization and deeply involve its actors (e.g., data owners, data stewards, data office)
DATA VIRTUALIZATION : THE PROMISE
Business, the CDO, the CIO and the CISO must team up to deliver this promise:
CDO Team · CISO Team · IT & Data Architects
• The capacity of data virtualization to protect its underlying sources in terms of security and performance enables a molecular architecture where internal and external referentials and golden sources are accessed directly and efficiently by operations and back offices.
• Such a capacity makes it possible to:
• Rapidly swap modules (e.g., AI, containers) and infrastructure (e.g., AWS, Azure, on premises)
• Reduce reconciliation needs
• Offload physical data transport
• Federate ecosystems and foster marketplaces
• Simplify the IS and make it more understandable for its users
DATA VIRTUALIZATION : THE PROMISE
THE SUPPORT OF A NEW TYPE OF BUSINESS INTERACTION
[Diagram: BI tools, AI, RPA, etc. accessing internal referentials and external data through a Data Virtualization product]
DATA VIRTUALIZATION 2.0
THE SUPPORT OF A NEW TYPE OF INFRASTRUCTURE
[Diagram: a Data Virtualization hub federating DV layer services across a US zone, an EMEA zone, Azure ML and on-premises systems]
SUCCESS STORIES IN THE FINANCIAL SECTOR
WEBINAR 29 OCTOBER
DATA VIRTUALISATION: FOR WHOM…
DATA VIRTUALISATION : THE DATA TOOL EVERYBODY LOVES
DATA VIRTUALISATION FEDERATES DATA AND PEOPLE
Several data virtualization functionalities will appeal to everyone who
manages data. All users will, for example, love its ability to connect to the
golden source in a secure, non-intrusive way and to cross-reference data
in an optimized way. At the implementation level, most players appreciate
the speed of the solution's implementation, its affordability and its scalability.
Data virtualization federates not only data but also business lines, so
almost all players in a company are likely to benefit from it, from
"operational business" (process and quality) to data scientists, CIOs,
CDOs, architects and security players.
Persona 1 : CIO
CIOs will appreciate data virtualization for the following use cases:
1. Support for business transformation (client 360, mergers, digital transformation, IoT, etc.), providing data scientists with sandboxes of raw, fresh data, quickly and securely
2. Controlled transition of the IS (migration to the cloud or reengineering of the IS)
3. Cost optimization (e.g., storage)
4. Protection of source systems to guarantee or refine a level of service
5. Information on data usage that enables a pay-per-data-usage system for the IS, or the identification of obsolete systems
Persona 2 : CDO
The CDO will appreciate:
1. The ability to quickly map golden sources, golden copies and related processes.
2. The ability to easily and cost-effectively reconcile the physical model of business objects with the data dictionary.
3. The ability to federate business around common concepts by easily creating an inter-application synonym dictionary.
4. The possibility of easily upgrading to a specialized dictionary.
5. The DV's ability to support data governance by providing a simple means of comprehensive quality control over processes will also certainly appeal.
Persona 3 : CISO
Data virtualisation will allow the CISO to:
1. Improve their ability to detect real anomalies by easily cross-referencing data directly on the golden sources (versus copied data).
2. Move towards a more predictive mode thanks to its near-real-time connection to the sources.
3. Build even more relevant alerts by cross-referencing structured data (such as firewall intrusions) and/or unstructured data (e.g., building protection cameras).
4. Limit the maintenance costs of the alert system.
5. Go beyond the limits of human control by coupling these alerts with machine learning or IAM systems to avoid false alarms.
6. Evolve their protection system towards a model that is more data-centric than application-centric.
Persona 4 : ARCHITECT
Data virtualisation will allow the architect to:
1. Better understand data flows and golden sources, and simplify the IS
2. Bring flexibility to the IS by fostering web service capacities for intraday exchanges
3. Propose a pragmatic approach to dealing with legacy system obsolescence
4. Propose a progressive approach during transformation phases to better control transitions
5. Satisfy business requirements in terms of speed and scalability
6. Comply with high standards in terms of security and IT practices
Persona 5 : DATA SCIENTIST
Data scientists will love data virtualization for:
1. Its capacity to expose and query raw data and cleansed data, and more generally any type of data
2. Its capacity to rapidly integrate new sources and/or new fields (no need to wait for IT)
3. Its flexibility and compatibility with almost any reporting or data science tool of today (and probably tomorrow)
4. The possibility of working on unfiltered production datasets (with the right credentials)
5. The capacity to use the scheduler, the audit trail and Denodo to align data scientist sandboxes and rapidly explain deviations between individual results
6. The coherence between sandbox and production, and the ease of replicating analytics across historical versions of a dataset
DATA VIRTUALISATION FOR WHAT AND WHEN ….
DATA VIRTUALISATION FOR WHAT
FROM A BUSINESS PERSPECTIVE
CORPORATE USE CASES: digital initiatives · holistic views (product, client) · operations optimisation · data governance · business transformations · cost optimisation
IT USE CASES: architecture simplification · IT migrations & transformations · cost optimisation
COMPLIANCE & SECURITY USE CASES: holistic views (risk) · data protection · coordination
HELPING CXOs BRING VALUE AND SUPPORT
Tactical approach, leveraging:
• Database protection
• Migration
Reactive approach, leveraging:
• Regulation
• Security threats
Strategic visionary approach, leveraging:
• Business maturity & objectives plan
DATA VIRTUALISATION: FOR WHEN
BE OPPORTUNISTIC: PICK YOUR BATTLES
DATA VIRTUALISATION SUCCESS STORIES….
How Financial Institutions Are Leveraging Data Virtualization to Overcome their Business Challenges (EMEA)
DATA MARKETPLACE: TRANSFORM YOUR IS USER EXPERIENCE
Use Case: LEVERAGE DV'S CAPACITY TO PROTECT THE UNDERLYING DATABASES, ALLOWING USERS AND APPLICATIONS TO ACCESS DATA MORE EASILY AND SECURELY.
[Diagram: sales, staff, claims/incident and external referentials, data marts, firewall logs, sensors, video/audio, market data and chatbots federated through a data virtualization layer with data catalog, data dictionary, IAM and BI/security/compliance consumers]
Use Case: Build customised views associated with each group of user needs
Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way
Step 2: Build queries and views associated with each user or application need
Step 3: Build the data catalog and data dictionary
Step 4a: Monitor and control usage
Step 4b: Introduce a pay-per-use principle for maintenance costs
KYC & Client 360 VIEWS: EXTEND YOUR VISION OR CONTROLS IN A RAPID AND FLEXIBLE MANNER
Use Case: LEVERAGE DV'S CAPACITY TO PROTECT THE UNDERLYING DATABASES, ALLOWING USERS AND APPLICATIONS TO ACCESS DATA MORE EASILY AND SECURELY.
[Diagram: client referential, cloud and external sources, forbidden-clients lists, website content, PDF/video/audio files, market data and KYC/360 chatbots federated through a data virtualization layer with data catalog, data dictionary, IAM, BI/compliance and data scientist consumers]
Use Case: Build customised views associated with each group of user needs
Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way
Step 2: Build queries and views associated with KYC or 360 requirements
Step 3: Build the data catalog and data dictionary
Step 4a: Monitor and control usage
Step 4b: Rapidly add any type of source or field
Step 4c: Create a secured sandbox for data scientists
Use Case: LEVERAGE DV'S MULTI-SOURCE CAPACITIES TO MOVE DATA AND BUSINESS MORE EASILY AND SECURELY.
[Diagram: applications A, B and C connected through a data virtualization layer with data catalog and data dictionary & synonyms]
Use Case: Build customised views associated with each group of user needs
Step 1: Implement the data virtualization layer between legacy and new consumers/sources in a direct, non-intrusive way
Step 1b: Build the data dictionary and data synonyms
Step 2: Build queries to easily verify data integration into the new systems by ETL and integration layers
Step 3: Complement and offload ETL with web service capacity on the new system
Step 4: Push data back to slave systems during the transition period
Step 5: Rapidly add any type of source or field to fix unforeseen ETL delivery issues (e.g., a new source or field to be added urgently)
Step 6: Decommission GUIs A and B, but use DV to expose applications A and B's archived data
Merger and Acquisition / Business Transformation: allow a secure and progressive move
[Diagram: data virtualization alongside ETL, with migration checks and parallel runs, near-real-time web service capacity, master-slave feedback post-migration, urgent temporary KAD fixes and archived-data exposure to BI]
UNIFIED DATA LAYER: USE DV TO REGAIN CONTROL OF A SILOED IS
Use Case: Leverage regulatory pressure, obsolescence or cloud migrations to put data virtualization in place and regain control of data exposure
Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way
Step 2: Analyze queries, and identify golden copies and inappropriate queries
Step 3: Build an action plan based on the "hall of horrors" areas
Step 4: Kill bad golden copies and easily redirect users to golden sources, or replicate golden-source data controls on golden copies to monitor and communicate on data quality issues
Step 5: Simplify the IS architecture by allowing new applications to access golden-source tables directly and securely, avoiding integration tables and reducing ETL usage and duplicate data storage
Step 6: Analyze data usage and move security from application-centricity to data-centricity
Step 7: Use the DV data catalog to help build data dictionaries and data synonyms
Step 8: Communicate on successes per data source, business line or data object
Focus on data sources and data usage
E.g., regain control of a siloed IS
DATA GOVERNANCE : USE DV TO DISCOVER AND SOLVE DATA ISSUES
Use Case: Use DV to build a simple but efficient data flow monitoring tool to discover and solve process issues
Step 1: Choose a process and identify its important phases/steps
Step 2: Associate each step with control points (in and out) tied to expected values and metadata
Step 3: Identify the data sources associated with these values and metadata, and plug data virtualization connectors into these sources
Step 4: Build queries retrieving data and metadata on a regular basis
Step 5: Trigger alerts based on fixed acceptable limits or on dynamic values derived from historical data logs
Step 6: Use BI tools "plugged" into DV queries to build efficient dashboards showing values, and trigger alerts
Step 7: Analyze alerts to take corrective actions
Step 8: Use AI to help the business take preventive actions according to context (e.g., shifting resources to secure deadlines or cut-off times)
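Step 5's two alerting modes (fixed limits versus dynamic values from history) can be sketched as a hypothetical control-point check; the mean ± 3σ dynamic bound is an assumption for illustration, not a rule from the slides:

```python
import statistics

# Sketch of a control-point check: alert on a fixed acceptable limit and/or on
# a dynamic bound derived from historical data logs (assumed: mean +/- 3 stdev).

def check_control_point(value, fixed_limit=None, history=None):
    alerts = []
    if fixed_limit is not None and value > fixed_limit:
        alerts.append(f"fixed limit exceeded: {value} > {fixed_limit}")
    if history and len(history) >= 2:
        mean, stdev = statistics.mean(history), statistics.stdev(history)
        if abs(value - mean) > 3 * stdev:
            alerts.append(f"dynamic limit exceeded: {value} deviates from history")
    return alerts

history = [100, 102, 98, 101, 99]   # e.g. daily counts observed at this step
print(check_control_point(250, fixed_limit=200, history=history))
```

A BI dashboard (step 6) would then simply display the returned alerts per control point.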
Focus on data processes
E.g., implement data flow monitoring
DATA SECURITY : BUILD A DOME TO ENABLE A DATA CENTRIC PROTECTION
Use Case 2: Team up with security teams to support their projects and/or implement affordable data protection measures
Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way
Step 2: Analyze query logs to identify sensitive data usage and the associated users
Step 3a: Create user groups to enable data-centric access control (versus application-centric)
Step 3b: Create security dashboards monitoring data usage and alerts based on DV logs
Step 4a: Connect DV to the data sources and progressively build a DV protection dome over your data, monitoring usage of sensitive data whatever its type and location
Step 4b: Use data lineage and/or ETL logs to extend the DV dome protection to golden copies
Step 4c: Easily trigger alerts if applications or users try to extend their data consumption perimeter
Step 4d: Include application-built APIs, or APIs created by DV, in the DV dome protection
SUCH A DOME CAN BE BUILT ON A SUBSET OF THE IS
(e.g., DWHs and/or data lakes)
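The perimeter alerts of step 4c can be sketched from DV query logs; the log format below, (user, table) pairs, is a hypothetical simplification for illustration:

```python
from collections import defaultdict

# Sketch: learn each user's data perimeter from past query logs, then alert
# when a user or application queries a table outside that known perimeter.

def build_perimeters(query_log):
    perimeters = defaultdict(set)
    for user, table in query_log:
        perimeters[user].add(table)      # tables this consumer normally touches
    return perimeters

def detect_expansion(perimeters, user, table):
    if table not in perimeters.get(user, set()):
        return f"ALERT: {user} accessed {table} outside known perimeter"
    return None                          # access is within the usual perimeter

log = [("claims_app", "claims"), ("claims_app", "clients"), ("bi_tool", "sales")]
perims = build_perimeters(log)
print(detect_expansion(perims, "bi_tool", "hr_salaries"))
```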
Offer a holistic view of data to your
security/compliance/risk officers.
Remember that in most organisations, ETL does not carry data
security metadata, leading to a "Kerviel risk"…
DATA MIGRATION & HYBRID DATA FABRIC:
SECURE YOUR IS TRANSFORMATION WITH AGILITY TO BENEFIT FROM BEST-OF-BREED TECHNIQUES
Use Case: Take advantage of any type of transformation project (obsolescence, migration to the cloud, merger) to benefit from best-of-breed techniques while rapidly improving data governance and services
Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way; the databases' logical and physical views remain active
Step 2: Run the legacy system and the new system in parallel, and analyse query logs to secure data source performance
Step 3: Kill the legacy views
Step 4: Start building new query views based on joins between data sources
Step 5: Implement web services based on the join queries
Step 6: Implement new data sources (e.g., Hortonworks, clouds)
Step 7: Redirect consumers to the new sources when appropriate, progressively and with no major impact
Step 8: Offload ETL transfers when appropriate and bring more "real-time" services
[Diagram: Oracle intraday DWH, Sybase end-of-day DWH, data marts/views, Hortonworks, Cloudera and multiple clouds federated through a data virtualization layer with data catalog and data dictionary, consumed by BI, AI, RPA and apps]
LOGICAL DATA WAREHOUSE/DATA LAKE & APIFICATION:
SIMPLIFY YOUR IS TO INCREASE FLEXIBILITY AND OPTIMIZE COSTS
Use Case: Leverage IT projects (e.g., migration to the cloud) or business projects (e.g., Client 360) to rethink the data architecture and become data-centric
Step 1: Focus on golden sources and referentials, and expose golden-source data directly to users and applications via the data virtualization platform as a single point of access
Step 2: Progressively move toward a molecular IS where golden sources can be added, modified or replaced more easily, and where data replication and reconciliation are reduced
Step 3: Use DV API features to create APIs on legacy applications and build an API catalog
Gartner Vision of Data Architecture Evolution
SUCCESS STORIES IN THE FINANCIAL SECTOR
WEBINAR 29 OCTOBER
TIME AND MEANS….
DATA VIRTUALISATION: FACTS AND FIGURES
TIME AND MEANS
Denodo infrastructure
• Windows or Linux
• On-premises or cloud-based (AWS and Azure available)
Licence & infrastructure costs
• On premises: depends on the client's infrastructure
• Cloud hardware + client's own licence: depends on the client context
• Cloud-based: from 88 k€ to 188 k€ according to context
Implementation time
• On premises: 3 to 6 months, based on BNPP implementations
• Cloud-based: less than 6 months on average
DATA VIRTUALISATION: FACTS AND FIGURES
LOW RISK AND MAXIMUM RETURN
Several ways to adopt data virtualization:
1. Denodo Express: free, but with some limitations
2. Free cloud test drive to discover predefined use cases
3. Free trial cloud licence with a 14-day limit
4. DIAMS starter kits
DIAMS has created a set of "starter kit" propositions enabling its clients to test
data virtualization on a specific scope, for a short period of time and a limited
budget.
These "easy" service offers are intended to be scalable, allowing clients to
leverage first positive results.
Don't hesitate to contact us for more details.
• Data Maturity Assessment - DMM
• Dataflow Monitoring - DFM
• Easy Line of Production Monitoring - LPM
• Easy KPI for Industry - KP2I
• Easy Compliance & Risk Monitoring - ECRM
• Easy Security Alert and Monitoring - SAM
• Easy KPI for Corporate - KPIC
• Easy Transition, Transformation and Migration - T2M
• Easy Robotic Process Automation Implementation - RPAI
• Easy Artificial Intelligence Implementation - A2I
• Integration & Data Exposure As a Service - IDEAS
DIAMS « Starter Kits » offers
TRY IT
&
LOVE IT
TAKE AWAY
• We have all the technical tools needed to succeed
• Do not forget data governance and change management
• DV federates data, so try to co-finance your projects with other CxOs
• Pick the right battles and data; dream big, deliver fast
• You are not alone: DIAMS & Denodo can help you…
Q&A
THANK YOU
Vincent BOUCHERON
+32 496-08-77-33
Vincent.Boucheron@diams-it.com
www.diams-it.com
ONLINE HANDS-ON SESSION
Use cases and successful
implementations
Data Virtualization use cases
Sales · HR · Executive · Marketing · Apps/API · Data Science · AI/ML
Financial Services Customers
Case Study : Unified view into regulatory risk
Business Need
• Need a controlled data environment to support tougher
regulatory requirements.
• Information does not tie across all data silos.
• Need a smart data governance initiative to avoid garbage-
in-garbage-out problem.
Benefits
• Enables faster time-to-market and incremental information
delivery.
• Helps CIT realize value from data: all data is successfully
accessed through a provisioning point instead of through legacy
point-to-point integration.
• Minimize data replication and proliferation by eliminating
data redundancy.
Solution
Case Study : Logical Data Warehouse
Business Need
• Accelerate business operations in loans, deposits, and
other departments through use of self-service reports and
dashboards.
• Establish a central information delivery platform to easily
add new data sources from acquired companies.
• The prior cloud-based data warehouse was too inflexible to
accommodate new data sources.
Benefits
• Improved efficiency through self-service for business users
in loans, deposits, fraud, credit, and risk departments.
• Reporting turnaround time improved from 2-3 days to 2
hours.
• Business operations such as loan processing are handled in
real time.
Solution
Performance and optimization
Performance and optimization
"What about performance?" is usually the first question we get about
Data Virtualization.
Many factors affect performance:
• Data sources, network latency, complexity of the query and its
processing, consumer ingestion rates, etc.
The overall performance drivers:
• Minimize data moved through the network
• Maximize 'local' data processing
'Move the processing to the data'
Query Optimization Pipeline
Parts of the optimization pipeline:
Query Parsing
• Retrieves execution capabilities and restrictions for the views involved in the query
Static Optimizer
• Query delegation
• SQL rewriting rules (removal of redundant filters, tree pruning, join reordering, transformation push-up, star-schema rewritings, etc.)
• Data movement query plans
Dynamic Optimizer
• Picks optimal JOIN methods and orders based on data distribution statistics, indexes, transfer rates, etc.
Execution
• Creates the calls to the underlying systems in their corresponding protocols and dialects (SQL, MDX, WS calls, etc.)
Performance Optimization Techniques
Query Plans
SQL is a declarative language:
• Queries specify what users want, not how to get the data.
• There are potentially many ways of executing a query.
A query plan specifies a set of steps for executing a query or subquery.
The optimizer has the goal of selecting the best plan.
• First generate multiple query plans for each query.
• Estimate the cost of each plan.
• Select the plan with the minimum cost.
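As a rough illustration of generate-estimate-select (the cost weights and plan names below are invented for the sketch, not Denodo's actual cost model):

```python
from dataclasses import dataclass

# Hypothetical sketch of cost-based plan selection: each candidate plan
# carries estimated work; the optimizer keeps the cheapest one.

@dataclass
class Plan:
    description: str
    rows_moved: int      # estimated rows transferred over the network
    local_ops: int       # estimated rows processed locally in the DV layer

def estimate_cost(plan, network_weight=10.0, local_weight=1.0):
    # Network transfer is weighted as far more expensive than local processing.
    return plan.rows_moved * network_weight + plan.local_ops * local_weight

def choose_plan(plans):
    return min(plans, key=estimate_cost)

candidates = [
    Plan("federate raw rows, aggregate in DV layer", rows_moved=68_000_000, local_ops=68_000_000),
    Plan("push aggregation down, merge partial results", rows_moved=2_000_000, local_ops=2_000_000),
]
best = choose_plan(candidates)
print(best.description)   # the push-down plan wins on estimated cost
```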
Performance Optimization Techniques
Static vs. Dynamic Optimization
Static optimization:
• Based on SQL transformations.
• Rewrites the query in a more optimal way.
• Removes redundancies, inactive sub-trees, etc.
• Push-down delegation:
• Optimizes the query by pushing down sub-trees to the underlying data sources.
Dynamic optimization:
• Uses statistics and indices to estimate the costs of alternative execution plans.
• Selects join methods and join ordering.
Performance Optimization Techniques
Query Delegation
Objective: Push the processing to the data.
• Utilize power and optimizations of underlying data sources.
• Especially relational databases and data warehouses.
• Minimize expensive data movement.
Delegation mechanisms:
• Vendor specific SQL dialect.
• Function delegation.
• Configurable by data source.
• Delegate SQL operations.
• e.g. Join, Union, Group By, Order By, etc.
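The per-source configurability above can be pictured as a capability lookup; the capability sets below are hypothetical examples, not real Denodo configuration:

```python
# Sketch of the delegation decision: operations a source can execute itself
# are delegated; the rest run in the DV engine. Capability sets are made up.

SOURCE_CAPABILITIES = {
    "oracle_dwh": {"filter", "join", "union", "group by", "order by"},
    "delimited_file": set(),       # a flat file executes nothing itself
    "rest_api": {"filter"},        # e.g. only simple parameterized filters
}

def split_query(source, requested_ops):
    caps = SOURCE_CAPABILITIES[source]
    delegated = [op for op in requested_ops if op in caps]
    local = [op for op in requested_ops if op not in caps]
    return delegated, local

delegated, local = split_query("delimited_file", ["filter", "group by"])
# nothing can be delegated to a flat file: both operations run in the DV engine
```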
Performance Optimization Techniques
Query Rewriting
Goal: Rewrite query in an optimal way before the query is executed. Typical
optimizations:
• Simplify partitioned unions.
• Remove redundant sub-views.
• Remove unused join branches due to projections.
• Transform outer joins to inner joins.
• Static join reordering to maximize delegation.
• Reordering of operations.
• Full and partial aggregation push-down.
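One of these rules, the removal of redundant filters, can be sketched on a flat predicate list (a toy illustration of the idea, not the real rewriter):

```python
# Toy sketch of one static rewriting rule: drop always-true and duplicate
# predicates from a conjunction before the query is executed.

def remove_redundant_filters(predicates):
    seen, out = set(), []
    for p in predicates:
        if p == "TRUE":      # an always-true predicate contributes nothing
            continue
        if p in seen:        # a duplicate predicate is redundant
            continue
        seen.add(p)
        out.append(p)
    return out

print(remove_redundant_filters(["state = 'CA'", "TRUE", "state = 'CA'", "year >= 2019"]))
# ["state = 'CA'", "year >= 2019"]
```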
Performance Optimization Techniques
Source Constraint Optimization
Denodo Platform optimization has to work across multiple diverse data source
types:
• Not just relational databases.
• Not all data sources have same capabilities.
Recognize and optimize for constraints in underlying data sources:
• e.g. MySQL can be ordered for Merge join… but a delimited file cannot.
Performance Optimization Techniques
Data Movement
Typically used when one dataset is significantly smaller and aggregations are
performed on the joined data:
1. Execute the query in DS1 and fetch its data.
2. Create a temporary table in DS2 and insert the data from step 1.
3. When step 2 is completed, execute the JOIN in DS2 and return the results to the DV layer.
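The three steps can be reproduced with two in-memory SQLite databases standing in for DS1 and DS2 (an illustrative stand-in for the pattern, not Denodo's actual mechanism):

```python
import sqlite3

# DS1 holds the small dataset; DS2 holds the large one and hosts the join.
ds1 = sqlite3.connect(":memory:")
ds2 = sqlite3.connect(":memory:")

ds1.execute("CREATE TABLE customer (id INTEGER, state TEXT)")
ds1.executemany("INSERT INTO customer VALUES (?, ?)", [(1, "CA"), (2, "NY")])

ds2.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
ds2.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (1, 5.0), (2, 7.5)])

# Step 1: execute the query in DS1 and fetch its data.
rows = ds1.execute("SELECT id, state FROM customer").fetchall()

# Step 2: create a temporary table in DS2 and insert the data from step 1.
ds2.execute("CREATE TEMP TABLE tmp_customer (id INTEGER, state TEXT)")
ds2.executemany("INSERT INTO tmp_customer VALUES (?, ?)", rows)

# Step 3: execute the join (and aggregation) inside DS2, close to the big table.
result = ds2.execute("""
    SELECT c.state, SUM(s.amount)
    FROM sales s JOIN tmp_customer c ON s.customer_id = c.id
    GROUP BY c.state ORDER BY c.state
""").fetchall()
print(result)   # [('CA', 15.0), ('NY', 7.5)]
```

Only two customer rows crossed between the "sources"; the three sales rows never left DS2.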
Caching
Real time vs. caching
Sometimes, real-time access & federation are not a good fit:
• Sources are slow (e.g., text files, cloud apps like Salesforce.com)
• A lot of data processing is needed (e.g., complex combinations, transformations, matching, cleansing, etc.)
• Access is limited, or the impact on the sources must be mitigated
For these scenarios, Denodo can replicate just the relevant data in the cache.
Caching
Overview
Based on an external relational database
• Traditional: Oracle, SQL Server, DB2, MySQL
• MPP: Teradata, Netezza, Vertica
• Cloud-based: Amazon Redshift, Snowflake
• In-memory storage: Oracle TimesTen, SAP HANA
Works at view level
• Allows hybrid access (real-time / cached) of an execution tree
Cache Control
• Manually – user initiated at any time
• Time based - using the TTL or the Denodo Scheduler
• Event based - e.g. using JMS messages triggered in the DB
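The time-based mode can be sketched as a small TTL cache (a minimal sketch of the idea, not Denodo's cache implementation; the refresh happens lazily on read here as an assumed simplification):

```python
import time

# Minimal TTL cache: serve the cached value until it is older than the TTL,
# then re-read the underlying source; invalidate() models the manual mode.

class TTLCache:
    def __init__(self, ttl_seconds, loader):
        self.ttl = ttl_seconds
        self.loader = loader                  # function that re-reads the source
        self._value, self._loaded_at = None, None

    def get(self, now=None):
        now = time.monotonic() if now is None else now
        if self._loaded_at is None or now - self._loaded_at >= self.ttl:
            self._value = self.loader()       # refresh from the underlying source
            self._loaded_at = now
        return self._value

    def invalidate(self):                     # manual, user-initiated refresh
        self._loaded_at = None

calls = []
cache = TTLCache(ttl_seconds=60, loader=lambda: calls.append(1) or len(calls))
cache.get(now=0)
cache.get(now=30)    # within the TTL: served from cache, source not touched
cache.get(now=61)    # TTL expired: source re-read
```

An event-based variant would call `invalidate()` from, e.g., a JMS message handler instead of relying on elapsed time.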
Caching
Caching options
Denodo offers two different types of cache
• Partial:
• Query-by-query cache
• Useful for caching only the most commonly requested data
• More adequate to represent the capabilities of non-relational sources, like web services or APIs
with input parameters
• Full:
• Similar to the concept of materialized view
• Incrementally updateable at row level to avoid unnecessary full refresh loads
• Offers full push-down capabilities to the source, including group by and join operations
• Supports hybrid incremental queries for SaaS data sources (next slide)
Caching
Incremental Queries
Merge cached data and fresh data to provide fully up-to-date results with minimum
latency:
1. Salesforce 'Leads' data is cached in Denodo at 1:00 AM.
2. A query needing Leads data arrives at 11:00 AM.
3. Only the leads changed or added since 1:00 AM are retrieved through the WAN.
4. The response is up to date, but the query is much faster.
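The merge behind steps 3-4 can be sketched as follows (an assumed simplification keyed on a primary key; `fetch_changed_since` is a hypothetical delta-query function, not a real API):

```python
# Sketch of a hybrid incremental query: start from the cached rows and overlay
# only the rows changed or added since the cache load time.

def incremental_query(cached_rows, fetch_changed_since, cached_at):
    merged = {row["id"]: row for row in cached_rows}
    for row in fetch_changed_since(cached_at):   # only the deltas cross the WAN
        merged[row["id"]] = row                  # updates overwrite, inserts add
    return sorted(merged.values(), key=lambda r: r["id"])

cache = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
deltas = lambda since: [{"id": 2, "status": "won"}, {"id": 3, "status": "open"}]
print(incremental_query(cache, deltas, "01:00"))
# id 1 unchanged from cache, id 2 updated, id 3 newly added
```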
MPP Query Acceleration
• Denodo 7.0 supports using an MPP cluster to accelerate queries
• Hive, Spark, Impala, Presto
• Operations that can be parallelized can be moved to the MPP cluster
• e.g., GROUP BY aggregations
• Data is copied to the cluster and the operation is delegated for processing
• Data is copied as Parquet files
• Results are returned to the Denodo Platform
• Does not require any special commands from the user
Example Scenario
Sales by state over the last four years.
Scenario:
• Current data (last 12 months) is in the EDW: Current Sales, 68 million rows
• Historical data is offloaded to a Hadoop cluster for cheaper storage: Historical Sales, 220 million rows
• Customer master data is in the RDBMS: Customer, 2 million rows
Very large data volumes: the sales tables have hundreds of millions of rows.
Logical plan: group by State over the join of Customer with the union of Current and Historical Sales.
MPP Query Acceleration
The optimizations at work on the scenario (Current Sales: 68M rows, Historical Sales: 220M rows, Customer: 2M rows):
1. Partial aggregation push-down: maximizes source processing and dramatically reduces network traffic (grouping by customer id yields ~2M partial rows of sales per customer).
2. Integration with the Cost-Based Optimizer: based on data volume estimates and the cost of these particular operations, the CBO can decide to move all or part of the execution tree to the MPP.
3. On-demand data transfer: Denodo automatically generates and uploads Parquet files.
4. Integration with local data: the engine detects when data is cached or already comes from a local table in the MPP.
5. Fast parallel execution: support for Spark, Presto and Impala for fast analytical processing on inexpensive Hadoop-based solutions.

System     Execution time   Optimization techniques
Others     ~ 19 min         Simple federation
No MPP     43 sec           Aggregation push-down
With MPP   26 sec           Aggregation push-down + MPP integration (Impala, 4 nodes)
72
Example: Denodo 7.0 MPP Acceleration
‘Non-optimized’ Execution: 19 mins (1,142 seconds)
Optimized Execution with MPP Acceleration: 26.6 secs (26,640 ms)
Governance and security
74
Governance
• Governance is a very broad topic
• More than just data access and delivery
• Data Virtualization can play an important role in overall data governance
• But it’s not the whole story by itself
• End-to-end governance, metadata management, data lineage, etc. require other
tools
• Data Virtualization has “1 degree of visibility”
75
Enterprise Governance
Data Lineage
• Find source of ‘truth’ – top down – shows where data comes from and/or how it is derived.
Source Refresh
• Detect changes in underlying data sources and propagate to the affected data services.
Impact Analysis
• Analyze impact of metadata changes in workflows where the modified view is used.
Catalog Search
• Have a complete understanding of each of the views and data services created in Denodo.
76
Metadata Management
Data Virtualization Platform collects lots of metadata
• Data source metadata
• By introspection or configuration
• Operational metadata
• How data changes as it flows through Data
Virtualization layer
• Generated by the Data Virtualization Platform based
on ‘model’ built by developers
• Business metadata
• Enriched metadata either imported or added by
‘data steward’
• e.g. view and field descriptions
[Diagram: metadata categories – Technical Metadata, Operational Metadata, Business Metadata]
77
Metadata Introspection
• Denodo Platform gathers metadata from data sources:
• Automatically or via configuration.
• Maps native data types to ‘Denodo types’.
• Inspects indexes in the sources.
• Analyzes source query capabilities and abstracts them into common model.
• Stores all metadata and configuration data in Metadata repository:
• Uses built-in Apache Derby database.
• Small size – only stores metadata…actual data is retrieved in real time from sources or cache.
78
Data Lineage
• Graphical view for showing data lineage for any field in any virtual view.
• Trace source of any field:
• Includes any functions applied to field contents.
• Trace source of calculated fields:
• View calculations used to create new fields.
79
‘Used By’ Tool
• Graphical view for showing where a view is used.
• “Big picture” view of usage.
• Useful tool for seeing the impact of changes on the whole system.
80
Impact Analysis example: Adding a new field
1. Views affected by the change
2. Web Services affected by the change
3. Option to propagate the new field individually per view
4. Preview of the tree view of the affected views
81
Metadata Integration
• Export all metadata – Technical, Operational, Business
• APIs and Stored Procedures
• Integration with Governance Tools
• IBM IGC, Collibra, Informatica Enterprise Information Catalog (EIC)
• Data Virtualization Platform represents the “as implemented” data asset
[Diagram: Denodo Governance Bridge registers the Denodo asset type, authenticates, and publishes assets & flows via REST to IBM Information Server / Information Governance Catalog (IGC)]
82
Data Quality & Integrity
• Data Virtualization can help with data quality
• Apply data quality functions as data is requested
• e.g. address lookup and validation routines
• But…serious data cleansing – e.g. matching and
deduping – is not recommended
• Use a DQ tool or MDM
• Data Virtualization forces you to think about the
best source of accurate data
• ‘Customer’ view – which are the best sources for
customer data?
• Manual process to decide and build views
83
Data Access
• Making data available to the users who need it
• Based on need, not access to databases or applications
• Managing and auditing data access
• Ensure (and prove) compliance with security policies
• Regulatory, geographic, contractual, and organizational data access compliance
84
Denodo Data Catalog
85
Security
Unified Security Management through Data Virtualization.
• Data Virtualization offers an abstraction layer that decouples sources from
consumer applications.
• A single point for accessing all the information, avoiding point-to-point connections to sources.
• As a single point of access, this is an ideal place to enforce security:
• Access restrictions to sources are enforced here.
• They can be defined in terms of the canonical model (e.g. access restrictions to “Bill”, to
“Order”, and so on) with a fine granularity.
88
Security Architecture
89
Secure Access
Data Virtualization secures the access from consumers to sources:
• Consumer to Denodo Platform (northbound):
• Communications between consumer applications and the Data Virtualization layer can be secured, typically using SSL (data in motion).
• Denodo Platform to Sources (southbound):
• Communications between the Data Virtualization layer and the sources can be secured too.
• Specific security protocol depends on the source: SSL, HTTPS, sFTP, … (data in motion).
• Data can be both read and exported in encrypted form (data at rest).
90
Denodo Platform Authentication – Northbound
• Client application -> Denodo Platform.
• Three options:
• Usernames and passwords defined within the Denodo Platform.
• Delegate the authentication to an external LDAP/AD server.
• Use Kerberos for Single Sign On.
91
Denodo Platform Authentication – Southbound
• Denodo Platform -> data source.
• Three options (for each individual source):
• Use a service account for the source.
• The admins create a user account in the source.
• The Denodo Platform always uses those credentials.
• Use Kerberos authentication.
• Use credentials pass-through.
• Access the data source with the username and password combination or the Kerberos ticket that was used to authenticate
with the Denodo Platform northbound.
92
Denodo Platform Authorization
• Role-based Authorization.
• Users/roles can be defined in the Data Virtualization layer and assigned specific permissions.
• Fine-grained authorization.
• Several permissions scopes:
• Virtual Database level (e.g. credit risk database, etc.).
• Views level (e.g. “Regional Risk Exposure”, etc.).
• Row level (filter rows that are not authorized)
• Column level:
▪ Grant/block access.
▪ Data masking (hiding sensitive fields).
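The effect of row-level filtering and column-level masking can be sketched as follows. This is a hypothetical Python illustration of the concept, not Denodo's API; the role name, fields, and rules are invented for the example:

```python
# Hypothetical per-role rules: a row filter and a set of masked columns.
ROW_FILTERS = {
    "regional_analyst": lambda row: row["region"] == "EMEA",  # row-level security
}
MASKED_COLUMNS = {
    "regional_analyst": {"iban"},  # column-level masking of a sensitive field
}

def apply_permissions(role, rows):
    keep = ROW_FILTERS.get(role, lambda r: True)
    masked = MASKED_COLUMNS.get(role, set())
    result = []
    for row in rows:
        if not keep(row):
            continue  # drop rows the role is not authorized to see
        result.append({
            col: ("****" if col in masked else val)
            for col, val in row.items()
        })
    return result

rows = [
    {"customer": "A", "region": "EMEA", "iban": "BE71096123456769"},
    {"customer": "B", "region": "APAC", "iban": "SG12345678901234"},
]
visible = apply_permissions("regional_analyst", rows)
# visible: only the EMEA row, with its IBAN masked
```

The same query returns different results depending on the caller's role: unauthorized rows are filtered out, and sensitive columns are masked rather than blocked outright.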
93
Role-Based Data Privacy
• Control what data is visible based on user role
• e.g. Admin sees everything, Analyst has PII masked
• Masking can be encryption, tokenization, partial masking, redaction
• Built-in and custom functions allow partial masking, tokenization, etc.
• More complex logic also possible
• e.g. HIPAA Safe Harbor zip code handling using in-memory look-up maps
• e.g. anonymization of CC owners and transactions for pattern analysis
94
Role Based Access Controls - Permissions
95
Policy Based Security
• Custom Policies allow developers to provide their own access control rules.
• Developers can code their own custom access control policies, and the administrator can assign them to one (or several) users/roles on a view in Denodo (or to a whole database).
[Diagram: data consumers (users, apps) → Denodo custom policies, optionally delegating to an external policy server (e.g. Axiomatics); if the conditions are satisfied the query is accepted (optionally with filtering/masking applied), otherwise it is rejected]
96
Policy Based Security : Example
Dynamic Authorization based on policies
• Example: set limits on the number of queries executed by a certain user/role; determine whether a query can be executed depending on the time of day; or leverage the access policies in an external policy server.
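A policy of the kind just described could look like this in outline. This is an illustrative Python sketch only; the quota, time window, and function names are hypothetical, and it does not use the Denodo SDK:

```python
from collections import defaultdict
from datetime import datetime

QUERY_LIMIT_PER_DAY = 100       # hypothetical per-user daily quota
ALLOWED_HOURS = range(8, 20)    # hypothetical window: queries allowed 8:00-19:59

query_counts = defaultdict(int)

def evaluate_policy(user, now=None):
    """Return 'ACCEPT' or 'REJECT' for a query, based on a daily
    quota and a time-of-day rule (both invented for this example)."""
    now = now or datetime.now()
    if now.hour not in ALLOWED_HOURS:
        return "REJECT"          # outside the allowed hours
    if query_counts[user] >= QUERY_LIMIT_PER_DAY:
        return "REJECT"          # daily quota exhausted
    query_counts[user] += 1
    return "ACCEPT"

decision = evaluate_policy("analyst1", datetime(2024, 1, 15, 10, 0))
# decision == "ACCEPT"
```

A real policy could also delegate the decision to an external policy server instead of evaluating the rules locally.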
97
Denodo Platform Authorization
Hierarchical role definition
• A role can inherit and redefine an existing role
at any level in the tree.
Live demo with Denodo
Next steps
100
Next Steps
Access Denodo Platform in the Cloud!
Take a Test Drive today!
www.denodo.com/TestDrive
GET STARTED TODAY
Q&A
Thank you!
www.denodo.com
info.emea@denodo.com
+33 (0)1 42 68 51 27
www.satisco.be
first@satisco.com
+32 (0)22 060 710
www.dynafin.be
info@dynafin.be
+32 (0)2 210 57 40
www.diams-it.com
vincent.boucheron@diams-it.com
+32 496 08 77 33
 
Quels sont les facteurs-clés de succès pour appliquer au mieux le RGPD à votr...
Quels sont les facteurs-clés de succès pour appliquer au mieux le RGPD à votr...Quels sont les facteurs-clés de succès pour appliquer au mieux le RGPD à votr...
Quels sont les facteurs-clés de succès pour appliquer au mieux le RGPD à votr...
 
Lunch and Learn ANZ: Achieving Self-Service Analytics with a Governed Data Se...
Lunch and Learn ANZ: Achieving Self-Service Analytics with a Governed Data Se...Lunch and Learn ANZ: Achieving Self-Service Analytics with a Governed Data Se...
Lunch and Learn ANZ: Achieving Self-Service Analytics with a Governed Data Se...
 
How to Build Your Data Marketplace with Data Virtualization?
How to Build Your Data Marketplace with Data Virtualization?How to Build Your Data Marketplace with Data Virtualization?
How to Build Your Data Marketplace with Data Virtualization?
 
Webinar #2 - Transforming Challenges into Opportunities for Credit Unions
Webinar #2 - Transforming Challenges into Opportunities for Credit UnionsWebinar #2 - Transforming Challenges into Opportunities for Credit Unions
Webinar #2 - Transforming Challenges into Opportunities for Credit Unions
 
Enabling Data Catalog users with advanced usability
Enabling Data Catalog users with advanced usabilityEnabling Data Catalog users with advanced usability
Enabling Data Catalog users with advanced usability
 
Denodo Partner Connect: Technical Webinar - Architect Associate Certification...
Denodo Partner Connect: Technical Webinar - Architect Associate Certification...Denodo Partner Connect: Technical Webinar - Architect Associate Certification...
Denodo Partner Connect: Technical Webinar - Architect Associate Certification...
 
GenAI y el futuro de la gestión de datos: mitos y realidades
GenAI y el futuro de la gestión de datos: mitos y realidadesGenAI y el futuro de la gestión de datos: mitos y realidades
GenAI y el futuro de la gestión de datos: mitos y realidades
 

Último

Master's Thesis - Data Science - Presentation
Master's Thesis - Data Science - PresentationMaster's Thesis - Data Science - Presentation
Master's Thesis - Data Science - PresentationGiorgio Carbone
 
5 Ds to Define Data Archiving Best Practices
5 Ds to Define Data Archiving Best Practices5 Ds to Define Data Archiving Best Practices
5 Ds to Define Data Archiving Best PracticesDataArchiva
 
CI, CD -Tools to integrate without manual intervention
CI, CD -Tools to integrate without manual interventionCI, CD -Tools to integrate without manual intervention
CI, CD -Tools to integrate without manual interventionajayrajaganeshkayala
 
Optimal Decision Making - Cost Reduction in Logistics
Optimal Decision Making - Cost Reduction in LogisticsOptimal Decision Making - Cost Reduction in Logistics
Optimal Decision Making - Cost Reduction in LogisticsThinkInnovation
 
Mapping the pubmed data under different suptopics using NLP.pptx
Mapping the pubmed data under different suptopics using NLP.pptxMapping the pubmed data under different suptopics using NLP.pptx
Mapping the pubmed data under different suptopics using NLP.pptxVenkatasubramani13
 
Strategic CX: A Deep Dive into Voice of the Customer Insights for Clarity
Strategic CX: A Deep Dive into Voice of the Customer Insights for ClarityStrategic CX: A Deep Dive into Voice of the Customer Insights for Clarity
Strategic CX: A Deep Dive into Voice of the Customer Insights for ClarityAggregage
 
How is Real-Time Analytics Different from Traditional OLAP?
How is Real-Time Analytics Different from Traditional OLAP?How is Real-Time Analytics Different from Traditional OLAP?
How is Real-Time Analytics Different from Traditional OLAP?sonikadigital1
 
The Universal GTM - how we design GTM and dataLayer
The Universal GTM - how we design GTM and dataLayerThe Universal GTM - how we design GTM and dataLayer
The Universal GTM - how we design GTM and dataLayerPavel Šabatka
 
ChistaDATA Real-Time DATA Analytics Infrastructure
ChistaDATA Real-Time DATA Analytics InfrastructureChistaDATA Real-Time DATA Analytics Infrastructure
ChistaDATA Real-Time DATA Analytics Infrastructuresonikadigital1
 
CCS336-Cloud-Services-Management-Lecture-Notes-1.pptx
CCS336-Cloud-Services-Management-Lecture-Notes-1.pptxCCS336-Cloud-Services-Management-Lecture-Notes-1.pptx
CCS336-Cloud-Services-Management-Lecture-Notes-1.pptxdhiyaneswaranv1
 
Cash Is Still King: ATM market research '2023
Cash Is Still King: ATM market research '2023Cash Is Still King: ATM market research '2023
Cash Is Still King: ATM market research '2023Vladislav Solodkiy
 
TINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptx
TINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptxTINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptx
TINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptxDwiAyuSitiHartinah
 
Virtuosoft SmartSync Product Introduction
Virtuosoft SmartSync Product IntroductionVirtuosoft SmartSync Product Introduction
Virtuosoft SmartSync Product Introductionsanjaymuralee1
 
Elements of language learning - an analysis of how different elements of lang...
Elements of language learning - an analysis of how different elements of lang...Elements of language learning - an analysis of how different elements of lang...
Elements of language learning - an analysis of how different elements of lang...PrithaVashisht1
 
Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024
Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024
Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024Guido X Jansen
 
Rock Songs common codes and conventions.pptx
Rock Songs common codes and conventions.pptxRock Songs common codes and conventions.pptx
Rock Songs common codes and conventions.pptxFinatron037
 

Último (16)

Master's Thesis - Data Science - Presentation
Master's Thesis - Data Science - PresentationMaster's Thesis - Data Science - Presentation
Master's Thesis - Data Science - Presentation
 
5 Ds to Define Data Archiving Best Practices
5 Ds to Define Data Archiving Best Practices5 Ds to Define Data Archiving Best Practices
5 Ds to Define Data Archiving Best Practices
 
CI, CD -Tools to integrate without manual intervention
CI, CD -Tools to integrate without manual interventionCI, CD -Tools to integrate without manual intervention
CI, CD -Tools to integrate without manual intervention
 
Optimal Decision Making - Cost Reduction in Logistics
Optimal Decision Making - Cost Reduction in LogisticsOptimal Decision Making - Cost Reduction in Logistics
Optimal Decision Making - Cost Reduction in Logistics
 
Mapping the pubmed data under different suptopics using NLP.pptx
Mapping the pubmed data under different suptopics using NLP.pptxMapping the pubmed data under different suptopics using NLP.pptx
Mapping the pubmed data under different suptopics using NLP.pptx
 
Strategic CX: A Deep Dive into Voice of the Customer Insights for Clarity
Strategic CX: A Deep Dive into Voice of the Customer Insights for ClarityStrategic CX: A Deep Dive into Voice of the Customer Insights for Clarity
Strategic CX: A Deep Dive into Voice of the Customer Insights for Clarity
 
How is Real-Time Analytics Different from Traditional OLAP?
How is Real-Time Analytics Different from Traditional OLAP?How is Real-Time Analytics Different from Traditional OLAP?
How is Real-Time Analytics Different from Traditional OLAP?
 
The Universal GTM - how we design GTM and dataLayer
The Universal GTM - how we design GTM and dataLayerThe Universal GTM - how we design GTM and dataLayer
The Universal GTM - how we design GTM and dataLayer
 
ChistaDATA Real-Time DATA Analytics Infrastructure
ChistaDATA Real-Time DATA Analytics InfrastructureChistaDATA Real-Time DATA Analytics Infrastructure
ChistaDATA Real-Time DATA Analytics Infrastructure
 
CCS336-Cloud-Services-Management-Lecture-Notes-1.pptx
CCS336-Cloud-Services-Management-Lecture-Notes-1.pptxCCS336-Cloud-Services-Management-Lecture-Notes-1.pptx
CCS336-Cloud-Services-Management-Lecture-Notes-1.pptx
 
Cash Is Still King: ATM market research '2023
Cash Is Still King: ATM market research '2023Cash Is Still King: ATM market research '2023
Cash Is Still King: ATM market research '2023
 
TINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptx
TINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptxTINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptx
TINJUAN PEMROSESAN TRANSAKSI DAN ERP.pptx
 
Virtuosoft SmartSync Product Introduction
Virtuosoft SmartSync Product IntroductionVirtuosoft SmartSync Product Introduction
Virtuosoft SmartSync Product Introduction
 
Elements of language learning - an analysis of how different elements of lang...
Elements of language learning - an analysis of how different elements of lang...Elements of language learning - an analysis of how different elements of lang...
Elements of language learning - an analysis of how different elements of lang...
 
Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024
Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024
Persuasive E-commerce, Our Biased Brain @ Bikkeldag 2024
 
Rock Songs common codes and conventions.pptx
Rock Songs common codes and conventions.pptxRock Songs common codes and conventions.pptx
Rock Songs common codes and conventions.pptx
 

How Financial Institutions Are Leveraging Data Virtualization to Overcome their Business Challenges (EMEA)

  • 1. 1
  • 2. Agenda 9:00am - ONLINE CONFERENCE • Introductory keynote • How can banking and financial institutions leverage Denodo Data Virtualization? • Success stories in the financial sector 10:00am – ONLINE HANDS-ON SESSION • Use cases & successful implementations • Performance and optimization • Governance and security • Live demo with Denodo • Next steps • Q&A 11:00am - CONCLUSIONS & END OF SESSION
  • 4. ❖ 26 Brands ❖ 172M Turnover ❖ 600 client references ❖ 1,575 experts around the world AAA BeNeLux ❖ 5 Brands ❖ 20M Turnover ❖ 60 client references ❖ 180 experts The AAA ecosystem federates various consulting companies and areas of expertise across the world
  • 5. Dynafin / Satisco Data Competence Center history A long-lasting story of successful achievements
  • 6. A changing world The financial sector faces major challenges, impacting all domains and levels of companies: STRATEGY COMMERCIAL TECHNOLOGY FINANCE REGULATION ORGANISATION TALENT PROCESSES DATA MOBILITY DATA SECURITY DATA MANAGEMENT DATA MONITORING Dynafin / Satisco Data Competence Center history
  • 7. Banking IT Integration services Partner in Financial services ❖ Impact analyses ❖ Data centric strategy ❖ Data modelling ❖ Data governance strategy ❖ Change management ❖ Project management ❖ Testing ❖ Architecture design ❖ Technical analyses ❖ Integration strategy review ❖ Integration (re)development ❖ IT Testing ❖ DevOps ❖ Data Virtualization strategy ❖ Data Virtualization modelling ❖ Solution implementation follow-up ❖ Data governance strategy follow-up ❖ Data security strategies ❖ Change Management ❖ Denodo expertise ❖ Denodo Product Evolution ❖ Denodo Product Support ❖ Denodo Product Training ❖ Denodo User Meetings ❖ Denodo Use Cases Sharing Digital Influencer and data enabler Solution provider A multidisciplinary team to guide you in your business transformations DATA CENTRIC OFFER CREATION Dynafin / Satisco Data Competence Center history
  • 8. HOW CAN BANKING AND FINANCIAL INSTITUTIONS LEVERAGE DENODO DATA VIRTUALIZATION? WEBINAR 29 OCTOBER
  • 9. Speakers Aly Wane Diene Senior Solution Consultant Alain Kunnen Chairman & Associate Vincent Boucheron Data Influencer & Managing Partner
  • 10. HOW CAN BANKING AND FINANCIAL INSTITUTIONS LEVERAGE DENODO DATA VIRTUALIZATION? WEBINAR 29 OCTOBER A BIT OF CONTEXT…
  • 11. • More and more data to deal with (IoT or not) • More and more heterogeneous types of data (cf. the rise of unstructured data) • Mobility (data at your fingertips wherever you are) • Security, regulations, sustainability & reputation • Real-time closure (versus the day after) • Move to cloud(s) in hybrid mode (or not) to benefit from scalability and agility • Booming digital tools (e.g. AI, RPA, smart sensors, 5G) • Move to modular applications (cf. containers) • A stormy IT ecosystem MAJOR MARKET TRENDS Business Trends Technology Trends BE PREPARED TO SURF THE NEW BIG WAVE © DIAMS. All rights reserved.
  • 12. DATA AND PERSONAS OF EXISTING ECOSYSTEMS Sales HR Apps/API Executive Marketing Data Science AI/ML SIMPLIFIED PICTURE 90% OF DEMANDS REQUIRE NEAR-REAL-TIME DATA SECURITY AND GOVERNANCE? 75% OF STORED DATA IS NEVER USED
  • 13. 2020-2050 IT challenges are no longer a matter of user digital awareness or technical limitations… they are a matter of data federation MAJOR MARKET TRENDS © DIAMS. All rights reserved.
  • 14. HOW CAN BANKING AND FINANCIAL INSTITUTIONS LEVERAGE DENODO DATA VIRTUALIZATION? WEBINAR 29 OCTOBER DATA VIRTUALIZATION 2.0
  • 15. Timeline 1992: First version of Crystal Reports embedding a DV layer 1991-1992: First version of Skipper-BO 1994: Microsoft releases Access 2 with a DV layer (links) and the FoxPro query optimizer 2003: BO purchases Crystal Decisions 2008: SAP purchases Business Objects 2019: e.g. MS SQL Server includes a DV layer to mix SQL and Hortonworks data; Thomas Siebel launches the C3.ai suite, leveraging an embedded DV layer to analyze data … • « Data virtualization securely connects and federates any type of data, whatever its location » • The need to federate data is not new… and data virtualization is not new either… • Data virtualization is often hidden in software packages DATA VIRTUALIZATION DEFINITION Data Virtualization ▪ Provides easy access to disparate data (sources, formats) ▪ Hides the technical aspects of storage (location, formats) ▪ While data is not moved physically Comparison « MS Access exposes the data but misses BO Universe functionalities (and vice versa) » « Denodo is the new BO version we would have had if BO had not been acquired by SAP » © DIAMS. All rights reserved.
  • 16. DATA VIRTUALIZATION – HOW DOES IT WORK? CONNECT, COMBINE & CONSUME Sales HR Executive Marketing Apps/API Data Science AI/ML Connect Combine Consume COMBINE & INTEGRATE INTO BUSINESS DATA VIEWS
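The Connect-Combine-Consume flow above can be sketched in a few lines of Python. This is a toy illustration under our own assumptions, not Denodo's actual API: each source stays where it is behind a connector function, and the combined view only executes its join when a consumer pulls rows.

```python
# Toy sketch of Connect / Combine / Consume (illustrative only; not Denodo's API).
# Source and field names (crm_source, risk_source, client_id) are invented.

def crm_source():                      # Connect: wrapper around source 1
    return [{"client_id": 1, "name": "Acme"}, {"client_id": 2, "name": "Globex"}]

def risk_source():                     # Connect: wrapper around source 2
    return [{"client_id": 1, "rating": "AA"}, {"client_id": 2, "rating": "B"}]

def client_risk_view():                # Combine: lazy join, no physical copy kept
    ratings = {r["client_id"]: r["rating"] for r in risk_source()}
    for c in crm_source():
        yield {**c, "rating": ratings.get(c["client_id"])}

# Consume: the join only runs now, when a BI tool or API actually pulls rows
rows = list(client_risk_view())
```

The key property mirrored here is that no combined dataset exists until query time; a real DV engine would additionally push filters down to the sources.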
  • 17. Basic Core modules Performance Data Governance Security • The new generation of data virtualization tools disrupts thanks to extra functionalities unleashing new business and technical use cases. • Core modules (Connect-Combine-Consume) • Performance (query plan optimization, load balancing, cloud distribution and delocalization) • Data governance modules (data catalog, physical/logical model synchronization, data dictionaries / synonym interface) • Security features (auditability, lineage, protection of underlying data sources) Not all data virtualization products cover all of these axes… DATA VIRTUALIZATION DATA VIRTUALIZATION 2.0 © DIAMS. All rights reserved.
  • 18. HOW CAN BANKING AND FINANCIAL INSTITUTIONS LEVERAGE DENODO DATA VIRTUALIZATION? WEBINAR 29 OCTOBER DATA VIRTUALIZATION 2.0 -> THE PROMISE
  • 19. DV REQUIRES GOVERNANCE AND A MINIMUM LEVEL OF DATA MATURITY • Data virtualization is not a toy for IT geeks! • It is a business tool to support and secure operations involving client or company data! • Remember that many companies expected BO Universes to magically solve quality issues with low governance… • The data virtualization tool should mirror the data governance organization and deeply involve its actors (e.g. data owners, data stewards, data office) DATA VIRTUALIZATION: THE PROMISE Business, CDO, CIO and CISO must team up to deliver this promise CDO Team CISO Team IT & DATA ARCHITECTS © DIAMS. All rights reserved.
  • 20. • The capacity of data virtualization to protect its underlying sources in terms of security and performance allows the creation of a molecular architecture where internal and external referentials and golden sources are accessed directly and efficiently by operations and back offices. • Such capacity allows you to • Swap modules rapidly (e.g. AI, containers) and infrastructure (e.g. AWS, Azure, on premises) • Reduce reconciliation needs • Offload physical data transport • Federate ecosystems and foster marketplaces • Simplify the IS and make it more understandable for its users DATA VIRTUALIZATION: THE PROMISE THE SUPPORT OF A NEW TYPE OF BUSINESS INTERACTION BI TOOLS, AI, RPA,… Referential X External Data DATA Virtualization Product Referential © DIAMS. All rights reserved.
  • 21. DV Layer Service DV Data Virtualization hub DV Layer Service DV Layer Service Azure ML DV Layer Service US Zone EMEA Zone On premises DATA VIRTUALIZATION 2.0 THE SUPPORT OF A NEW TYPE OF INFRASTRUCTURE
  • 22. SUCCESS STORIES IN THE FINANCIAL SECTOR WEBINAR 29 OCTOBER
  • 23. SUCCESS STORIES IN THE FINANCIAL SECTOR WEBINAR 29 OCTOBER DATA VIRTUALISATION FOR WHOM…
  • 24. 24 DATA VIRTUALISATION: THE DATA TOOL EVERYBODY LOVES DATA VIRTUALISATION FEDERATES DATA AND PEOPLE Several data virtualization functionalities will appeal to all the players managing data. All users will, for example, love its ability to connect to the golden source in a secure, non-intrusive way, and its ability to cross-reference data in an optimized way. At the implementation level, a majority of players appreciate the speed of the solution's implementation, its affordability and its scalability. Data virtualization federates not only data but also business lines, and therefore almost all the players in companies are likely to benefit from it, from "operational business" (process and quality) to data scientists, CIOs, CDOs, architects and security players. © DIAMS. All rights reserved.
  • 25. 25 Persona 1: CIO The CIO will appreciate data virtualization for the following use cases 1. Support for business transformation (client 360, mergers, digital transformation, IoT, etc.), providing sandboxes with raw and fresh data, quickly and securely, for data scientists. 2. Controlled transition of the IS (migration to the cloud or reengineering of the IS) 3. Cost optimization (e.g. storage) 4. Protection of source systems to guarantee or refine a level of service 5. Information on data usage that allows the implementation of a pay-per-data-usage system for the IS, or the identification of obsolete systems © DIAMS. All rights reserved.
  • 26. 26 Persona 2: CDO The CDO will appreciate 1. The ability to quickly map golden sources, golden copies and related processes. 2. The ability to easily and cost-effectively reconcile the physical model of business objects with the data dictionary. 3. The ability to federate business around common concepts by easily creating an inter-application synonym dictionary. 4. The possibility of easily upgrading to a specialized dictionary. 5. The DV's ability to support data governance, by providing a simple means of comprehensive quality control over processes, will also certainly appeal. © DIAMS. All rights reserved.
  • 27. 27 Persona 3: CISO Data virtualisation will allow the CISO to 1. Improve the ability to detect real anomalies by easily cross-referencing data directly on the golden sources (versus copied data). 2. Move towards a more predictive mode thanks to its near-real-time connection to the sources. 3. Build even more relevant alerts by cross-referencing structured data (such as firewall intrusion logs) and/or unstructured data (e.g. building protection cameras). 4. Limit the maintenance costs of the alert system 5. Go beyond the limits of human control by coupling these alerts with machine learning or IAM systems to avoid false alarms 6. Evolve the protection system towards a model that is more data-centric than application-centric. © DIAMS. All rights reserved.
  • 28. 28 Persona 4: ARCHITECT Data virtualisation will allow the architect to 1. Better understand data flows and golden sources, and simplify the IS 2. Bring flexibility to the IS by fostering web-service capacities for intraday exchanges 3. Propose a pragmatic approach to deal with legacy-system obsolescence 4. Propose a progressive approach during transformations to better control transition phases 5. Satisfy business requirements in terms of rapidity and scalability 6. Comply with high standards in terms of security and IT practices © DIAMS. All rights reserved.
  • 29. 29 Persona 5: DATA SCIENTIST The data scientist will love data virtualization for 1. Its capacity to expose and query raw data and cleansed data, and more generally any type of data 2. Its capacity to rapidly integrate new sources and/or new fields (no need to wait for IT) 3. Its flexibility and compatibility with almost any reporting or data science tool of today (and probably of tomorrow) 4. The possibility to work on unfiltered production datasets (given the right credentials) 5. The capacity to use the scheduler, the audit trail and Denodo to align data science sandboxes and rapidly explain deviations between individual results. 6. The coherence between sandbox and production, and the simplicity of replicating analytics across historical versions of a dataset. © DIAMS. All rights reserved.
  • 30. SUCCESS STORIES IN THE FINANCIAL SECTOR WEBINAR 29 OCTOBER DATA VIRTUALISATION FOR WHAT AND WHEN ….
  • 31. 31 DATA VIRTUALISATION FOR WHAT FROM A BUSINESS PERSPECTIVE CORPORATE USE CASES IT USE CASES COMPLIANCE & SECURITY USE CASES DIGITAL INITIATIVES HOLISTIC VIEWS (PRODUCT, CLIENT) OPERATION OPTIMISATIONS DATA GOVERNANCE BUSINESS TRANSFORMATIONS COST OPTIMISATIONS ARCHITECTURE SIMPLIFICATION IT MIGRATIONS & TRANSFORMATIONS COST OPTIMISATIONS HOLISTIC VIEWS (RISK) DATA PROTECTION COORDINATION HELPING CXOs BRING VALUE AND SUPPORT © DIAMS. All rights reserved.
  • 32. 32 Tactical approach Leverage • Database protection • Migration Reactive approach Leverage • Regulation • Security threats Strategic visionary approach Leverage • Business maturity & objectives plan DATA VIRTUALISATION FOR WHEN BE OPPORTUNISTIC: PICK YOUR BATTLES © DIAMS. All rights reserved.
  • 33. SUCCESS STORIES IN THE FINANCIAL SECTOR WEBINAR 29 OCTOBER DATA VIRTUALISATION SUCCESS STORIES….
  • 36. DATA MARKETPLACE: TRANSFORM YOUR IS USER EXPERIENCE Use Case: LEVERAGE DV'S CAPACITY TO PROTECT THE UNDERLYING DATABASES, ALLOWING USERS AND APPLICATIONS TO ACCESS DATA MORE EASILY AND SECURELY. SALES REFERENTIAL STAFF REFERENTIAL SALES GUI HR GUI Data Marts IAM Tool Data Virtualization Data Catalog BI++ Sales Data dictionary PROCUREMENT GUI EXTERNAL STAFF REFERENTIAL BI++ Security IAM GUI CLAIMS or INCIDENT REFERENTIAL CLAIMS GUI Firewall Logs or Sensors or Video or Audio Market data ChatBot Sales ChatBot Security BI++ Compliance Use Case: Build customised views associated with each group of user needs Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way Step 2: Build queries and views associated with each user's or application's needs Step 3: Build the data catalog and data dictionary Step 4a: Monitor and control usage Step 4b: Introduce a pay-per-use principle for maintenance costs © DIAMS. All rights reserved.
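The pay-per-use idea in step 4b can be sketched from the DV query logs. This is a hypothetical illustration: the log field names (`user_group`, `rows_returned`) and the tariff are assumptions, not Denodo's actual log schema.

```python
# Hypothetical pay-per-use accounting over DV query logs (step 4b).
# Field names and pricing are invented for illustration.
from collections import defaultdict

query_log = [
    {"user_group": "sales",      "rows_returned": 1200},
    {"user_group": "sales",      "rows_returned": 300},
    {"user_group": "compliance", "rows_returned": 50},
]

def charge_per_group(log, price_per_1k_rows=0.10):
    # aggregate rows consumed per group, then convert to a charge
    usage = defaultdict(int)
    for q in log:
        usage[q["user_group"]] += q["rows_returned"]
    return {g: round(rows / 1000 * price_per_1k_rows, 4) for g, rows in usage.items()}

bill = charge_per_group(query_log)
```

The same aggregation also serves step 4a: the `usage` totals are exactly the monitoring figures a usage dashboard would display.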
  • 37. KYC & CLIENT 360 VIEWS: EXTEND YOUR VISION OR CONTROLS IN A RAPID AND FLEXIBLE MANNER Use Case: LEVERAGE DV'S CAPACITY TO PROTECT THE UNDERLYING DATABASES, ALLOWING USERS AND APPLICATIONS TO ACCESS DATA MORE EASILY AND SECURELY. CLOUD & Internet EXTERNAL SOURCES CLIENT REFERENTIAL CLIENT GUI Data Marts IAM Tool Data Virtualization Data Catalog BI++ Sales Data dictionary EXTERNAL STAFF REFERENTIAL Data scientists IAM GUI FORBIDDEN CLIENTS Website content or PDF or Video or Audio files Market data ChatBot KYC ChatBot 360 BI++ Compliance Use Case: Build customised views associated with each group of user needs Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way Step 2: Build queries and views associated with KYC or 360 requirements Step 3: Build the data catalog and data dictionary Step 4a: Monitor and control usage Step 4b: Rapidly add any type of source or field Step 4c: Create a secured sandbox for data scientists © DIAMS. All rights reserved.
  • 38. Use Case: LEVERAGE DV'S MULTISOURCE CAPACITIES TO MOVE DATA AND BUSINESS MORE EASILY AND SECURELY. Application B Application C Data Catalog Data dictionary & Synonym Application A Use Case: Build customised views associated with each group of user needs Step 1: Implement the data virtualization layer between legacy and new consumers/sources in a direct, non-intrusive way Step 1b: Build the data dictionary and data synonyms Step 2: Build queries to easily verify data integration into the new systems by the ETL and integration layers Step 3: Complement and offload ETL with web-service capacity on the new system Step 4: Push data back to the slave system during the transition period Step 5: Rapidly add any type of source or field to fix unforeseen temporary ETL delivery issues (e.g. a new source or field to be added urgently) Step 6: Decommission GUI A and B but use DV to expose Application A's and B's archived data Data Virtualization ETL Merger and Acquisition – Business Transformation: Allow a secured and progressive move GUI A GUI C Check Migration // Run GUI B Web service near-real-time capacity Master-slave feedback post-migration Urgent temporary KAD Fix BI++ Archived data © DIAMS. All rights reserved.
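The migration-verification idea in step 2 (checking the // run) can be sketched as a parity check between the legacy and target systems: compare row counts and a cheap content fingerprint per table. A minimal sketch under our own assumptions; table and field names are invented, and a real check would run the two counting queries through the DV layer.

```python
# Minimal parity check for a parallel run: compare row counts and an
# order-insensitive checksum per table between legacy and new systems.
# Table/field names ("clients", "id") are made up for illustration.

def checksum(rows):
    # per-run stable, order-insensitive fingerprint of a list of record dicts
    return sum(hash(tuple(sorted(r.items()))) for r in rows) & 0xFFFFFFFF

def compare(legacy, target):
    issues = []
    for table, l_rows in legacy.items():
        t_rows = target.get(table, [])
        if len(l_rows) != len(t_rows):
            issues.append((table, "row count mismatch"))
        elif checksum(l_rows) != checksum(t_rows):
            issues.append((table, "content mismatch"))
    return issues

legacy = {"clients": [{"id": 1}, {"id": 2}]}
target = {"clients": [{"id": 1}, {"id": 2}]}
issues = compare(legacy, target)   # an empty list means the runs look consistent
```

Counting first and hashing only on equal counts keeps the expensive comparison off tables that already fail the cheap test.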
  • 39. UNIFIED DATA LAYER: USE DV TO REGAIN CONTROL OF A SILOED IS Use Case: Leverage regulators' pressure, obsolescence or cloud migrations to put data virtualization in place and regain control of data exposure Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way Step 2: Analyze queries and identify golden copies and inappropriate queries Step 3: Build an action plan based on the hall-of-horrors area Step 4: Kill bad golden copies and easily redirect users to golden sources, or replicate the golden source's data controls on golden copies to monitor and communicate on data quality issues. Step 5: Simplify the IS architecture by allowing new applications to access golden-source tables directly and securely, avoiding integration tables and reducing ETL usage and duplicate data storage. Step 6: Analyze data usage and move security from application centricity to data centricity Step 7: Use the DV data catalog to help build data dictionaries and data synonyms Step 8: Communicate on successes per data source, business line or data object Focus on data sources and data usage Ex: regain control of a siloed IS © DIAMS. All rights reserved.
  • 40. DATA GOVERNANCE: USE DV TO DISCOVER AND SOLVE DATA ISSUES Use Case: Use DV to build a simple but efficient data flow monitoring tool to discover and solve any process issue Step 1: Choose a process and identify its important phases/steps Step 2: Associate each step with points of control (in and out) tied to expected values and metadata Step 3: Identify the data sources associated with these values and metadata, and plug data virtualization connectors into these sources Step 4: Build queries retrieving data and metadata on a regular basis Step 5: Trigger alerts based on fixed acceptable limits, or on dynamic values derived from historical data logs Step 6: Use BI tools "plugged" into DV queries to build efficient dashboards showing values and triggered alerts. Step 7: Analyze alerts to take corrective actions Step 8: Use AI to help the business take preventive actions according to context (e.g. shift resources to secure deadlines or cut-off times) Focus on data processes Ex: implement data flow monitoring Step 1 Step 2 Step 3 Step 4a Step 4b Data Virtualization © DIAMS. All rights reserved.
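Step 5's two alerting modes, a fixed limit versus a dynamic limit derived from history, can be sketched as follows. A toy illustration under our own assumptions: the values, the mean-plus-k-sigma rule and the `check` helper are invented, not part of any DV product.

```python
# Sketch of step 5: trigger an alert on a control point using either a fixed
# limit or a dynamic limit computed from historical logs (mean + k * sigma).
from statistics import mean, stdev

def dynamic_limit(history, k=3):
    # a common heuristic: flag values more than k standard deviations above mean
    return mean(history) + k * stdev(history)

def check(value, history=None, fixed_limit=None):
    limit = fixed_limit if fixed_limit is not None else dynamic_limit(history)
    return value > limit          # True -> raise an alert

history = [100, 102, 98, 101, 99]           # e.g. daily volumes at one control point
alert_fixed = check(250, fixed_limit=200)   # fixed acceptable limit
alert_dyn = check(103, history=history)     # dynamic limit from historical logs
```

The dynamic variant adapts as history accumulates, which is why the slide pairs it with regularly scheduled DV queries (step 4).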
  • 41. DATA SECURITY: BUILD A DOME TO ENABLE DATA-CENTRIC PROTECTION Use Case 2: Team up with security teams to support their projects and/or implement affordable data protection measures. Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way Step 2: Analyze query logs to identify sensitive data usage and the associated users Step 3a: Create user groups to enable data-centric access control (versus application-centric) Step 3b: Create security dashboards monitoring data usage and alerts based on DV logs Step 4a: Connect DV to data sources and progressively build a DV protection dome over your data, monitoring usage of sensitive data whatever its type and location Step 4b: Use data lineage and/or ETL logs to extend the DV dome protection to golden copies. Step 4c: Easily trigger alerts if applications or users try to extend their data consumption perimeter Step 4d: Include application-built APIs, or APIs created by DV, in the DV dome protection SUCH A DOME CAN BE BUILT ON A SUBSET OF THE IS (e.g. DWHs and/or DATA LAKES) Offer a holistic view of data to your security/compliance/risk officers Remember that in most organisations ETL does not handle data security metadata, leading to a « Kerviel risk »… © DIAMS. All rights reserved.
  • 42. DATA MIGRATION & HYBRID DATA FABRIC: SECURE YOUR IS TRANSFORMATION WITH AGILITY TO BENEFIT FROM BEST-OF-BREED TECHNIQUES Use Case: Take advantage of any type of transformation project (obsolescence, migration to the cloud, merger) to benefit from best-of-breed techniques while rapidly improving data governance and services. Step 1: Implement the data virtualization layer between consumers and sources in a direct, non-intrusive way. Database logical and physical views remain active. Step 2: Run the legacy system and the new system in parallel and analyse query logs to secure data source performance Step 3: Kill the legacy views Step 4: Start building new query views based on joins between data sources. Step 5: Implement web services based on the join-based queries. Step 6: Implement new data sources (e.g. Hortonworks, clouds) Step 7: Redirect consumers to the new sources when appropriate, progressively and with no major impact Step 8: Offload ETL transfers when appropriate and bring more "real-time" services Oracle DWH Intraday Sybase DWH end of day Data Marts Views Horton Works CLOUD 1 Data Virtualization Data Catalog BI++ AI RPA Apps BI++ BI++ AI AI AI RPA RPA Apps Apps Data dictionary CLOUD 2 CLOUD 3 Cloudera © DIAMS. All rights reserved.
  • 43. LOGICAL DATA WAREHOUSE/DATA LAKE & APIFICATION: SIMPLIFY YOUR IS TO INCREASE FLEXIBILITY AND OPTIMIZE COST Use Case: Leverage IT projects (e.g. migration to cloud) or business projects (e.g. Client 360) to rethink data architecture and become data-centric. Step 1: Focus on golden sources and reference data, and expose golden source data directly to users and applications via the data virtualization platform as a single point of access. Step 2: Progressively move toward a modular IS where golden sources can be added, modified or replaced more easily, and where data replication and data reconciliation are reduced. Step 3: Use DV API features to create APIs on legacy applications and build an API catalog. (Figure: Gartner vision of data architecture evolution.) © DIAMS. All rights reserved.
  • 44. SUCCESS STORIES IN THE FINANCIAL SECTOR WEBINAR 29 OCTOBER
  • 45. 45 DATA VIRTUALISATION: FACTS AND FIGURES TIME AND MEANS Denodo infrastructure • Windows or Linux • On-premises or cloud-based (AWS and Azure available) Licence & infrastructure costs • On-premises: depends on client infrastructure • Cloud-based HW + client's own licence: depends on client context • Cloud-based: from €88k to €188k depending on context Implementation time • On-premises: 3 to 6 months, based on BNPP implementations • Cloud-based: less than 6 months on average
  • 46. 46 DATA VIRTUALISATION: FACTS AND FIGURES LOW RISK AND MAXIMUM RETURN Several ways to adopt data virtualization 1. Denodo Express: free but with some limitations 2. Free cloud Test Drive to discover predefined use cases 3. Free trial cloud licence with a 14-day limit 4. DIAMS starter kits DIAMS has created a set of "starter kit" offers enabling its clients to test data virtualization on a specific scope, over a short period of time and within a limited budget. These "easy" service offers are designed to be scalable, allowing clients to build on first positive results. Don't hesitate to contact us for more details. • Data Maturity Assessment - DMM • Dataflow Monitoring - DFM • Easy Line of Production Monitoring - LPM • Easy KPI for Industry - KP2I • Easy Compliance & Risk Monitoring - ECRM • Easy Security Alert and Monitoring - SAM • Easy KPI for Corporate - KPIC • Easy Transition, Transformation and Migration - T2M • Easy Robotic Process Automation Implementation - RPAI • Easy Artificial Intelligence Implementation - A2I • Integration & Data Exposure as a Service - IDEAS DIAMS "starter kit" offers TRY IT & LOVE IT © DIAMS. All rights reserved.
  • 47. TAKE AWAY • We have all the technical tools needed to succeed • Do not forget data governance and change management • DV federates data, so try to co-finance your projects with other CxOs • Pick the right battles and data; dream big, deliver fast • You are not alone: DIAMS & Denodo can help you…
  • 48. Q&A
  • 49. THANK YOU Vincent BOUCHERON +32 496-08-77-33 Vincent.Boucheron@diams-it.com www.diams-it.com
  • 51. Use cases and successful implementations
  • 52. 52 Data Virtualization use cases Sales, HR, Executive, Marketing, Apps/API, Data Science, AI/ML
  • 54. 54 Case Study: Unified view into regulatory risk Business Need • Need a controlled data environment to support tougher regulatory requirements. • Information does not tie across all data silos. • Need a smart data governance initiative to avoid the garbage-in-garbage-out problem. Benefits • Enables faster time-to-market and incremental information delivery. • Helps CIT realize value from data - all data is successfully accessed through a single provisioning point instead of through legacy point-to-point integration. • Minimizes data replication and proliferation by eliminating data redundancy. Solution
  • 55. 55 Case Study: Logical Data Warehouse Business Need • Accelerate business operations in loans, deposits, and other departments through self-service reports and dashboards. • Establish a central information delivery platform to easily add new data sources from acquired companies. • The prior cloud-based data warehouse was too inflexible to accommodate new data sources. Benefits • Improved efficiency through self-service for business users in the loans, deposits, fraud, credit, and risk departments. • Reporting turnaround time improved from 2-3 days to 2 hours. • Business operations such as loan processing are handled in real time. Solution
  • 57. 57 Performance and optimization "What about performance?" is usually the first question we get about Data Virtualization. Many factors affect performance • Data sources, network latency, complexity of the query and processing, consumer ingestion rates, etc. Overall performance drivers • Minimize data moved through the network • Maximize 'local' data processing 'Move processing to the data'
  • 58. 58 Query Optimization Pipeline Parts of the optimization pipeline Query Parsing • Retrieves execution capabilities and restrictions for views involved in the query Static Optimizer • Query delegation • SQL rewriting rules (removal of redundant filters, tree pruning, join reordering, transformation push-up, star-schema rewritings, etc.) • Data movement query plans Dynamic Optimizer • Picks optimal JOIN methods and orders based on data distribution statistics, indexes, transfer rates, etc. Execution • Creates the calls to the underlying systems in their corresponding protocols and dialects (SQL, MDX, WS calls, etc.)
  • 59. 59 Performance Optimization Techniques Query Plans SQL is a declarative language: • Queries specify what users want, not how to get the data. • There are potentially many ways of executing a query. A query plan specifies a set of steps for executing a query or subquery. The optimizer has the goal of selecting the best plan. • First generate multiple query plans for each query. • Estimate the cost of each plan. • Select the plan with the minimum cost.
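The optimizer loop described on this slide (generate candidate plans, cost each one, keep the cheapest) can be sketched in a few lines. The plan shapes and cost figures below are invented for illustration; they are not Denodo's internal cost model.

```python
# Minimal sketch of cost-based plan selection: enumerate candidate query
# plans, estimate each plan's cost, and pick the one with the minimum cost.

def estimate_cost(plan):
    # Toy cost model: moving rows over the network dominates; local
    # processing operations are comparatively cheap.
    return plan["rows_moved"] * 10 + plan["local_ops"]

def choose_plan(candidate_plans):
    return min(candidate_plans, key=estimate_cost)

plans = [
    {"name": "federate_all",  "rows_moved": 220_000, "local_ops": 500},
    {"name": "push_down_agg", "rows_moved": 2_000,   "local_ops": 5_000},
]
best = choose_plan(plans)
print(best["name"])  # push_down_agg
```

The toy model already shows why push-down wins: shipping 220,000 raw rows costs far more than delegating the aggregation and moving only its result.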
  • 60. 60 Performance Optimization Techniques Static vs. Dynamic Optimization Static optimization: • Based on SQL transformations. • Rewrite the query in a more optimal way. • Remove redundancies, inactive sub-trees, etc. • Push-down delegation: • Optimize the query by pushing sub-trees down to the underlying data sources. Dynamic optimization: • Use statistics and indices to estimate the costs of alternative execution plans. • Select join methods and join ordering.
  • 61. 61 Performance Optimization Techniques Query Delegation Objective: Push the processing to the data. • Utilize power and optimizations of underlying data sources. • Especially relational databases and data warehouses. • Minimize expensive data movement. Delegation mechanisms: • Vendor specific SQL dialect. • Function delegation. • Configurable by data source. • Delegate SQL operations. • e.g. Join, Union, Group By, Order By, etc.
  • 62. 62 Performance Optimization Techniques Query Rewriting Goal: Rewrite query in an optimal way before the query is executed. Typical optimizations: • Simplify partitioned unions. • Remove redundant sub-views. • Remove unused join branches due to projections. • Transform outer joins to inner joins. • Static join reordering to maximize delegation. • Reordering of operations. • Full and partial aggregation push-down.
  • 63. 63 Performance Optimization Techniques Source Constraint Optimization Denodo Platform optimization has to work across multiple diverse data source types: • Not just relational databases. • Not all data sources have same capabilities. Recognize and optimize for constraints in underlying data sources: • e.g. MySQL can be ordered for Merge join… but a delimited file cannot.
  • 64. 64 Performance Optimization Techniques Data Movement Typically used when one dataset is significantly smaller and aggregations are performed on the joined data: 1. Execute the query in DS1 and fetch its data. 2. Create a temporary table in DS2 and insert the data from step 1. 3. When step 2 is completed, execute the JOIN in DS2 and return the results to the DV layer.
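The three data-movement steps above can be demonstrated with two SQLite databases standing in for DS1 and DS2. The table and column names are illustrative; the point is that the small dataset travels to the large one, and the join is then delegated to DS2.

```python
# Sketch of the "data movement" optimization: move the small DS1 dataset
# into a temporary table in DS2, then delegate the JOIN (and aggregation)
# to DS2 so that only the final result returns to the DV layer.

import sqlite3

ds1 = sqlite3.connect(":memory:")  # small source
ds2 = sqlite3.connect(":memory:")  # large source where the join will run

ds1.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
ds1.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Initech")])

ds2.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
ds2.executemany("INSERT INTO sales VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

# Step 1: execute the query in DS1 and fetch its data
rows = ds1.execute("SELECT id, name FROM customers").fetchall()

# Step 2: create a temporary table in DS2 and insert the data from step 1
ds2.execute("CREATE TEMP TABLE tmp_customers (id INTEGER, name TEXT)")
ds2.executemany("INSERT INTO tmp_customers VALUES (?, ?)", rows)

# Step 3: execute the JOIN in DS2 and return the results to the DV layer
result = ds2.execute("""
    SELECT c.name, SUM(s.amount)
    FROM sales s JOIN tmp_customers c ON s.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(result)  # [('Acme', 150.0), ('Initech', 75.0)]
```

Only the aggregated result crosses the network, instead of the full sales table.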
  • 65. 65 Caching Real time vs. caching Sometimes, real-time access & federation is not a good fit: • Sources are slow (e.g. text files, cloud apps like Salesforce.com) • A lot of data processing is needed (e.g. complex combinations, transformations, matching, cleansing, etc.) • Access is limited, or impact on the sources has to be mitigated For these scenarios, Denodo can replicate just the relevant data in the cache
  • 66. 66 Caching Overview Based on an external relational database • Traditional: Oracle, SQL Server, DB2, MySQL • MPP: Teradata, Netezza, Vertica • Cloud-based: Amazon Redshift, Snowflake • In-memory storage: Oracle TimesTen, SAP HANA Works at view level • Allows hybrid access (real-time / cached) within an execution tree Cache control • Manual - user initiated at any time • Time based - using the TTL or the Denodo Scheduler • Event based - e.g. using JMS messages triggered in the DB
  • 67. 67 Caching Caching options Denodo offers two different types of cache • Partial: • Query-by-query cache • Useful for caching only the most commonly requested data • More adequate to represent the capabilities of non-relational sources, like web services or APIs with input parameters • Full: • Similar to the concept of materialized view • Incrementally updateable at row level to avoid unnecessary full refresh loads • Offers full push-down capabilities to the source, including group by and join operations • Supports hybrid incremental queries for SaaS data sources (next slide)
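A "partial" (query-by-query) cache, as described above, behaves like parameter-keyed memoization with a TTL. The sketch below is an illustrative stand-in, not Denodo's cache engine; `slow_source` plays the role of a web service or API with input parameters.

```python
# Sketch of a partial (query-by-query) cache: results are stored per
# input-parameter combination with a time-to-live, so only the most
# commonly requested data is kept and refreshed.

import time

class PartialCache:
    def __init__(self, fetch, ttl_seconds):
        self.fetch = fetch
        self.ttl = ttl_seconds
        self.store = {}  # params -> (timestamp, result)

    def query(self, **params):
        key = tuple(sorted(params.items()))
        hit = self.store.get(key)
        if hit and time.time() - hit[0] < self.ttl:
            return hit[1]                      # served from cache
        result = self.fetch(**params)          # cache miss: hit the source
        self.store[key] = (time.time(), result)
        return result

calls = []
def slow_source(country):
    calls.append(country)          # record each real source access
    return [f"client_of_{country}"]

cache = PartialCache(slow_source, ttl_seconds=60)
cache.query(country="BE")
cache.query(country="BE")          # second call served from cache
print(calls)                       # ['BE'] - the source was hit only once
```

A "full" cache would instead pre-load the whole view, closer to a materialized view, as the slide notes.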
  • 68. 68 Caching Incremental Queries Merge cached data and fresh data to provide fully up-to-date results with minimum latency Leads changed / added since 1:00AM CACHE Leads updated at 1:00AM Up-to-date Leads data 1. Salesforce ‘Leads’ data cached in Denodo at 1:00 AM. 2. Query needing Leads data arrives at 11:00 AM 3. Only new/changed leads are retrieved through the WAN. 4. Response is up-to-date but query is much faster.
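The incremental-query merge above (cached rows plus only the rows changed since the last cache load) can be sketched as a keyed merge. The record layout is invented for illustration.

```python
# Sketch of a hybrid incremental query: merge rows cached at the last
# refresh with the delta of rows added or changed since then, keyed by id.
# Delta rows override their cached counterparts.

def merge_incremental(cached_rows, delta_rows):
    merged = {row["id"]: row for row in cached_rows}
    for row in delta_rows:
        merged[row["id"]] = row        # new or changed row wins
    return sorted(merged.values(), key=lambda r: r["id"])

cached = [                             # as of the 1:00 AM cache load
    {"id": 1, "status": "open"},
    {"id": 2, "status": "open"},
]
delta = [                              # changed/added since 1:00 AM
    {"id": 2, "status": "converted"},
    {"id": 3, "status": "open"},
]
print(merge_incremental(cached, delta))
```

Only the delta crosses the WAN, yet the merged result is fully up to date.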
  • 69. 69 MPP Query Acceleration • Denodo 7.0 supports using MPP cluster to accelerate queries • Hive, Spark, Impala, Presto • Operations that can be parallelized can be moved to MPP cluster • e.g. GROUP BY aggregations • Data is copied to cluster and operation is delegated for processing • Data copied in Parquet file • Results returned to Denodo Platform • Does not require any special commands from user
  • 70. 70 Example Scenario Sales by state over the last four years. Scenario: • Current data (last 12 months) in EDW • Historical data offloaded to Hadoop cluster for cheaper storage • Customer master data is in the RDBMS Very large data volumes: • Sales tables have hundreds of millions of rows join group by State union Current Sales 68 million rows Historical Sales 220 million rows Customer 2 million rows (RDBMS)
  • 71. 71 MPP Query Acceleration Scenario: Current Sales (68 M rows) union Historical Sales (220 M rows), joined to Customer (2 M rows), grouped by State. Results: simple federation (other systems) ~19 min; aggregation push-down without MPP, 43 sec; aggregation push-down + MPP integration (Impala, 4 nodes), 26 sec. 1. Partial aggregation push-down: maximizes source processing and dramatically reduces network traffic (the group by customer ID returns only 2 M rows of sales per customer). 2. Integrated with the cost-based optimizer: based on data volume estimates and the cost of these operations, the CBO can decide to move all or part of the execution tree to the MPP. 3. On-demand data transfer: Denodo automatically generates and uploads Parquet files. 4. Integration with local data: the engine detects when data is cached or already comes from a local table in the MPP. 5. Fast parallel execution: support for Spark, Presto and Impala for fast analytical processing on inexpensive Hadoop-based solutions.
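The partial aggregation push-down in point 1 above is the key trick: each source computes its own partial GROUP BY, and the virtualization layer merges only the small partial results instead of moving hundreds of millions of raw rows. A minimal sketch with invented data:

```python
# Sketch of partial aggregation push-down: per-source partial GROUP BY,
# then a cheap merge of the partial aggregates in the DV layer.

from collections import Counter

def partial_aggregate(rows):
    """Executed *inside* each source: sales total per state."""
    totals = Counter()
    for state, amount in rows:
        totals[state] += amount
    return totals

def merge_partials(partials):
    """Executed in the DV layer: combine the per-source partial aggregates."""
    merged = Counter()
    for p in partials:
        merged.update(p)               # Counter.update adds counts
    return dict(merged)

current_sales = [("NY", 100), ("CA", 200), ("NY", 50)]      # e.g. the EDW
historical_sales = [("NY", 500), ("TX", 300), ("CA", 100)]  # e.g. the Hadoop cluster

result = merge_partials([partial_aggregate(current_sales),
                         partial_aggregate(historical_sales)])
print(result)  # {'NY': 650, 'CA': 300, 'TX': 300}
```

Each source ships one row per state rather than one row per sale, which is why the slide's execution time drops from minutes to seconds.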
  • 72. 72 Example: Denodo 7.0 MPP Acceleration ‘Non-optimized’ Execution Optimized Execution with MPP Acceleration Execution Time: 19 mins (1,142 seconds) Execution Time: 26.6 secs (26,640 ms)
  • 74. 74 Governance • Governance is a very broad topic • More than just data access and delivery • Data Virtualization can play an important role in overall data governance • But it’s not the whole story by itself • End-to-end governance, metadata management, data lineage, etc. require other tools • Data Virtualization has “1 degree of visibility”
  • 75. 75 Enterprise Governance Data Lineage • Find source of ‘truth’ – top down – shows where data comes from and/or how it is derived. Source Refresh • Detect changes in underlying data sources and propagate to the affected data services. Impact Analysis • Analyze impact of metadata changes in workflows where the modified view is used. Catalog Search • Have a complete understanding of each of the views and data services created in Denodo.
  • 76. 76 Metadata Management Data Virtualization Platform collects lots of metadata • Data source metadata • By introspection or configuration • Operational metadata • How data changes as it flows through Data Virtualization layer • Generated by the Data Virtualization Platform based on ‘model’ built by developers • Business metadata • Enriched metadata either imported or added by ‘data steward’ • e.g. view and field descriptions Metadata Categories Technical Metadata Operational Metadata Business Metadata
  • 77. 77 Metadata Introspection • Denodo Platform gathers metadata from data sources: • Automatically or via configuration. • Maps native data types to ‘Denodo types’. • Inspects indexes in the sources. • Analyzes source query capabilities and abstracts them into common model. • Stores all metadata and configuration data in Metadata repository: • Uses built-in Apache Derby database. • Small size – only stores metadata…actual data is retrieved in real time from sources or cache.
  • 78. 78 Data Lineage • Graphical view for showing data lineage for any field in any virtual view. • Trace source of any field: • Includes any functions applied to field contents. • Trace source of calculated fields: • View calculations used to create new fields.
  • 79. 79 ‘Used By’ Tool • Graphical view for showing where a view is used. • “Big picture” view of usage. • Useful tool for seeing impact of changes on whole system.
  • 80. 80 Impact Analysis example: Adding a new field 1 Views affected by the change 2 Web Services affected by the change 3 Option to propagate new field individually per view 4 Preview of the Tree view of the affected views
  • 81. 81 Metadata Integration • Export all metadata - technical, operational, business • APIs and stored procedures • Integration with governance tools • IBM IGC, Collibra, Informatica Enterprise Information Catalog (EIC) • The Data Virtualization Platform represents the "as implemented" data asset (Diagram: the Denodo Governance Bridge authenticates, registers the Denodo asset type, and publishes assets & flows to Information Server / Information Governance Catalog via the IGC REST API.)
  • 82. 82 Data Quality & Integrity • Data Virtualization can help with data quality • Apply data quality functions as data is requested • e.g. address lookup and validation routines • But…serious data cleansing – e.g. matching and deduping – is not recommended • Use a DQ tool or MDM • Data Virtualization forces you to think about the best source of accurate data • ‘Customer’ view – which are the best sources for customer data? • Manual process to decide and build views
  • 83. 83 Data Access • Making data available to the users who need it • Based on need, not access to databases or applications • Managing and auditing data access • Ensure (and prove) compliance with security policies • Regulatory, geographic, contractual, and organizational data access compliance
  • 87. 87 Security Unified Security Management through Data Virtualization. • Data Virtualization offers an abstraction layer that decouples sources from consumer applications. • Single Point for accessing all the information avoiding point-to-point connections to sources. • As a single point of access, this is an ideal place to enforce security: • Access restrictions to sources are enforced here. • They can be defined in terms of the canonical model (e.g. access restrictions to “Bill”, to “Order”, and so on) with a fine granularity.
  • 89. 89 Secure Access Data Virtualization secures the access from consumers to sources: • Consumer to Denodo Platform (northbound): • Communications between consumer applications and the Data Virtualization layer can be secured, typically using SSL (data in motion). • Denodo Platform to sources (southbound): • Communications between the Data Virtualization layer and the sources can be secured too. • The specific security protocol depends on the source: SSL, HTTPS, sFTP, … (data in motion). • Data can be both read and exported encrypted (data at rest).
  • 90. 90 Denodo Platform Authentication – Northbound • Client application -> Denodo Platform. • Three options: • Usernames and passwords defined within the Denodo Platform. • Delegate the authentication to an external LDAP/AD server. • Use Kerberos for Single Sign On.
  • 91. 91 Denodo Platform Authentication – Southbound • Denodo Platform -> data source. • Three options (for each individual source): • Use a service account for the source. • The admins create a user account in the source. • The Denodo Platform always uses those credentials. • Use Kerberos authentication. • Use credentials pass-through. • Access the data source with the username and password combination or the Kerberos ticket that was used to authenticate with the Denodo Platform northbound.
  • 92. 92 Denodo Platform Authorization • Role-based authorization. • Users/roles can be defined in the Data Virtualization layer and assigned specific permissions. • Fine-grained authorization. • Several permission scopes: • Virtual database level (e.g. credit risk database, etc.). • View level (e.g. "Regional Risk Exposure", etc.). • Row level (filter rows that are not authorized). • Column level: ▪ Grant/block access. ▪ Data masking (hiding sensitive fields).
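The row-level and column-level scopes above can be sketched as a policy function applied as results flow through the virtualization layer. The roles, filter rules, and data below are illustrative assumptions, not Denodo's actual permission model.

```python
# Sketch of fine-grained authorization: per-role row-level filters and
# column masking applied to query results before they reach the consumer.

ROW_FILTERS = {"analyst": lambda row: row["region"] == "EU"}  # row-level rule
MASKED_COLUMNS = {"analyst": {"iban"}}                        # column-level rule

def mask(value):
    """Partial masking: keep a short prefix, redact the rest."""
    return value[:4] + "****" if isinstance(value, str) else "****"

def apply_policy(role, rows):
    keep = ROW_FILTERS.get(role, lambda r: True)
    masked = MASKED_COLUMNS.get(role, set())
    out = []
    for row in rows:
        if not keep(row):
            continue  # row-level security: drop unauthorized rows
        out.append({k: (mask(v) if k in masked else v) for k, v in row.items()})
    return out

data = [
    {"client": "Acme", "region": "EU", "iban": "BE71096123456769"},
    {"client": "Initech", "region": "US", "iban": "US12345678901234"},
]
print(apply_policy("analyst", data))
# [{'client': 'Acme', 'region': 'EU', 'iban': 'BE71****'}]
```

A role with no rules (e.g. an admin) passes rows through unchanged, matching the next slide's "Admin sees everything, Analyst has PII masked" example.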
  • 93. 93 Role-Based Data Privacy • Control what data is visible based on user role • e.g. Admin sees everything, Analyst has PII masked • Masking can be encryption, tokenization, partial masking, or redaction • Built-in and custom functions allow partial masking, tokenization, etc. • More complex logic is also possible • e.g. HIPAA Safe Harbor zip code handling using in-memory look-up maps • e.g. anonymization of credit card owners and transactions for pattern analysis
  • 94. 94 Role Based Access Controls - Permissions
  • 95. 95 Policy Based Security • Custom policies allow developers to provide their own access control rules. • Developers can code their own custom access control policies, and the administrator can assign them to one (or several) users/roles on a view in Denodo (or on a whole database). (Diagram: data consumers - users, apps - query Denodo; custom policies, optionally backed by an external policy server such as Axiomatics, either accept the query with filtering and masking when the conditions are satisfied, or reject it before the data sources are accessed.)
  • 96. 96 Policy Based Security: Example Dynamic authorization based on policies • Example: set limits on the number of queries executed by a given user/role; determine whether a query can be executed depending on the time of day; or leverage the access policies in an external policy server.
  • 97. 97 Denodo Platform Authorization Hierarchical role definition • A role can inherit and redefine an existing role at any level in the tree.
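Hierarchical role definition, as described above, means a role's effective permissions are the union of its own permissions and those inherited up the tree. The role tree and permission names below are invented for illustration.

```python
# Sketch of hierarchical role resolution: a role inherits the permissions
# of its parent roles, at any depth, and effective permissions are the
# union of everything collected while walking up the tree.

ROLE_PARENTS = {"risk_analyst": ["analyst"], "analyst": ["employee"], "employee": []}
ROLE_PERMS = {
    "employee": {"read:public_views"},
    "analyst": {"read:risk_views"},
    "risk_analyst": {"read:exposure_detail"},
}

def effective_permissions(role, seen=None):
    seen = seen if seen is not None else set()
    if role in seen:                   # guard against cycles in the role graph
        return set()
    seen.add(role)
    perms = set(ROLE_PERMS.get(role, set()))
    for parent in ROLE_PARENTS.get(role, []):
        perms |= effective_permissions(parent, seen)
    return perms

print(sorted(effective_permissions("risk_analyst")))
# ['read:exposure_detail', 'read:public_views', 'read:risk_views']
```

Redefining a permission at a lower level would simply add an override lookup before the inherited union; the sketch keeps only the inheritance part.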
  • 98. Live demo with Denodo
  • 100. 100 Next Steps Access Denodo Platform in the Cloud! Take a Test Drive today! www.denodo.com/TestDrive GET STARTED TODAY
  • 101. Q&A
  • 102. Thank you! www.denodo.com info.emea@denodo.com +33 (0)1 42 68 51 27 www.satisco.be first@satisco.com +32 (0)22 060 710 www.dynafin.be info@dynafin.be +32 (0)2 210 57 40 www.diams-it.com vincent.boucheron@diams-it.com +32 496 08 77 33