A 5-step methodology for complex E&P data management

Raising data management standards

www.etlsolutions.com
The increasing complexity of E&P data

New devices are being used in every phase of Exploration & Production (E&P) in the Oil & Gas industry, gathering more data with which better decisions can be made.

Timescales are collapsing. Once, drilling and logging data were distinct activities separated by days, but now they happen simultaneously.

These changes are being factored into the development of industry standards (such as PPDM), driving their evolution to ensure continued use.

Metadata (in the Dublin Core and ISO 19115 sense) are becoming ever more important in providing context. This has a direct impact on proprietary database design and functionality.

The price of progress is growing data complexity.
A 5-step methodology for managing this data

To make a robust and repeatable approach work, we use Transformation Manager, our data integration toolset. The Transformation Manager software is coupled with the approach we have adopted over many years in the Oil & Gas industry.

The result is a five-stage methodology.
Step 1

  Separate source and target data models and the logic which
  lies between them.

• This means that we can isolate the pure model structure and
  clearly see the elements, attributes and relationships in each
  model.
• We can also see detail such as database primary keys and
  comments.
• As exposing relationships is key to handling PPDM and other
  highly normalized models, this is a critical step.
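As a sketch, an isolated model structure might be represented like this. All names here (Attribute, Relationship, WELL, WELLBORE, and so on) are illustrative, not Transformation Manager's actual representation:

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    datatype: str
    is_primary_key: bool = False  # primary-key detail is visible in the model
    comment: str = ""

@dataclass
class Relationship:
    name: str
    parent: str
    child: str
    cardinality: str  # e.g. "1..*"

@dataclass
class Element:
    name: str
    attributes: list = field(default_factory=list)

@dataclass
class DataModel:
    """A pure model: elements, attributes and relationships,
    with no storage-level detail mixed in."""
    name: str
    elements: list = field(default_factory=list)
    relationships: list = field(default_factory=list)

# A fragment of a highly normalized well model
well = Element("WELL", [Attribute("UWI", "string", is_primary_key=True,
                                  comment="Unique well identifier")])
bore = Element("WELLBORE", [Attribute("UBHI", "string", is_primary_key=True)])
source_model = DataModel("SourceModel", [well, bore],
                         [Relationship("has_bores", "WELL", "WELLBORE", "1..*")])
```

Holding the model in this form makes the relationships explicit and inspectable, which is exactly what highly normalized models such as PPDM demand.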
Step 2

  Separate the model from the mechanics of data storage.


• The mechanics define physical characteristics such as ‘this is
  an Oracle database’ or ‘this flat file uses a particular delimiter
  or character set’. It is the model that tells us things like ‘a well
  can have many bores’, ‘a wellbore many logs’, and that ‘log
  trace mnemonics’ are catalogue controlled.
• At a stroke, this separation abolishes a whole category of
  complexity.
• For both source and target we need a formal data
  model, because this enables us to read or write to a
  database, XML, a flat file, or any other data format.
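The separation can be sketched as an abstract storage interface behind which the mechanics live. This is a minimal illustration under assumed names, not the toolset's actual API:

```python
from abc import ABC, abstractmethod

class DataSource(ABC):
    """Storage mechanics only: delimiters, character sets, drivers.
    The model ('a well can have many bores') lives elsewhere."""
    @abstractmethod
    def read_records(self, entity):
        ...

class DelimitedSource(DataSource):
    """One concrete mechanic: a delimited flat file, fed here as lines."""
    def __init__(self, lines, delimiter=","):
        self.lines = list(lines)
        self.delimiter = delimiter

    def read_records(self, entity):
        # First line is the header; each following line becomes a record
        header = self.lines[0].split(self.delimiter)
        return [dict(zip(header, row.split(self.delimiter)))
                for row in self.lines[1:]]

src = DelimitedSource(["UWI|WELL_NAME", "W-001|Alpha"], delimiter="|")
records = src.read_records("WELL")
```

Swapping `DelimitedSource` for an Oracle-backed or XML-backed implementation changes the mechanics without touching the model, which is the whole point of the separation.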
Step 3

    Specify relationships between source and target.

•   In all data integration projects, determining the rules for the data
    transfer is a fundamental requirement usually defined by analysts
    working in this field, often using spreadsheets.
•   But based on these or other forms of specification, we can create the
    integration components in Transformation Manager using its descriptive
    mapping language. This enables us to create a precisely defined
    description of the link between the two data models.
•   From this we can generate a runtime system which will execute the
    formal definitions. Even if we choose not to create an executable
    link, the formal definition of the mappings is still useful: it
    shows where the complexity in the PPDM integration lies, and the
    formal syntax can be shared with others to verify our interpretation
    of their rules.
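A descriptive mapping can be pictured as declarative rules plus a small interpreter that executes them. This is a hypothetical sketch, not Transformation Manager's mapping language:

```python
# Hypothetical declarative mapping rules: source field -> target field,
# optionally through a transform.
MAPPINGS = [
    {"source": "UWI",       "target": "uwi"},
    {"source": "SPUD_DATE", "target": "spud_date",
     "transform": lambda v: v.replace("/", "-")},
]

def apply_mappings(record, mappings):
    """Execute the formal mapping definitions against one source record."""
    out = {}
    for rule in mappings:
        value = record[rule["source"]]
        if "transform" in rule:
            value = rule["transform"](value)
        out[rule["target"]] = value
    return out

target = apply_mappings({"UWI": "W-001", "SPUD_DATE": "2001/05/12"}, MAPPINGS)
```

Because the rules are data rather than code, the same definitions can be rendered for review by analysts or handed to a runtime for execution.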
Step 4
    Follow an error detection procedure.
•   To ensure that only good data is stored, Transformation Manager has a robust
    process of error detection that operates like a series of filters. For each phase, we
    detect errors relevant to that phase and we don't send bad data to the next
    phase, where detection becomes even more complex.
•   We detect mechanical and logical errors separately. If the source is a flat file, a
    mechanical error could be malformed lines; logical errors could include dangling
    foreign key references or missing data values.
•   Next, we can detect errors at the mapping level, inconsistencies that are a
    consequence of the map itself. Here, for example, we could detect that we are trying
    to load production data for a source well which does not exist in the target.
•   Finally there are errors where the data is inconsistent with the target logical model.
    Here, simple tests (a string value is too long, a number is negative) can often be
    automatically constructed from the model. More complex tests (well bores cannot
    curve so sharply, these production figures are for an abandoned well) are built using
    the semantics of the model.
•   A staging store is very useful in providing an isolated area where we can disinfect the
    data before letting it out onto a master system. Staging stores were an integral part of
    the best-practice data loaders we helped build for a major E&P company, and it is
    now common practice to hold suspect records there until issues are resolved.
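The series-of-filters idea can be sketched as phased checks where rejected data never reaches the next phase. Field names and checks are illustrative only:

```python
def mechanical_check(line, expected_fields, delimiter=","):
    """Phase 1: is the line even well formed?"""
    return len(line.split(delimiter)) == expected_fields

def logical_check(record, known_wells):
    """Phase 2: dangling references and missing values."""
    return bool(record.get("UWI")) and record["UWI"] in known_wells

def run_phases(lines, known_wells):
    """Filter chain: bad data is quarantined, never forwarded."""
    good, errors = [], []
    for line in lines:
        if not mechanical_check(line, expected_fields=2):
            errors.append(("mechanical", line))
            continue  # malformed lines stop here
        uwi, name = line.split(",")
        record = {"UWI": uwi, "WELL_NAME": name}
        if not logical_check(record, known_wells):
            errors.append(("logical", line))
            continue  # dangling references stop here
        good.append(record)
    return good, errors

good, errors = run_phases(
    ["W-001,Alpha", "malformed line", "W-999,Ghost"],
    known_wells={"W-001"})
```

The `errors` list plays the role of the staging store: each rejected item carries the phase that caught it, so issues can be resolved in isolation.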
Step 5

  Execute a runtime link to generate the code required to
  perform the integration.

• This will generate integration components, in the form of Java
  code, which can reside anywhere in the architecture.
• This could be on the source, target or any other system to
  manage the integration between PPDM and non-PPDM data
  sources.
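The generation step can be pictured as emitting a standalone component from the mapping definitions. The real toolset emits Java; this sketch uses Python purely for illustration, and the names are hypothetical:

```python
def generate_component(mappings):
    """Emit source code for a standalone transform component
    derived from declarative mapping rules."""
    lines = ["def transform(record):", "    out = {}"]
    for rule in mappings:
        lines.append(f"    out[{rule['target']!r}] = record[{rule['source']!r}]")
    lines.append("    return out")
    return "\n".join(lines)

code = generate_component([{"source": "UWI", "target": "uwi"}])
ns = {}
exec(code, ns)  # the generated component can be deployed and run anywhere
result = ns["transform"]({"UWI": "W-001"})
```

Because the output is plain source code with no dependency on the generator, the component can sit on the source system, the target system, or any middleware in between.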
Our offerings: E&P data management

• Transformation Manager software
• Transformation Manager data loader developer kits
• Support, training and mentoring services
• Data loader and connector development
• Data migration packaged services
Why Transformation Manager?

For the user:
• Everything under one roof
• Greater control and transparency
• Identify and test against errors iteratively
• Greater understanding of the transformation requirement
• Automatic documentation
• Re-use and change management
• Uses domain-specific terminology in the mapping
Why Transformation Manager?

For the business:
• Reduces cost and effort
• Reduces project risk
• Delivers higher quality and reduces errors
• Increases control and transparency in development
• Single product
• Reduces time to market
Contact information

Karl Glenn
kg@etlsolutions.com
+44 (0) 1912 894040

Raising data management standards

www.etlsolutions.com
