Cognizant 20-20 Insights

POS Data Quality: Overcoming a
Lingering Retail Nightmare
By embracing a holistic and repeatable framework, retailers can first
pilot and then remediate data quality issues incrementally, overcoming
the planning challenges created by heterogeneous POS systems.
Executive Summary
At many retail organizations, flawless business
execution depends on in-store and back-office
systems working together in harmony. Multiple
and diverse point-of-sale (POS) systems pump
data to major internal applications that drive key
processes, such as planning merchandise assortments, item allocation, replenishment and key
performance indicator (KPI) metrics reporting.
In reality, however, significant POS data quality
issues and inconsistencies can and often do
disrupt business operations.
This white paper details a remediation approach
that helps retailers validate and enhance POS data
quality. The approach enables access to accurate,
on-time data feeds, thus improving strategic decision-making throughout the business.

Issues with Diverse POS Systems
It’s not uncommon for retailers to use a mix of
POS systems that are as varied as their product
portfolios: old and new, open source and proprietary, cloud and on-premises.1 This is due to the timing of POS system retirement and replacement. New systems are usually deployed at a location/country level, leaving the other units within a franchise group or region with incompatible POS systems that have been retained following a merger or for other business reasons.

cognizant 20-20 insights | january 2014
Whatever the rationale, retail chains of all sizes
often utilize a medley of POS systems. These
diverse, heterogeneous IT environments can
deliver reasonable business value, but they are
not suitable for every organization, especially
when POS data is needed by the consuming applications in real time. The issue is made even more
complex because of different POS systems and
data formats, as well as the absence of proper
audit/validation. Both of these issues need to
be rationalized before the data reaches the
applications because they can adversely affect
weekly allocation and replenishment capabilities,
decision support, reporting accuracy, planning
processes and various operational processes.
Major issues that can impact POS data include:

•	Missing data: Data/information is not available in time for use in business processes.

•	Duplicate data: The same data/information is received more than once.

•	Suspicious or incorrect data: Data is present but incomplete, erroneous or unreliable.

•	Delayed data: Data is received beyond the optimal timeframe.
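Once each feed carries a transaction identifier and an arrival timestamp, the four categories above can be detected mechanically. A minimal sketch in Python; the record fields, the six-hour cutoff and the negative-amount rule are illustrative assumptions, not part of the paper:

```python
from datetime import datetime, timedelta

CUTOFF = timedelta(hours=6)  # assumed maximum acceptable feed delay

def classify_pos_issues(expected_txn_ids, records, business_close):
    """Bucket POS records into the four issue categories."""
    issues = {"missing": [], "duplicate": [], "suspicious": [], "delayed": []}
    seen = set()
    for rec in records:
        tid = rec["txn_id"]
        if tid in seen:                       # same data received more than once
            issues["duplicate"].append(tid)
            continue
        seen.add(tid)
        if rec.get("amount") is None or rec["amount"] < 0:
            issues["suspicious"].append(tid)  # incomplete or unreliable values
        if rec["received_at"] - business_close > CUTOFF:
            issues["delayed"].append(tid)     # arrived beyond the optimal window
    # Expected but never received at all
    issues["missing"] = sorted(expected_txn_ids - seen)
    return issues
```

A real implementation would derive the expected transaction set from store registers and trading calendars; the point is that each issue type maps to a simple, automatable rule.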
As an end-state objective, POS data quality must
be improved to enable retailers to reach several
essential goals, including enhanced planning,
forecasting and reporting; improved efficiency
through faster data reconciliation; and the ability
to “read” and respond to changes in business performance.

Remediating POS, One Step at a Time

While a long-term solution would be a centralized data clearinghouse, such an implementation would not address the immediate needs of the business. To mitigate this problem, an interim solution can be outlined that strengthens data quality for partial store populations, one population at a time, starting with the most problematic populations.

The steps involved in this solution include the following:

•	Pilot before deployment: For starters, retailers need to take stock of their data quality validation (DQV) process and uncover key issues (see the Quick Take sidebar). DQV processes ensure that programs are operating with clean, correct and useful data. Such a pilot should be carried out with two objectives in mind:

>>	Demonstrate the data quality validation process via a proof of concept on a selected population of stores. Identify and catalog data quality issues, establish root causes, develop possible corrective actions (including the conversion/upgrade of low-performing franchisee stores) and, when appropriate, implement the corrective actions.

>>	Based on the lessons learned from the work above, design, build and document a repeatable framework (tools and processes) aimed at identifying, analyzing and fixing the issues, as well as improving POS data quality for additional or subsequent populations of stores and data feeds. This framework can then be used to carry out similar DQV exercises across other populations or geographies.

•	Build a repeatable framework: The process of building and designing the repeatable framework to address the immediate needs of the business will consist of a series of steps to be performed by multiple, collaborating stakeholders, such as IT, key business partners and source system data providers. This flow of activities is depicted in Figure 1.

Figure 1: Orchestrating the POS Partner Ecosystem. A. Infrastructure requirements; B. Scope definition; C. Interface documentation; D. Requirements gathering for the consuming application; E. Documentation of known POS data issues; F. Use and maintenance of database clearinghouse; G. Collection of POS reports; H. Collection of live data feeds; I. Development and maintenance of variance reports; J. Data fixes and remediation. Stakeholders: source system resources, business partners and IT resources (all steps).

•	Infrastructure requirements: Capture the environment and access requirements needed prior to the start of the POS data quality validation exercise. Include infrastructure requirements for production platforms, as well as those specific to the DQV platform (e.g., data capture file folders, mini-clearinghouse database, ad hoc programs, reports requirements, etc.). Having this completed before starting the DQV exercise will ensure hassle-free execution, with the project team focused on actual DQV activities rather than infrastructure problems.

•	Scope definition: Identify the key elements that need to be defined as in- or out-of-scope for the next POS DQV exercise, such as a list of stores (categories based on affiliate, business model, POS system, etc.), a list of interfaces/data feeds, variance reports, etc.

Because a scope definition explains which elements are in or out of scope and why, this activity helps to precisely define the target elements of a particular execution of the DQV exercise/process.

•	Interface documentation: Describe the key information that needs to be collected, documented and verified to ensure full understanding and usability of the overall POS feeds integration architecture.

Interface documentation is a crucial factor in the success of the exercise, as it provides a strong understanding of the data that is transmitted from the POS systems via the interfaces and helps determine whether adequate information is available to proceed with the DQV exercise. In the case of a data type or data definition mismatch, the interface documentation can expose the failure points in the data channels if they pertain to mapping issues between the various integration points.

•	Requirements gathering for consuming applications: Capture the requirements that are specific to each consuming application (such as replenishment, allocation and merchandising applications) regarding POS data feeds. This ensures that the validation takes into account and analyzes all critical data elements present in the POS feeds.

This step is critical to gaining insight into what the target applications expect the interface data feeds to contain and why. It also provides knowledge of specific data processing, filtering and logic for the affiliate POS interface in one or more of the consuming applications.

•	Documentation of known POS data issues: Catalog historic and live POS data issues and analyze how they can be used in a POS DQV exercise, including the “accretion” of documented DQV issues from one iteration to the next.

This activity should be performed to gather all the information for the known issues and to understand the scale and scope of data quality issues, the frequency of occurrence of those issues and the most problematic interfaces for the affiliate in question. This also helps the project team in the data analysis exercise down the line.

•	Use and maintenance of a database clearinghouse: Build the custom mini-clearinghouse database, as well as the copy and load processes for POS reports and live data feeds. These processes will be reused, configured or updated for feeding the POS data into the mini-clearinghouse database.

This phase involves creating data structures, uploading captured data and performing other supporting processes related to the mini-clearinghouse database.

•	Collection of POS reports: This is one of the data collection activities. In this step, the various POS reports are reviewed, captured and pre-processed to make sure they are ready for the mini-clearinghouse.

•	Collection of live data feeds: This is another data collection step, which includes review, validation, capture and pre-processing of the data feeds inbound to and outbound from each integration point, preparing them for processing in the mini-clearinghouse.

This phase deals with the capture of in-scope data feeds as they are made available to the various in-scope integration points.

•	Development and maintenance of variance reports: Identify and develop the variance reports to capture various data exceptions and variances. A follow-up root cause analysis exercise helps identify the possible reasons for the exceptions.

This phase explains which activities need to be performed so that the data loaded in the mini-clearinghouse can be analyzed to understand the data quality issues.

•	Data fixes and recommendations: Review, assess, escalate, resolve and identify recommendations and actions. Follow-up issues and data fixes depend heavily on the local POS and system integrations.

The objective of this phase is to build a prioritization, retention/rejection, remediation and implementation process to be followed for identified exceptions.
•	Implement the framework for “all” stores: This is a recurring step in which the retailer needs to implement the repeatable framework to resolve data quality issues for all stores. It is very important to acknowledge the differences among dissimilar groups of stores and adapt or customize the framework before using it for this exercise. In addition, lessons learned from past implementations should be utilized to improve the framework for speedier and more efficient roll-outs.
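Operationally, the repeatable framework amounts to running the same ordered sequence of steps (the flow of Figure 1) against each store population, with only the scope configuration changing per geography. A minimal sketch of that idea; the step names follow Figure 1, while the placeholder step bodies and store identifiers are invented for illustration:

```python
def run_dqv(scope, steps):
    """Execute the DQV steps in order for one store population and collect
    each step's findings. Rerunning with a new scope reuses the same
    framework for another geography or set of stores."""
    findings = {}
    for name, step in steps:  # preserve the Figure 1 ordering
        findings[name] = step(scope)
    return findings

# Placeholder step implementations; real ones would inspect POS feeds,
# load the mini-clearinghouse, generate variance reports, and so on.
steps = [
    ("infrastructure_requirements", lambda s: "ready"),
    ("scope_definition", lambda s: {"stores": s["stores"]}),
    ("variance_reports", lambda s: []),  # no exceptions in this toy run
]

result = run_dqv({"stores": ["paris-01", "paris-02"]}, steps)
```

Because the step list is data, adapting the framework for a dissimilar store group (as the text advises) is a matter of swapping or reconfiguring steps rather than rebuilding the process.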

Drivers for Success
Key success criteria that should be examined to
ensure a successful DQV exercise include:

•	Data issue fixes and remediation: Since the objective of this exercise is to resolve possible data quality issues, the main success criteria should stipulate that issues be fixed upon identification.

•	Issue identification: Ensure the tool that is built as part of the repeatable process framework is capable of identifying the issues as and when they occur.

Quick Take
POS Data Validation for a Global Apparel Retailer
We worked with a retailer that was struggling with
data quality issues in its 40-plus POS systems
that sent data to its internal applications. The
issue originated in its European region, where
22 affiliates use five different POS interfaces and
eight consuming systems, which added to the
data quality complexity.
We partnered with the retailer’s IT and business
teams to carry out a DQV exercise for its Eastern
Europe unit and helped build a repeatable
framework to be followed in other regions. The
15-week engagement addressed the DQV needs
for 40-plus stores, with data flowing through
four different POS systems. The exercise was conducted in two different tracks. The first track
was to build the repeatable process/framework;
the second was to test the framework with data
from different sets of stores.
Figure 2 offers a snapshot of the activities
performed to conduct the POS data quality
analysis exercise.
Roughly 2,500 issues were identified and fixed
through this pilot POS data validation exercise.
The client is in the process of rolling out the
repeatable process framework to other countries
or affiliates in Europe.

Figure 2: Activities Performed During the DQV Exercise. A week-by-week timeline of the engagement, covering infrastructure requirements; scope definition; interface documentation (knowledge acquisition and documentation); requirements gathering for consuming applications (knowledge acquisition and documentation); documentation of known EPOS data issues; use and maintenance of the DB clearinghouse (construction and preparation); collection of EPOS reports and of live EPOS data feeds (data capture and data load across the data sampling window); execution and development of variance reports; analysis of variances (execution, pre-analysis and root cause analysis); and data fixes and remediation.

•	Issue documentation: To better understand issues for their fix and resolution, the tool should be able to document the issues completely and in a timely manner.

•	Communication: The tool’s ability to communicate issues to the team, along with a proper root cause analysis in near-real-time, influences timely and effective issue resolution.

•	Degree of repeatability: The DQV exercise should not be limited to only one geography or one set of stores. The tool/framework must, therefore, be easily replicated to a new geography or set of stores under consideration for data quality validation. Hence, the ability to implement a repeatable process framework in a reasonable timeline for any new geography or set of stores is another important criterion for success.

•	Data mining: An immense amount of data will be loaded into the mini-clearinghouse database during the entire sampling window, so the tool should be able to mine the resulting data.
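The variance-report and data-mining criteria both reduce to comparing what the POS reports claim against what the live feeds actually delivered into the mini-clearinghouse. A sketch over per-store daily totals; the data layout, store names and tolerance are illustrative assumptions:

```python
def variance_report(pos_report_totals, live_feed_totals, tolerance=0.01):
    """Compare POS-report totals against live-feed totals per (store, day)
    and return the exceptions that warrant a root cause analysis."""
    variances = []
    for key, reported in pos_report_totals.items():
        received = live_feed_totals.get(key, 0.0)
        if abs(reported - received) > tolerance:
            variances.append({"store_day": key,
                              "reported": reported,
                              "received": received,
                              "delta": round(reported - received, 2)})
    return variances

# Toy totals: store-17's feed is short of what its POS report claims.
report = {("store-17", "2014-01-06"): 1250.00, ("store-18", "2014-01-06"): 900.00}
feeds  = {("store-17", "2014-01-06"): 1190.50, ("store-18", "2014-01-06"): 900.00}
exceptions = variance_report(report, feeds)
```

At clearinghouse scale the same comparison would run as a database query over the sampling window; the exception list is what feeds the root cause analysis step.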

Looking Forward
POS data utilization is fast becoming a necessity for retailers seeking to maintain their competitive advantage, since such data provides precious insight into consumer actions and, if properly applied, can help improve customer relationships and maximize internal
business efficiencies. Moreover, once normalized,
POS data serves as a vital input to various operational planning engines, such as replenishment
and allocation.
Such outcomes make a DQV exercise critical to
any retailer’s short- and long-term future. DQV
ensures that the POS data extracted from diverse
POS systems is correct, reliable and actionable.
And importantly, this exercise ensures that the
right data reaches the consuming applications,
enabling better and more accurate decision-making and planning.

Footnotes

1	Shannon Arnold, “The Pros and Cons of Owning Diverse POS Systems,” Maitre’ D, March 27, 2013, http://web.maitredpos.com/bid/279593/The-Pros-and-Cons-of-Owning-Diverse-POS-Systems.

About the Authors
Arindam Chakraborty is a Senior Consultant in Cognizant Business Consulting’s Retail Practice. He has
eight years of experience in merchandising, supply chain and store operations with specialization in point
of sale. Arindam has CPIM and CSCP accreditation and holds an M.B.A. from the Institute of Management
Technology, Ghaziabad, India. He can be reached at Arindam.Chakraborty5@cognizant.com.
Shilpi Varshney is a Consultant in Cognizant Business Consulting’s Retail Practice. She has more than five
years of experience in store operations, focused on multichannel and supply chain strategy. Shilpi holds
an M.B.A. from Management Development Institute, Gurgaon, India, earning the Gold Medal. She can be
reached at Shilpi.Varshney@cognizant.com.
Doug Dennison is a Senior Manager in Cognizant Business Consulting’s Retail Practice. He has more than
20 years of experience in store operations, store systems, loss prevention and workforce management.
Doug holds an M.B.A. from Grand Valley State University in Grand Rapids, Michigan. He can be reached at
Douglas.Dennison@cognizant.com.

About Cognizant
Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world’s leading companies build stronger businesses. Headquartered in
Teaneck, New Jersey (U.S.), Cognizant combines a passion for client satisfaction, technology innovation, deep industry
and business process expertise, and a global, collaborative workforce that embodies the future of work. With over 50
delivery centers worldwide and approximately 166,400 employees as of September 30, 2013, Cognizant is a member of
the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500 and is ranked among the top performing
and fastest growing companies in the world. Visit us online at www.cognizant.com or follow us on Twitter: Cognizant.

World Headquarters
500 Frank W. Burr Blvd.
Teaneck, NJ 07666 USA
Phone: +1 201 801 0233
Fax: +1 201 801 0243
Toll Free: +1 888 937 3277
Email: inquiry@cognizant.com

European Headquarters
1 Kingdom Street
Paddington Central
London W2 6BD
Phone: +44 (0) 20 7297 7600
Fax: +44 (0) 20 7121 0102
Email: infouk@cognizant.com

India Operations Headquarters
#5/535, Old Mahabalipuram Road
Okkiyam Pettai, Thoraipakkam
Chennai, 600 096 India
Phone: +91 (0) 44 4209 6000
Fax: +91 (0) 44 4209 6060
Email: inquiryindia@cognizant.com

© Copyright 2014, Cognizant. All rights reserved. No part of this document may be reproduced, stored in a retrieval system, transmitted in any form or by any
means, electronic, mechanical, photocopying, recording, or otherwise, without the express written permission from Cognizant. The information contained herein is
subject to change without notice. All other trademarks mentioned herein are the property of their respective owners.

08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 

POS Data Quality: Overcoming a Lingering Retail Nightmare

data formats, as well as the absence of proper audit/validation. Both of these issues need to be rationalized before the data reaches the applications, because they can adversely affect weekly allocation and replenishment capabilities, decision support, reporting accuracy, planning processes and various operational processes.

Major issues that can impact POS data include:

• Missing data: Data/information is not available in time for use in business processes.

• Duplicate data: The same data/information is received more than once.

• Suspicious or incorrect data: Data is present but incomplete, erroneous or unreliable.

• Delayed data: Data is received beyond the optimal timeframe.
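These four failure modes can be checked mechanically as feeds arrive. The sketch below is a minimal classifier, assuming a hypothetical record layout (transaction_id, store_id, amount, received_at) and a 24-hour latency window; none of these names or thresholds come from the paper.

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=24)  # assumed acceptable feed latency, illustrative only

def classify_issues(records, expected_ids, now):
    """Bucket incoming POS records into the four issue categories."""
    issues = {"missing": [], "duplicate": [], "suspicious": [], "delayed": []}
    seen = set()
    for rec in records:
        tid = rec["transaction_id"]
        if tid in seen:                       # same record received more than once
            issues["duplicate"].append(tid)
            continue
        seen.add(tid)
        amount = rec.get("amount")
        if amount is None or amount < 0 or not rec.get("store_id"):
            issues["suspicious"].append(tid)  # incomplete or implausible values
        if now - rec["received_at"] > SLA:
            issues["delayed"].append(tid)     # arrived beyond the optimal timeframe
    issues["missing"] = sorted(expected_ids - seen)  # expected but never received
    return issues
```

Feeding the classifier the set of transaction IDs the consuming application expects for the period lets the same pass detect missing data alongside the other three categories.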
As an end-state objective, POS data quality must be improved to enable retailers to reach several essential goals, including enhanced planning, forecasting and reporting; improved efficiency through faster data reconciliation; and the ability to "read" and respond to changes in business performance.

Remediating POS, One Step at a Time

While a long-term solution would be a centralized data clearinghouse, such an implementation would not address the immediate needs of the business. To mitigate this problem, an interim solution can be outlined that strengthens data quality for partial store populations, one population at a time, starting with the most problematic populations. The steps involved in this solution include the following:

• Pilot before deployment: For starters, retailers need to take stock of their data quality validation (DQV) process and uncover key issues (see sidebar, page 4). DQV processes ensure that programs are operating with clean, correct and useful data. Such a pilot should be carried out with two objectives in mind:

>> Demonstrate the data quality validation process via a proof of concept on a selected population of stores. Identify and catalog data quality issues, establish root causes, develop possible corrective actions (including the conversion/upgrade of low-performing franchisee stores) and, when appropriate, implement the corrective actions.

>> Based on the lessons learned from the work above, design, build and document a repeatable framework (tools and processes) aimed at identifying, analyzing and fixing the issues, as well as improving POS data quality for additional or subsequent populations of stores and data feeds. This framework can then be used to carry out similar DQV exercises across other populations or geographies.

• Build a repeatable framework: Building the repeatable framework to address the immediate needs of the business will consist of a series of steps performed by multiple, collaborating stakeholders, such as IT, key business partners and source system data providers. This flow of activities is depicted in Figure 1.

[Figure 1: Orchestrating the POS Partner Ecosystem — IT resources (all steps), business partners and source system resources collaborate across ten activities: A. Infrastructure requirements; B. Scope definition; C. Interface documentation; D. Requirements gathering for the consuming application; E. Documentation of known POS data issues; F. Use and maintenance of database clearinghouse; G. Collection of POS reports; H. Collection of live data feeds; I. Development and maintenance of variance reports; J. Data fixes and remediation.]

The framework's steps include the following:

• Infrastructure requirements: Capture the environment and access requirements needed prior to the start of the POS data quality validation exercise. Include infrastructure requirements for production platforms, as well as those specific to the DQV platform (e.g., data capture file folders, mini-clearinghouse database, ad hoc programs, reports requirements, etc.). Having this completed before starting the DQV exercise will ensure hassle-free execution, with the project team focused on actual DQV activities rather than infrastructure problems.

• Scope definition: Identify the key elements that need to be defined as in- or out-of-scope for the next POS DQV exercise, such as a list of stores (categorized by affiliate, business model, POS system, etc.), a list of interfaces/data feeds, variance reports, etc. Because a scope definition explains which elements are in or out of scope and why, this activity helps to precisely define the target elements of a particular execution of the DQV exercise/process.

• Interface documentation: Describe the key information that needs to be collected, documented and verified to ensure full understanding and usability of the overall POS feeds integration architecture. Interface documentation is a crucial factor in the success of the exercise, as it provides a strong understanding of the data transmitted from the POS systems via the interfaces and helps determine whether adequate information is available to proceed with the DQV exercise. In the case of a data type or data definition mismatch, the interface documentation can expose the failure points in the data channels if they pertain to mapping issues between the various integration points.

• Requirements gathering for consuming applications: Capture the requirements that are specific to each consuming application (such as replenishment, allocation and merchandising applications) regarding POS data feeds. This step is critical to gaining insight into what the target applications expect the interface data feeds to contain and why. It also provides knowledge of specific data processing, filtering and logic for the affiliate POS interface in one or more of the consuming applications, ensuring that the validation takes into account and analyzes all critical data elements present in the POS feeds.

• Documentation of known POS data issues: Catalog historic and live POS data issues and analyze how they can be used in a POS DQV exercise, including the "accretion" of documented DQV issues from one iteration to the next. This activity gathers all available information on the known issues to reveal the scale and scope of data quality problems, the frequency of their occurrence and the most problematic interfaces for the affiliate in question. It also helps the project team in the data analysis exercise down the line.

• Use and maintenance of a database clearinghouse: Build the custom mini-clearinghouse database, as well as the copy and load processes for POS reports and live data feeds. These processes will be reused, configured or updated for feeding the POS data into the mini-clearinghouse database. This phase involves creating data structures, uploading captured data and performing other supporting processes related to the mini-clearinghouse database.

• Collection of POS reports: In this data collection step, the various POS reports are reviewed, captured and pre-processed to make sure they are ready for the mini-clearinghouse.

• Collection of live data feeds: This second data collection step includes the review, validation, capture and pre-processing of data feeds inbound and outbound of each integration point, preparing them for processing in the mini-clearinghouse. It deals with the capture of in-scope data feeds as they are made available to the various in-scope integration points.

• Development and maintenance of variance reports: Identify and develop the variance reports that capture various data exceptions and variances. A follow-up root cause analysis exercise helps identify the possible reasons for the exceptions. In this phase, the data loaded in the mini-clearinghouse is analyzed to understand the data quality issues.

• Data fixes and remediation: Review, assess, escalate and resolve issues, and identify recommendations and actions. Follow-up issues and data fixes depend heavily on the local POS and system integrations. The objective of this phase is to build a prioritization, retention/rejection, remediation and implementation process to be followed for identified exceptions.
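Several of the steps above revolve around the mini-clearinghouse. As a sketch only — the paper does not prescribe a technology, and every table and column name here is an assumption — a SQLite-backed mini-clearinghouse that rejects straight duplicates on load might look like this:

```python
import sqlite3

def build_clearinghouse(conn):
    """Create a minimal staging table; the schema is illustrative, not the paper's."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS pos_feed (
            store_id    TEXT NOT NULL,
            trans_id    TEXT NOT NULL,
            source      TEXT NOT NULL,        -- which POS system produced the record
            amount      REAL,
            received_at TEXT,
            UNIQUE (store_id, trans_id, source)  -- straight duplicates fail on load
        )""")

def load_feed(conn, rows):
    """Load captured feed rows; return how many duplicates were rejected."""
    rejected = 0
    for row in rows:
        try:
            conn.execute(
                "INSERT INTO pos_feed VALUES "
                "(:store_id, :trans_id, :source, :amount, :received_at)", row)
        except sqlite3.IntegrityError:
            rejected += 1  # duplicate data: the same record received more than once
    conn.commit()
    return rejected
```

Loading both the POS summary reports and the live feeds into one common structure is what makes the later variance analysis a straightforward comparison of two data sets.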
• Implement the framework for "all" stores: This is a recurring step in which the retailer implements the repeatable framework to resolve data quality issues for all stores. It is very important to acknowledge the differences among dissimilar groups of stores and to adapt or customize the framework before using it for this exercise. In addition, lessons learned from past implementations should be utilized to improve the framework for speedier and more efficient roll-outs.

Drivers for Success

Key success criteria that should be examined to ensure a successful DQV exercise include:

• Data issue fixes and remediation: Since the objective of this exercise is to resolve possible data quality issues, the main success criterion should stipulate that issues be fixed upon identification.

• Issue identification: Ensure the tool that is built as part of the repeatable process framework is capable of identifying issues as and when they exist.

Quick Take: POS Data Validation for a Global Apparel Retailer

We worked with a retailer that was struggling with data quality issues in the 40-plus POS systems that sent data to its internal applications. The problem originated in its European region, where 22 affiliates use five different POS interfaces and eight consuming systems, adding to the data quality complexity.

We partnered with the retailer's IT and business teams to carry out a DQV exercise for its Eastern Europe unit and helped build a repeatable framework to be followed in other regions. The 15-week engagement addressed the DQV needs of 40-plus stores, with data flowing through four different POS systems. The exercise was conducted in two tracks: the first was to build the repeatable process/framework; the second was to test the framework with data from different sets of stores. Figure 2 offers a snapshot of the activities performed during the POS data quality analysis exercise.

Roughly 2,500 issues were identified and fixed through this pilot POS data validation exercise. The client is in the process of rolling out the repeatable process framework to other countries and affiliates in Europe.

[Figure 2: Activities Performed During the DQV Exercise — a week-by-week timeline covering infrastructure requirements; scope definition; interface documentation (knowledge acquisition, documentation); requirements gathering for consuming applications (knowledge acquisition, documentation); documentation of known EPOS data issues; use and maintenance of the DB clearinghouse (construction, preparation); collection of EPOS reports and live EPOS data feeds (data capture and load across the data sampling window); execution and development of variance reports; analysis and root cause analysis of variances; and data fixes and remediation.]
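The variance analysis at the core of such an exercise reduces, in the simplest case, to comparing store-level totals from the POS summary reports against totals recomputed from the captured live feeds. The sketch below is illustrative only; the field names and tolerance are assumptions, not details from the engagement.

```python
def variance_report(report_totals, feed_totals, tolerance=0.01):
    """Flag stores whose reported and recomputed totals disagree beyond tolerance."""
    variances = []
    # Union of keys so stores present in only one data set are also flagged.
    for store in report_totals.keys() | feed_totals.keys():
        reported = report_totals.get(store, 0.0)
        observed = feed_totals.get(store, 0.0)
        delta = reported - observed
        if abs(delta) > tolerance:
            variances.append({"store": store, "reported": reported,
                              "observed": observed, "delta": round(delta, 2)})
    # Largest discrepancies first, so root cause analysis starts where it matters.
    return sorted(variances, key=lambda v: abs(v["delta"]), reverse=True)
```

Each flagged row then becomes an input to the root cause analysis and data-fix steps described earlier.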
Additional success criteria include:

• Issue documentation: To better understand issues and support their fix and resolution, the tool should be able to document issues completely and in a timely manner.

• Communication: The tool's ability to communicate issues to the team, along with a proper root cause analysis, in near-real-time influences timely and effective issue resolution.

• Degree of repeatability: The DQV exercise should not be limited to one geography or one set of stores. The tool/framework must therefore be easily replicated for any new geography or set of stores under consideration for data quality validation, and the ability to implement the repeatable process framework in a reasonable timeline is another important criterion for success.

• Data mining: An immense amount of data will be loaded into the mini-clearinghouse database during the sampling window, so the tool should be able to mine the resulting data.

Looking Forward

POS data utilization is fast becoming a necessity for retailers seeking to maintain their competitive advantage. POS data provides precious insight into consumer actions and, properly applied, can help improve customer relationships and maximize internal business efficiencies. Moreover, once normalized, POS data serves as a vital input to various operational planning engines, such as replenishment and allocation.

Such outcomes make a DQV exercise critical to any retailer's short- and long-term future. DQV ensures that the POS data extracted from diverse POS systems is correct, reliable and actionable. Just as important, the exercise ensures that the right data reaches the consuming applications, enabling better and more accurate decision-making and planning.

Footnotes

1 Shannon Arnold, "The Pros and Cons of Owning Diverse POS Systems," Maitre' D, March 27, 2013, http://web.maitredpos.com/bid/279593/The-Pros-and-Cons-of-Owning-Diverse-POS-Systems.
About the Authors

Arindam Chakraborty is a Senior Consultant in Cognizant Business Consulting's Retail Practice. He has eight years of experience in merchandising, supply chain and store operations, with a specialization in point of sale. Arindam has CPIM and CSCP accreditation and holds an M.B.A. from the Institute of Management Technology, Ghaziabad, India. He can be reached at Arindam.Chakraborty5@cognizant.com.

Shilpi Varshney is a Consultant in Cognizant Business Consulting's Retail Practice. She has more than five years of experience in store operations, focused on multichannel and supply chain strategy. Shilpi holds an M.B.A. from Management Development Institute, Gurgaon, India, where she earned the Gold Medal. She can be reached at Shilpi.Varshney@cognizant.com.

Doug Dennison is a Senior Manager in Cognizant Business Consulting's Retail Practice. He has more than 20 years of experience in store operations, store systems, loss prevention and workforce management. Doug holds an M.B.A. from Grand Valley State University in Grand Rapids, Michigan. He can be reached at Douglas.Dennison@cognizant.com.
About Cognizant

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant combines a passion for client satisfaction, technology innovation, deep industry and business process expertise, and a global, collaborative workforce that embodies the future of work. With over 50 delivery centers worldwide and approximately 166,400 employees as of September 30, 2013, Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000 and the Fortune 500, and is ranked among the top performing and fastest growing companies in the world. Visit us online at www.cognizant.com or follow us on Twitter: Cognizant.

World Headquarters: 500 Frank W. Burr Blvd., Teaneck, NJ 07666 USA. Phone: +1 201 801 0233. Fax: +1 201 801 0243. Toll Free: +1 888 937 3277. Email: inquiry@cognizant.com.

European Headquarters: 1 Kingdom Street, Paddington Central, London W2 6BD. Phone: +44 (0) 20 7297 7600. Fax: +44 (0) 20 7121 0102. Email: infouk@cognizant.com.

India Operations Headquarters: #5/535, Old Mahabalipuram Road, Okkiyam Pettai, Thoraipakkam, Chennai, 600 096 India. Phone: +91 (0) 44 4209 6000. Fax: +91 (0) 44 4209 6060. Email: inquiryindia@cognizant.com.

© Copyright 2014, Cognizant. All rights reserved. No part of this document may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the express written permission of Cognizant. The information contained herein is subject to change without notice. All other trademarks mentioned herein are the property of their respective owners.