We shared our lessons from building and scaling an AI practice from scratch at the Open Data Science Conference, Boston, in 2018. AI's promise is to help uncover new learnings, make effective decisions at scale and optimize existing operations, so that leaders can focus on what they do best: innovate, expand and grow. However, it takes work: it requires a reliable data pipeline, a mature analytics practice and the "right" application of AI techniques. Our focus has been on AI-as-a-Service, initially aimed at optimizing our operations. We shared our approach, our milestones so far, the lessons we learnt the hard way and our vision for the future. Reach out to us with your comments, feedback and questions.
Building A Successful Artificial Intelligence Practice
1. Intended for Knowledge Sharing only
Quick recap of what it is
Building a Successful AI Practice
Open Data Science East 2018
Ramkumar Ravichandran
Director @ Visa, Inc.
Yash Shah
Data Scientist @ Visa, Inc.
2. https://memegenerator.net/Scumbag-Terminator/caption
fight all fraud…
increase your credit limit anytime you need…
HERE TO TELL YOU THAT WE ARE CREATING AN AI THAT WILL AUTOMATICALLY…
…will send gifts at special occasions
You think… fantastic! fabulous! awesome!!!
…but all of it would be a f-lie!
3. LET’S BREAK IT DOWN…
“Building a Successful AI Practice”
Questions, comments, feedback? @decisions_2_0 ram.nit05@gmail.com yashks2109@gmail.com
4. ARTIFICIAL INTELLIGENCE IS ALL ABOUT MAKING MACHINES SMARTER
“Building a Successful Artificial Intelligence Practice”
Artificial Intelligence
Machine
Learning
Deep
Learning
https://www.youtube.com/watch?v=QizsAE4fBpQ&list=LLcphuW2awSOePdDqc-6nAvQ&index=1&t=0s
5. PRACTICE IS THE OVERARCHING SET UP THAT DELIVERS VALUE
“Building a Successful Artificial Intelligence Practice”
Practice is the set up that
delivers value on initiatives
through a combination of
systems, programs and people.
6. SUCCESS IS DEFINED AS MEETING THE GOAL OF INCREMENTAL VALUE
“Building a Successful Artificial Intelligence Practice”
Impact must be fundamental, not just incremental, over and above the current set up, optimization of the current set up and all other alternatives.
7. WE HAVE JUST BEGUN…
“Building a Successful Artificial Intelligence Practice”
Over the past 18 months we learnt and demonstrated value with POCs, expanded our use case list and formed relationships that will play key roles. We have just begun scaling!
8.
What are we doing and why…
How are we going about it…
Where are we and where next…
9.
What are we doing and why…
10. IT IS DEFINED BY WHO WE ARE, WHY WE DO WHAT WE DO AND HOW WE WANT TO DO IT BETTER, FASTER & CHEAPER…
About us
• Team: Digital Analytics in Product Organization
• Responsibilities: Enable Strategy, Optimize Execution, Drive Impact
• Scope: Strategy Analytics, Conversion Rate Optimization, Customer Lifecycle Management
How we do it…
• Insight Guardians across the entire Product Lifecycle
• Data ownership: Instrumentation, Platforming, Governance, Management
• Own the Analytics Value Chain: Data Science, Experimentation & Machine Learning
Why AI??
• Scope: So much to do all the time! New needs, problems, bugs, old issues surfacing back…
• Scalability: Not a “throw more resources at the problem” situation
• Impact: Focus is backward-looking or catch-up and we miss new opportunities!
…we need something that can stay on top of everything, proactively alert on key drifts and their possible drivers, help
explore options, be easy to work with and keep stakeholders happy
11. BUT IT COMES WITH ITS OWN PREREQUISITES…
Operating
Parameters
• Not a black box: Interpretability, Verifiability and Customizability are a must!
• Tradeoff: Fit Accuracy vs. Execution RoI
• UED: Accessible, Available, Interactive and with support structure when necessary
Constraints
• On-premise preferred
• Controls: Privacy, Security, Regulatory, Legal
• Enterprise readiness: Not just exploratory R&D
• RoI: Budgetary resources for “support” needs
• Customization Scope: UED, Problem Statements, Solutions & Politics
…for us, the types of AI needs would be classified under AI-as-a-Service and AI-as-a-Strategy buckets
12. …THE NEW SYSTEM NEEDS TO BE “ALWAYS ON”, CONNECTING THE DOTS, ALERTING, PERSONALIZED SO THAT PEOPLE CAN
DO WHAT THEY DO BEST – EXPAND, GROW & INNOVATE!
Optimize Strategy & Operations
• Strategy Development & Execution
• Innovation Delivery
• Performance Tracking & Intervention
• Business Operations
• Resource Investment Decisions (Finance)
• Strategic Research: Competitive
Monitoring, Regulatory, Policies, Legal
Metrics: Earnings Growth, Guidance
delivery, Investor Confidence
Optimize Product Lifecycle
• Strategy
• Experience
• Development
• Management
Metrics: Click Through Rate, Conversion,
%Happy Path, Speed, Distribution
Minimize Risk
• Decrease in Standard Risk levels (leak
through)
• Successful Prevention Rate
• New Risk Detection Efficiency
• Rule efficiency: FPR/FNR, Agent Reviews,
Reported
• Implementation cost: CXM, CSS
Metrics: Bad Rate Changes, %bad
prevented, %leak through, business KPI
impact
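The rule-efficiency metrics above (FPR/FNR, %bad prevented, leak-through) can be sketched directly from a rule's confusion counts. This is a minimal illustration; the counts below are hypothetical, not real fraud numbers.

```python
def rule_efficiency(tp, fp, tn, fn):
    """Compute risk-rule efficiency metrics from confusion counts.

    tp: bad events the rule caught, fp: good events wrongly flagged,
    tn: good events correctly passed, fn: bad events that leaked through.
    """
    return {
        "false_positive_rate": fp / (fp + tn),  # good traffic wrongly flagged
        "false_negative_rate": fn / (fn + tp),  # miss rate among bad events
        "pct_bad_prevented": tp / (tp + fn),    # %bad prevented
        "pct_leak_through": fn / (tp + fn),     # %leak through
    }

# Hypothetical counts for one rule over a review period
metrics = rule_efficiency(tp=90, fp=40, tn=960, fn=10)
print(metrics["pct_bad_prevented"])  # 0.9
```

A rule review would then trade these off against implementation cost (agent reviews, CXM/CSS load) before tuning thresholds.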
Optimize Technology Delivery Cycle
• Development Prioritization
• Delivery Quality & Monitoring
• Cost of Development
• Platform Management
• Scalability: Compatibility, Detection,
Pre-emption & Prevention
Metrics: Uptime, Performance, #Story
points to Develop/Scale/Iterate, #Bugs/Bug
Rate
Optimize User Journey
• Campaign Strategy
• Performance Attribution
• Funnel Management (Omni)
• Cost Optimization
• Brand Management
Metrics: Awareness, Sentiment, Adoption,
CPE/CPM/CPC, Engagement, NPS, LTV
Optimize Sales Process
• Goal Setting, Monitoring & Tweaking
• Prospect Scoring & Prioritization
• Lead Funnel Management: Rate, Speed,
Cost
• Retention & Growth
• Turnover
Metrics: Topline, Time to Live, Cost of
Acquisition & Retention, Account growth/
NPS
13. YOU PULLING A FAST ONE ON US? CALLING TYPICAL MACHINE LEARNING SET UP, AI’ey?
http://www.thefrisky.com/photos/12-bizarro-celebrity-look-a-likes/perry-ritter-caplan-deschanel-lookalike-jpg/
• Stakeholder Interface: Form (Voice, Bot, Smart Push), Personalized
and Learning based (on-the-edge)
• Process Management: Smart Intake, Pattern Identification &
Issue/Need Surveillance, Predictive help (“others like you”, “in the
past”), Dashboard & Alerts, Smart Communication, Knowledge
Indexing, Proactive Change Handling
• System Dynamics: Interplay drivers, anomaly detection & smart alerts
• Virtualization: Synthetic Bots for Scenario Simulation, Causation
Studies & Research
• Opportunity Identification & Sizing: Internal/External
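The "anomaly detection & smart alerts" item above can be illustrated with a minimal rolling z-score alert on a KPI series. This is a sketch only: the window size, threshold and the conversion-rate numbers are all hypothetical choices.

```python
from statistics import mean, stdev

def kpi_alerts(series, window=7, z_threshold=3.0):
    """Flag points whose deviation from the trailing-window mean
    exceeds z_threshold trailing standard deviations."""
    alerts = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and abs(series[i] - mu) / sigma > z_threshold:
            alerts.append((i, series[i]))
    return alerts

# Hypothetical daily conversion-rate series with one sudden drift at the end
daily_kpi = [0.051, 0.049, 0.050, 0.052, 0.048, 0.050, 0.051,
             0.050, 0.049, 0.031]
print(kpi_alerts(daily_kpi))  # [(9, 0.031)]
```

A production version would add driver attribution ("possible drivers" above) and route the alert through the stakeholder interface, but the core surveillance loop is this simple.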
14.
How are we going about it…
15. PRODUCT DEVELOPMENT LIFECYCLE
• Critical Review of Existing Set Up
• Use Cases (Opp, Impact, RoI)
• Goals, Success/Stop Criteria
• Readiness (Stakeholders, Data, Analytics
Maturity, People, Process, Tech, Culture)
• Alternatives, Optimization or worth AI?
• Stakeholder Persona (Who, How, Why,
What and their higher Order needs)
• Tactical: Platform, Program, Process
• Ownership & Plan of Action
• Use Case Scoring & Prioritization
• POC- Success/Lessons, Impact, RoI
• Optimization/Customization
• Review, Stress Test, UAT
• Plan & Timelines - Milestones
• Evangelize & Engage
• Platform creation
• Deploy, Monitor & Integrate
successful POCs in planned priority
• Usage Protocols : Guide & Comply
• Model Governance, Lineage,
Integration, Risk
• Support Framework: Admins, PMs,
Troubleshooters, Analysts
• Refine, Revamp or Retire?
• Models as Extensible Data
Products
• Innovation & Upgrade
Plan → Design → Prototype → Build → Manage → Extend
17. • Customer “adopt”-ability: Stakeholder Needs & Analytics Maturity Curve
• Data reliability: Coverage, Usability, Accessibility, Pipeline Reliability, Quality
• Analytics Practice Maturity Curve
• Skills: Available inhouse, Hire, Freelancers, Consultants, Augment
• Capability sizing (People-Process-Technology-Culture)
• Knowledge Management, Compliance, Communication set up for new needs
Use Cases
• Type: New problems, optimize delivery & extension of capabilities
• Focus Areas & Domain: Business Unit, Problems/Opportunities
• Pipeline: Internal, Research, Customers, Stakeholders, Competition, Regulation
• Prioritization: Scoring & Baselining (Internal, Stakeholders & External)
o Parameters: AI Fitness, Impact, RoI, Strategic Goals (level), Urgency, Feasibility, Tradeoffs,
Learning Curve, Efforts & Cost, Scalability, Readiness Assessment Score (from below),
Privacy/Regulatory/Legal Concerns, Politics
• Review of Prioritized List: Stakeholders, Leadership & Developers
• Roadmap & Expectations Setting
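The scoring-and-prioritization step above can be sketched as a weighted scorecard. The parameter names follow the slide, but the weights, the 1–5 scores and the candidate use cases below are hypothetical.

```python
# Weighted use-case scorecard: each use case is scored 1-5 per parameter,
# multiplied by a parameter weight, and candidates are ranked by total.
WEIGHTS = {"ai_fitness": 0.25, "impact": 0.25, "roi": 0.20,
           "feasibility": 0.15, "readiness": 0.15}

def prioritize(use_cases):
    """Return use cases sorted by weighted score, highest first."""
    def score(scores):
        return sum(WEIGHTS[p] * scores[p] for p in WEIGHTS)
    return sorted(use_cases, key=lambda uc: score(uc["scores"]), reverse=True)

candidates = [
    {"name": "KPI forecasting",
     "scores": {"ai_fitness": 5, "impact": 4, "roi": 4,
                "feasibility": 4, "readiness": 5}},
    {"name": "Dynamic front-end personalization",
     "scores": {"ai_fitness": 4, "impact": 5, "roi": 3,
                "feasibility": 2, "readiness": 2}},
]
print([uc["name"] for uc in prioritize(candidates)])
```

In practice the scorecard would carry all the listed parameters (urgency, tradeoffs, privacy/regulatory concerns, politics) and separate internal, stakeholder and leadership scores before the review step.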
PLAN: DEFINED BY IDENTIFYING NEEDS, TEST USE CASES, SUCCESS CRITERIA & READINESS TO EFFECTIVELY USE AI SOLUTIONS
COMPONENTS DETAILS
Practice Goals
• Strategic KPIs: ΔKPI Baselines, RoI, Productivity
• Operational KPIs: Time to Action, #/$ Missed Issues, Stakeholder NPS
Readiness Assessment (Internal & External Benchmarking)
18. Program
• Owners: Data Scientists, Data Engineers, Data Product Owners, Program Managers,
Developers, Stakeholders, Legal, Compliance, Model Risk Management, Support
• Knowledge Management: Customized User Friendly Documentation Guide, Code
Repository & Version Control, Compliance Records, Feature Store, Failed projects
• Standards: Model Governance, Lineage, Integration (Standard Payload, API), UX
• UX Design: Usage friendly, integrated with the existing set up, re-usable/extensible
• Engagement: Onboarding, Training, Brownbags, Gamification, Whitepapers,
Dashboards, Office hours, Offsites, Feedback (Surveys & In person), Sponsorships
Process
• Distinct Phases: Model Need Assessment, Data Operations (ETL & Model Prep),
Development & Validation, Deployment, Testing, Monitoring, Finetuning, Support
• Need Assessment & Data Operations: Agile (Front Door, Grooming, Commitment,
Releases, UAT, Acceptance)
• Development: Continuous Delivery
• Deployment: Kanban
• Monitoring, Finetuning & Support: Continuous Delivery
DESIGN: GOING IN VERSION OF PLATFORM, PROCESS, PROGRAM MANAGEMENT FRAMEWORK
COMPONENTS DETAILS
Platform
• Needs: End to End Platform needs
• Build vs. Buy Decision: Need & ability
19. DESIGN: WORKFLOW TO BE USED IN THE POC (KEPT OUT THE ASSESSMENT, PM & DOCUMENTATION PARTS)
KEY: 1 – Extract/Sample; 2 – Spark; 3 – SparkSQL; 4 – Deployment (Predictions + Quality checks)
LEGEND: Needs to be built; Existing; Significant Effort
Servers, GPU, Connectors
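The four workflow steps in the key (extract/sample → Spark transform → SparkSQL feature prep → deployment with predictions plus quality checks) can be sketched end to end. This pure-Python mock stands in for the Spark layers, and every function name and threshold is a hypothetical placeholder, not the actual pipeline.

```python
def extract_sample(rows, n=3):
    """Step 1: pull a working sample from the raw extract."""
    return rows[:n]

def transform(rows):
    """Steps 2-3: feature prep that Spark/SparkSQL would do at scale."""
    return [{"amount": r["amount"], "is_large": r["amount"] > 100} for r in rows]

def predict(features):
    """Step 4a: score each record (a trivial stand-in for a real model)."""
    return [0.9 if f["is_large"] else 0.1 for f in features]

def quality_check(preds):
    """Step 4b: deployment-time checks on the predictions themselves."""
    return len(preds) > 0 and all(0.0 <= p <= 1.0 for p in preds)

raw = [{"amount": 250}, {"amount": 40}, {"amount": 130}, {"amount": 15}]
preds = predict(transform(extract_sample(raw)))
assert quality_check(preds)
print(preds)  # [0.9, 0.1, 0.9]
```

The point of the sketch is the shape of the workflow: the quality-check gate sits after prediction, so a bad batch is caught before scores reach stakeholders.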
20. DESIGN PHASE: BUILD VS BUY DECISIONS
Full end-to-end platform: It must be noted that our need is more than just AML (Automated Machine Learning); it starts from problem conceptualization and runs through the documentation stages. The eventual set up may be a Data Ingestion & Processing layer, an AML and programming layer, Rally for project management, and SharePoint plus a native site for communication.
AML specific:
Customizability and interpretability of models (We can’t work with Blackbox solutions)
Breadth of algorithms and use cases supported. Although Classification & Regression account for a
sizeable proportion, we also need support for Survival, Panel Data, Forecasting, Text Handling/NLU,
DN/RNN/CNN, etc.
Support for Prescriptive Analytics
Platform specific: Ease of integration with existing set up and potential AML/packages. Coverage of the entire
data lifecycle (Support Admin, Testing, Monitoring, Alerting).
Input data types supported and level of pre-data operations required.
Learning curve and level of support for training team and stakeholders.
Costs: Fixed, Operational, Integration, Training and Migration cost. Net RoI positive.
Documentation: Model Governance, Lineage, Integration, support documentation customization & analyzable.
Deployment ready: API, POJO, FTP dumps (APIs can be used to connect with Testing/Research/Analytics tools).
21. PROTOTYPE: USE CASE EXECUTION
Models which worked
• Targeting Campaigns
• KPI Forecasting & Strategic Guidance
• Sentiment Analyses & Theme extraction
• Risk Predictions
Types of problems that are tricky
• Dynamic front end integration/personalization
• Inputs that are dependent on the current state
(product/business changes drastically)
• Significant scaling costs & manual judgement
post production
• Legal/Contractual considerations
Roadmap
Use Case Sourcing
(Internal, Research, Stakeholders)
Scoring
(Internal, Stakeholders, Leadership)
Shortlist & Priority Review
POC
Outcome review
23. WE HAD TO “WORK OUR WAY UP” THE ANALYTICS MATURITY CURVE TO BE READY & ABLE TO EFFECTIVELY DEPLOY AI
[Stacked bar chart: primary source of insights for decision making, Year 1 through Year 5 — shifting from mostly Reporting (~60% in Year 1) to a roughly even mix of Reporting, Data Analytics, User Research, A/B Testing, Advanced Analytics/Machine Learning, Data Products and Cognitive Analytics by Year 5.]
24. THESE MOMENTS DEFINED OUR JOURNEY – PIVOTAL DECISIONS….
Learning Charter for the team: Extend the potential of the current team to take on AI projects. Carried
over the learning from building the system that exists today.
End to End AI platform first, AML later: Build phase not optimization phase yet.
Perseverance & thick skin pays: Initial reactions were “nice job”! As we kept hard selling and proved
value via Testing, we gained support to go ahead, deploy and deliver.
Show it, don't just say it: Proposals and pitches didn't work; a working prototype & test results clicked. Constant stakeholder education and selling are a must.
25. …AND THE LESSONS WE LEARNT THE HARD WAY
Scalability as an afterthought: A promising model had to be shelved for scalability constraints.
Easy problems, not tough but big ones: Smart tradeoffs between speed & impact needed.
We “sucked” at sales: Quality work and demonstrated results, but no one knew.
Forming the right relationships: Those we earlier thought of as “resistors” are best friends now!
Ownership doesn’t end on delivery, stay on top of implementation too.
26.
…and with that, we thank you for your
time. Do let us know your feedback &
thoughts!
28. REFERENCES
Analytics provides insights into “user behavior”, Research context on “motivations” & Testing helps verify the “tactics” in
the field and everything has to be productized…
Key benefits
Focus on Big Wins
Reduced Wastage
Quick Fixes
Adaptability
Assured execution
Learning for future initiatives
Value chain: Strategy → Data Tagging → Data Platform → Reporting → Analytics → Research → Cognitive, with an iterative Optimization loop.
29. REFERENCES USED FOR LEARNING ABOUT BUILDING AI SYSTEMS
What is it?
· What is the difference between AI, ML and DL:
o https://www.youtube.com/watch?v=2ePf9rue1Ao&list=FLcphuW2awSOePdDqc-6nAvQ&index=3,
o https://www.youtube.com/watch?v=QizsAE4fBpQ
· Get trained on ML: https://www.youtube.com/watch?v=Cr6VqTRO1v0
Executing it end-to-end:
· Model Project Lifecycle: https://www.dominodatalab.com/wp-content/uploads/domino-managing-ds.pdf
· Machine Learning on AWS: https://docs.aws.amazon.com/machine-learning/latest/dg/what-is-amazon-machine-learning.html
What does the platform need?
· Automated Machine Learning: https://www.datasciencecentral.com/profiles/blogs/automated-deep-learning-so-simple-anyone-can-do-it
· Machine Learning Model Performance Management: https://community.hds.com/community/products-and-solutions/pentaho/blog/2018/03/06/4-steps-to-machine-learning-model-management
· What is Rest API: https://www.youtube.com/watch?v=7YcW25PHnAA
· Data labeling help: https://www.kdnuggets.com/2017/06/acquiring-quality-labeled-training-data.html
Actual use cases where we can build AI systems:
· ML with Kafka: https://www.confluent.io/blog/build-deploy-scalable-machine-learning-production-apache-kafka/
· Image classifier with Tensorflow: https://www.youtube.com/watch?v=QfNvhPx5Px8
· Building Chatbot with API.AI: https://www.youtube.com/watch?v=5iKdfPjEOJk&t=12s
Editor's notes
First we would want to define how we will monitor the health of the practice: it can be the delta in product KPIs, RoI, lead times, end-user net promoter scores, etc.
Then there are two lenses through which we plan. The first is the business lens, to identify where we will add maximum value and how our time is best spent.
For this we have the use case scoring methodology, which factors in internal, stakeholder and implementation criteria.
The second is the technology lens, to understand how ready the organization and your team are to implement AI. For this we have a maturity and readiness assessment that also identifies gaps, along with capability sizing, which brings in the people and process aspects.
Platform: identify the components of the value chain and fine-tune each of the parameters within the chain.
In the program phase, we would want to set ownership and responsibility for the process components, and also define standards and the engagement strategy to drive adoption of AI through the enterprise.