MapR Certified Developer, IBM Certified Hadoop Fundamentals. Completed the Hadoop Developer, Pig and Hive
courses from MapR. Learning Big Data, Hadoop and ecosystem tools like Pig, Hive, Flume, Sqoop, HCatalog,
MapReduce, Spark and HBase for the past 1.5 years. Installed and configured Hadoop clusters on a laptop from
scratch using different versions of Hadoop from 0.20 to 2.7, along with other tools such as MapReduce, Pig,
Sqoop, Flume, Hive, HBase and Spark, using the MapR distribution for the installation. In-depth knowledge of
the Hadoop architecture and framework. Below are a few of the activities I have performed in a Hadoop big data
environment.
Hive:
Configured and used Hive for the following:
 Setting the Hive execution mode to strict so that queries that would produce a Cartesian product or scan an entire partitioned table are not executed (illustrated in the sketch after this list).
 Using LATERAL VIEW and the TRANSFORM clause for transforming and querying data.
 Altering table definitions and properties without recreating the tables.
 Using complex data types like MAP, STRUCT and ARRAY.
 Running multiple queries against a table in a single pass by placing the FROM clause first in the query (multi-insert).
 Creating indexes on tables to improve query performance.
 Using the distributed cache.
 Creating user-defined variables and using them in queries for dynamic query execution and automation.
 Granting or restricting user access in Hive for various activities like database creation and table deletion. Defining role/user/group based security. Automatically granting default access to certain users. Restricting user access to specific columns/tables by using Hive views.
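A minimal sketch, run from the shell through the Hive CLI, of how a few of the items above (strict mode, the FROM-first multi-insert and hivevar variables) can fit together. The tables web_logs, log_summary and error_log and their columns (host, request, status, dt) are hypothetical names used only for illustration.

# Hypothetical example: strict mode, a leading FROM clause feeding two inserts,
# and a user-defined variable passed in with --hivevar.
hive --hivevar run_date=2016-01-31 -e "
  -- strict mode rejects Cartesian products and unfiltered scans of partitioned tables
  SET hive.mapred.mode=strict;

  -- FROM clause first: read web_logs once, populate two tables in a single pass
  FROM web_logs
  INSERT OVERWRITE TABLE log_summary PARTITION (dt='\${hivevar:run_date}')
    SELECT host, COUNT(*) GROUP BY host
  INSERT OVERWRITE TABLE error_log PARTITION (dt='\${hivevar:run_date}')
    SELECT host, request WHERE status >= 500;
"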
Pig:
 Using complex data types like Bag, Tuple & Map.
 Loading data into relations using different separators.
 Performing various operations on data like filtering, grouping, ordering and joins.
 Reading data from and writing data to HDFS using HCatalog so that the data can be used across the ecosystem (see the sample script below).
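A minimal Pig Latin sketch of the items above, run with HCatalog enabled. The input path, field names and the target Hive table sales_summary are hypothetical, and sales_summary is assumed to already exist in Hive with matching columns.

# Hypothetical example: load pipe-delimited data, filter/group/order it,
# then store the result into a Hive table via HCatalog.
cat > sales_summary.pig <<'EOF'
raw    = LOAD '/data/sales.txt' USING PigStorage('|')
         AS (region:chararray, product:chararray, amount:double);
valid  = FILTER raw BY amount > 0;
byReg  = GROUP valid BY region;
totals = FOREACH byReg GENERATE group AS region, SUM(valid.amount) AS total;
ranked = ORDER totals BY total DESC;
STORE ranked INTO 'sales_summary' USING org.apache.hive.hcatalog.pig.HCatStorer();
EOF

pig -useHCatalog sales_summary.pig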
KRISHNA KUMAR SHARMA
Mobile: +1-201-300-5019
kreonakrish@gmail.com
He has been in the Information Technology industry for more
than nine years. His experience has been on the Extract,
Transform, Load (ETL) and Reporting sides of Data
Warehousing & Business Intelligence.
 Data Warehousing
 Requirement Gathering
 Analysis & Design of Data Warehouses
 Extract Transform & Load (ETL)
He has been involved in all phases of Data Warehouse
projects during his tenure, including Requirements
Gathering, Analysis & Design, Development, Reporting and
Testing for end-to-end IT solution offerings.
 Business Intelligence
 Reporting Solutions
He has worked with technologies like Informatica, SSIS,
Business Objects and Actuate. Currently he is working for a
large insurance company, supporting Corporate IT for the HR
business on Informatica and WebFOCUS related applications.
 MapR Data Convergence
 IBM Certified Hadoop Fundamentals
 Oracle Certified Professional
 Informatica Certified Professional
 LOMA Certified Professional (ALMI)
Experience Summary:
Domain:
 Insurance– 9+ years
Technology:
 Data Warehouse and Business Intelligence – 9+ Years
Hadoop and MapReduce:
 Starting and stopping services like the NameNode, Secondary NameNode, DataNode, YARN and Spark.
 Configuring Hadoop and other tools.
 Used Hadoop Streaming to run map and reduce tasks written in different programming languages like Ruby and Python.
 Understand how data is stored in and read from HDFS.
 Creating MapReduce programs and executing them. Examining counters for overall job execution.
 Copying data between the cluster and the native OS. Ingesting data into HDFS using Flume and Sqoop.
 Using dfsadmin to perform various activities like checking safe mode status and setting/clearing quotas. Turning speculative execution on and off (sample commands follow this list).
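A few sample shell commands as a sketch of the HDFS, dfsadmin and streaming activities above; all paths and the mapper/reducer scripts are placeholders.

# Copy data between the local OS and HDFS
hdfs dfs -put /tmp/events.log /user/krishna/raw/
hdfs dfs -get /user/krishna/raw/events.log /tmp/events.copy.log

# dfsadmin: check safe mode status, set and clear a name quota
hdfs dfsadmin -safemode get
hdfs dfsadmin -setQuota 100000 /user/krishna
hdfs dfsadmin -clrQuota /user/krishna

# Hadoop Streaming with hypothetical Ruby mapper/reducer scripts;
# the -D flag also turns map-side speculative execution off for this job
hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
  -D mapreduce.map.speculative=false \
  -input /user/krishna/raw \
  -output /user/krishna/out \
  -mapper mapper.rb -reducer reducer.rb \
  -file mapper.rb -file reducer.rb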
Sqoop:
 Using Sqoop to import/export full or partial data from/to relational databases.
 Specifying the number of mappers and the split-by column so that imports/exports run in parallel (a sample command follows).
 Importing data directly into Hive tables.
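A sketch of a Sqoop import and export with the parallelism options mentioned above; the JDBC URL, credentials and table names are placeholders.

# Hypothetical import: 4 parallel mappers split on order_id, loaded straight into Hive
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --num-mappers 4 \
  --split-by order_id \
  --hive-import --hive-table orders

# Hypothetical export of an HDFS directory back to a relational table
sqoop export \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table order_summary \
  --export-dir /user/hive/warehouse/order_summary \
  --num-mappers 2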
HBase:
 Created tables and column families in HBase.
 Loaded and retrieved data from tables.
 Deleted rows and dropped tables from this columnar database (a sample shell session follows).
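A short HBase shell sketch covering the activities above; the table, column family and row keys are hypothetical.

# Hypothetical HBase shell session: create, load, read, delete, drop
hbase shell <<'EOF'
create 'customer', 'profile'
put 'customer', 'row1', 'profile:name', 'Krishna'
put 'customer', 'row1', 'profile:city', 'Jersey City'
get 'customer', 'row1'
scan 'customer'
deleteall 'customer', 'row1'
disable 'customer'
drop 'customer'
EOF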
Flume:
 Used Flume by creating and executing Flume agents to read log data (a sample agent configuration follows).
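A minimal Flume agent configuration as a sketch of the above: an exec source tailing a log file, a memory channel, and an HDFS sink. The agent name, file paths and HDFS directory are placeholders.

# Hypothetical Flume agent: tail a log file into HDFS through a memory channel
cat > tail-agent.conf <<'EOF'
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

a1.sources.r1.type     = exec
a1.sources.r1.command  = tail -F /var/log/app/app.log
a1.sources.r1.channels = c1

a1.channels.c1.type     = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type          = hdfs
a1.sinks.k1.channel       = c1
a1.sinks.k1.hdfs.path     = /user/krishna/flume/logs
a1.sinks.k1.hdfs.fileType = DataStream
EOF

flume-ng agent --name a1 --conf-file tail-agent.conf --conf $FLUME_HOME/conf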
Spark:
 Used Spark to create RDDs on files (local and HDFS).
 Understand the Spark architecture. Spark is a general-purpose distributed engine that can use memory as well as disk for its computation; compared to MapReduce it can be up to 100 times faster for in-memory workloads and around 10 times faster for on-disk workloads.
 Created a SparkContext for use in Scala and Python scripts (a short PySpark sketch follows this list).
 Can create a HiveContext to execute Hive queries on the Spark engine.
 Used various transformations on RDDs and pair RDDs such as filter, map, distinct, union, flatMap, subtract, join, cogroup, reduceByKey, groupByKey and so on.
 Performed various actions on RDDs such as count, reduce, take, first, saving as files and so on.
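A minimal PySpark sketch (Spark 1.x style, matching the SparkContext usage above) submitted from the shell; the input and output HDFS paths and the log format are hypothetical.

# Hypothetical job: RDD transformations (map, reduceByKey)
# followed by actions (takeOrdered, saveAsTextFile)
cat > top_hosts.py <<'EOF'
from pyspark import SparkContext

sc = SparkContext(appName="TopHosts")

lines  = sc.textFile("/user/krishna/raw/events.log")     # RDD from a file on HDFS
pairs  = lines.map(lambda l: (l.split(",")[0], 1))        # (host, 1) pair RDD
counts = pairs.reduceByKey(lambda a, b: a + b)            # transformation (lazy)
top10  = counts.takeOrdered(10, key=lambda kv: -kv[1])    # action: triggers execution

for host, n in top10:
    print("%s\t%d" % (host, n))

counts.saveAsTextFile("/user/krishna/out/host_counts")    # another action
sc.stop()
EOF

spark-submit --master yarn top_hosts.py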
EDUCATION
B.Tech (Electronics & Communication) – Institute of Engineering & Technology, Bareilly – 78.7 %
HSCE (Physics, Chemistry, Math & Biology) – Kendriya Vidyalaya, Kanpur (CBSE) – 76%
SSC – Kendriya Vidyalaya, Kanpur (CBSE) – 76.8%
CERTIFICATIONS
MapR Certified Hadoop Developer
MapR Certified Hadoop Cluster Administrator
MapR Certified Spark Developer
IBM Certified Hadoop Fundamentals
IBM Certified Spark Fundamentals
LOMA 320 conducted by Life Office Management Association, Inc.
LOMA 311 conducted by Life Office Management Association, Inc.
LOMA 307 conducted by Life Office Management Association, Inc.
LOMA 301 conducted by Life Office Management Association, Inc.
LOMA 290 conducted by Life Office Management Association, Inc.
LOMA 280 conducted by Life Office Management Association, Inc.
Informatica Certification (Power Centre 8 Map Designer) conducted by Informatica Corporation.
Oracle Certification (1Z0-007) conducted by Oracle Corporation.
UNIX Programming (General) conducted by Brainbench.
SQL (ANSI) Fundamentals conducted by Brainbench.
Business Communication conducted by Brainbench.
Programmer Analyst Aptitude conducted by Brainbench.
Data Warehousing conducted by Cognizant.
Life Insurance conducted by Cognizant.
General Insurance conducted by Cognizant.
TRAININGS & SEMINARS
MapR courses on Hadoop Development, Administration, and Analyst.
IBM Big Insights courses on Hadoop Development, Administration, and Analyst.
Working with Informatica Power Center 8 – attended training at Cognizant as Project Specific Training.
Business Objects XI – gave training to around 25 associates at Cognizant as Project Specific Training.
Actuate V9.0 (eRDPro) – gave training to around 12 associates at Cognizant as Project Specific Training.
Actuate V8.0, V10 (eSpreadsheet) – gave training to around 4 associates at Cognizant as Project Specific Training.
Designing an Informatica based ETL System – gave training to 4 associates at Cognizant as Project Specific Training.
Basics of Data Warehousing – gave training to around 24 associates at Cognizant as ELT Training.
TECHNICAL SKILLS
Operating System: UNIX (AIX Version 5), Windows 98, XP, Vista
Programming Language: PL/SQL, C, JAVA, .Net
Big Data Technology: MapR, Apache Hadoop, Spark, Hive, Pig, Sqoop, HBase, etc.
Database and Related Tools: Oracle 10g, PostgreSQL, DB2 UDB V8, MySQL, SQL Server 2005, 2008 R2
Extract Transform Load (ETL) Tools: Informatica 9, 8, 7, SSIS 2008, Omni Extract, Oracle Watchlist
Screening, Enterprise Data Quality.
BI and Reporting Tools: Business Objects XI, 6.5, Crystal Reports XI, Actuate 9, 8, 7, Oracle Report Builder,
BIRT, SSAS, SSRS, OBIEE, Tableau.
Data Modeling Tools: ER Studio, Erwin Data Modeler
Project Support Tools: Prolite, eTracker, Quality Centre, Visio
WORK EXPERIENCE
Company Name: Cognizant Technology Solutions.
Designation: Senior Associate
Duration: Feb’ 07 – Till Date
Role: DW DI/BI Developer
PROJECT EXPERIENCE
1. Project Experience
Project : HR Data Warehouse
Client : Large P&C Insurance Company
Duration : Jul 2015 – Till Date
Domain : HR (Corporate)
Technology : Shell Scripting
Tools Used : Oracle, Informatica, Web Focus
Project Description: HR Data Warehouse is the centralized repository for Client’s multiple lines of business
for HR related data. The system provides employee assignments, time keeping, payroll and other related data to
various integrated systems.
Responsibilities:
Informatica:
 As part of production support, managed an offshore team of 6 resources. Monitored 400+ workflows across daily and monthly runs, communicating with 15 systems. Along with regular production support, managed enhancement and maintenance changes to the system on a release basis.
2. Project Experience
Project : Oracle Watchlist Screening, FATCA (Foreign Account Tax Compliance Act)
Client : Large Life Insurance Company
Duration : Oct 2013 – Jul 2015
Domain : Insurance (Life)
Technology : Oracle Watchlist Screening, Enterprise Data Quality, PostGre SQL
Tools Used : Datanomics
Project Description: OWS and FATCA provide the business the capability to screen all of their customer records
against global and local Sanctions, Politically Exposed Persons and EDD lists, and to generate alerts if there are
matches in the system. Alerts are assigned different priority scores based on the rules set in the system, and the
compliance department can review those alerts and work on them via a workflow solution.
Responsibilities:
Datanomics:
 As part of production support, managed an offshore team of 4 resources. Trained them on this screening tool and managed all production issues on a 24x7 basis across 47 countries.
3. Project Experience
Project : Balanced Scorecard, Global Data Entry System
Client : Large Life Insurance Company
Duration : Aug 2011 – Jul 2015
Domain : Insurance (Life)
Technology : .Net, Microsoft BI, SharePoint 2007
Tools Used : Visual Studio 2008, SQL Server 2008, Microsoft BI 2008
Project Description: Balanced Scorecard provides a platform for Senior Management to view their KPIs
and operational metrics on a common platform organization-wide. As part of production support we were
supporting all the issues and enhancements related to this application.
Global Data Entry System was designed to provide a workflow system for Data Entry users to scan and load
the new Life Insurance policies sold through the traditional channel into the eNew Business system. The system
captures all the scanned policies, extracts all the relevant information and sends it to eNb. All the
communication happens through FTP, Web Services and MQs. The application uses Omni Extract and
Omni Flow as the tools that provide the extract and workflow facilities.
Responsibilities:
Microsoft BI:
 As part of Production Support we manage all the issues related to SSAS, SSIS and SSRS
components of Microsoft BI stack.
 The .Net solution was designed to read data from the compliance source system via web services and load it into the mart using SSIS.
 SSRS was used as the reporting platform for the users, and these reports were hosted in SharePoint.
SharePoint 2007:
 As the application is customized and hosted within MOSS 2007, we deal with all the issues related to SharePoint 2007.
4. Project Experience
Project : Legal Affairs
Client : Large Life Insurance Company
Duration : May 2009 – Jul 2011
Domain : Insurance (Life)
Technology : SQL Server, Unix, Data Warehousing
Tools Used : Business Objects XI, Crystal Reports XI, SQL Server 2005, Excel
Project Description: Legal Affairs deals with the Legal Department of Client. Client uses Law Manager
Application to manage all the legal activities with their clients. They use Business Objects and Crystal Reports
for the reporting solutions. Client was facing issues with the configuration of Business Objects SSO and was
having some issues with the reporting. Cognizant proposed a full end to end solution for their reporting needs.
Responsibilities:
Business Objects:
 Designed the semantic layer for the client and assisted them with their reporting solutions.
 Resolved the issues related to SSO for the Business Objects application.
 Rearranged the security groups for the whole Business Objects environment.
 Designed reports using InfoView, Desktop Intelligence and Crystal Reports.
 Created and resolved issues with the SQL Server stored procedures used in the Crystal Reports.
 Modified the Business Manager’s query definitions used for Crystal Reports querying.
5. Project Experience
Project : T360, Metrics DB
Client : Large Life Insurance Company
Duration : Aug 2009 – Jan 2010
Domain : Insurance (Life)
Technology : SQL Server, Data Warehousing
Tools Used : Business Objects XI, SQL Server 2005
Project Description: T360 is an application integrated with the Law Manager Application to capture the
billing details for Legal Affairs, which deals with the Legal Department of Client. Client uses the Law Manager
Application to manage all the legal activities with their clients and uses Business Objects and Crystal Reports
for the OLAP solutions. Client was looking for a solution that integrates with the Law Manager application,
loading data from feed files into a SQL Server database and then into the warehouse for their reporting needs.
Responsibilities:
Business Objects:
 Modified the Universe for the client and assisted them with their reporting solutions.
 Designed reports using InfoView.
6. Project Experience
Project : Enterprise Project Management
Client : Large Life Insurance Company
Duration : Nov’ 08 – May’ 09
Domain : Insurance (Life)
Technology : SQL, Unix, Data Warehousing
Tools Used : Actuate, Clarity, Quality Centre, Maestro JSC, SQL Developer, SQL Loader
Project Description: Client is an insurance company and Cognizant is providing project management services
for various lines of business. Client is using Clarity as PPM tool for their Project Management requirements
and planning and Cognizant is providing support/maintenance for the application.
Responsibilities:
Actuate:
 Created a team of associates who can develop Actuate reports and assist with the client's reporting needs.
 Successfully completed the migration of all the reports from Actuate 8 to Actuate 9.
Clarity:
 Successfully created the Production Support Manual document for monitoring the Clarity jobs.
 Successfully monitored the Interface jobs and trained the team in Chennai to execute them.
UNIX:
 Created UNIX shell scripts for Interface Redesign and resolved the issues related to the SQL
Loader file.
7. Project Experience
Project : Web Analytics
Client : Large Financial Services Company
Duration : May’ 08 – Nov’ 08
Domain : Insurance (Financial)
Technology : Oracle, SAS, SQL, Unix, Data Warehousing
Tools Used : Informatica 8.1, Business Objects XI, Project Plan, Visio, SQL Developer
Project Description: Client and its subsidiaries offer financial products and services including life insurance,
annuities, mutual funds, disability income insurance, bank products and more to its nearly 3 million members.
Client wanted to increase the usability of its internet site. Cognizant analyzed their system and proposed a
solution that loads the essential data into a database and then provides a platform for the reporting needs of
Client, ultimately helping them to increase the usability of the system.
Responsibilities:
Informatica:
 Designed the Data Mart that provides the decision mechanism for the Client team.
 Also designed database views for the reporting solutions.
8. Project Experience
Project : Advice Redesign
Client : Large Financial Services Company
Duration : Mar’ 08 – Nov’ 08
Domain : Insurance (Life)
Technology : SQL, SAS
Tools Used : Informatica 8.1, Erwin
Project Description: Client and its subsidiaries offer financial products and services including life insurance,
annuities, mutual funds, disability income insurance, bank products and more to its nearly 3 million members.
Client was using a SAS system which was not meeting their business objectives, so they wanted to
migrate to an Informatica-based system.
Responsibilities:
Informatica:
 Analyzed the SAS code, was involved in the BRD creation, and then designed the Data Mart using
Informatica to provide an end-to-end solution which included ETL and Reporting.
9. Project Experience
Project : Business Objects XI Reports
Client : Large Financial Services Company
Duration : Feb’ 08 – May’ 08
Domain : Insurance (Life)
Technology : SQL, Oracle, Data Warehousing
Tools Used : Business Objects XI, Xcelcius, Data Federator, Dashboard Manager
Project Description: Client uses Business Objects, Hyperion and other reporting solutions over different
databases for their reporting requirements. Since Client was interested in having a common platform for most
of their reporting needs, Cognizant analyzed their reporting environment and proposed a new environment
that integrates the specific sources from various databases and transforms the business logic into
virtual tables that can be used to develop a universe for reporting solutions.
Responsibilities:
Business Objects:
 Created a Universe which takes data from heterogeneous sources.
 Created reports from InfoView and Xcelcius.
 Worked with Data Federator and provided a platform for the Universe Designer to create a
Universe from Oracle, DB2 and CSV data sources.
10. Project Experience
Project : CSP Reporting
Client : Large Financial Services Company
Duration : Oct’ 07 – Feb’ 08
Domain : Insurance (Auto & Home)
Technology : SQL, Oracle, DB2, Java, Data Warehousing, Unix
Tools Used : Actuate 8, Informatica 8.1, TOAD, Microsoft Visio, Microsoft Access DB
Project Description: Client organization was at a stage in its support of the company's drive for profitable
growth. Analysis of their current Phoenix Policy administration system concluded that the system would no
longer support their growth by year-end 2011. Client needed a new system platform to sustain their aggressive
growth, support future business and improve the overall client experience. Due to capacity and scale constraints
within the current Phoenix application, Client purchased Policy STAR and Fiserv Advanced Billing
(FAB) from the Fiserv product suite.
Client initiated a program to implement the Fiserv products; the program consists of multiple streams, namely Gap
testing, Auto, Home, Decommissioning and Reporting. Since the Phoenix system will be decommissioned, the
objective of the reporting stream was to reproduce the current reports through a new reporting environment
which will include data from both the Fiserv and Phoenix systems.
The RQA phase was followed by the Report Design & Development phase. The report design and the ETL design
were completed based on the SOW. This will be followed by the subsequent SDLC phases, namely Unit Testing,
System and Parallel Testing, UAT, Implementation and Post Implementation Support.
Responsibilities:
Informatica:
 Created mappings which integrate data from two different sources and then load it to a
common platform.
 From the accredited layer, data was loaded into the Reporting layer for the report developers.
Oracle:
 Converted all the information stored in Access & DB2 into Oracle PL/SQL.
Actuate:
 Created Actuate reports using concepts like:
o Customizing the Active Portal
o Scheduling the reports in the iServer using Autosys
o Working with IDAPI
o Triggering of the reports based on specific events
o Parameters in the main query
o Result sets from one main query used as parameters for the subsequent queries
o Grouping in the reports so that the grouped data does not get repeated
o Access to all the reports in the iServer from a single platform based on their hierarchies
o Best practices used while migrating from one environment to another
 Developed Actuate eSpreadsheet reports.
 Developed an Actuate team to assist the client with their reporting needs.
11. Project Experience
Project : Claims Reporting
Client : Large P&C Insurance Company
Duration : May’ 07 – Oct’ 07
Domain : Insurance (Property & Casualty)
Technology : MS SQL Server 2005, Oracle 10g, Data Warehousing, HTML, Java Script
Tools Used : Actuate 7, Informatica 7.2, Business Objects 6.5, TOAD, Quality Centre, AQUA
Project Description: Client is one of the largest and oldest providers of Property and Casualty insurance
products in the United States. Business analysts and decision-makers had to rely heavily on manual efforts to
collect data from numerous stand-alone Claims data repositories to produce meaningful reports. Business did
not have an ad-hoc querying capability to analyze and verify the data presented in the reports and had difficulty
in accessing/viewing additional Claims data that was not presented in the Claims reports. Cognizant built the
Claims Analytics data mart, which serves as the one-stop information repository for the strategic
decision-making process.
The Claims data mart holds Claims data extracted from HCS, the core claims system. An
Operational Data Store was built to produce 13 compliance reports using the Actuate Reporting Tool. The project
also designed the appropriate hub and mart structures to support the two reporting requirements, which included
ad-hoc capabilities using Business Objects query tools.
Responsibilities:
Actuate:
 Created Actuate reports using concepts like:
o Report bursting
o Browser scripting control
o Reading the DB connection details from INI files
o Granting access to various report users in the Actuate iServer
o Creating a customized requestor page using HTML to redirect the user to various reports
Informatica:
 Created Low Level Design document for Informatica Mappings.
 Created Informatica mappings including most of the transformations like Filter, Source
Qualifier, Joiner, Lookup, Update Strategy, and Normalizer according to the low level design
documents.
Business Objects:
 Created a Universe to provide a platform for ad-hoc reporting.
 Created Desktop Intelligence reports from the developed universe.
12. Project Experience
Project : Billing Analytics
Client : Large P&C Insurance Company
Duration : Feb’ 07 – Oct’ 07
Domain : Insurance (Property & Casualty)
Technology : MS SQL Server 2005, Oracle 10g, Data Warehousing
Tools Used : Informatica 7.2, Business Objects 6.5, Actuate 7.0, TOAD, Quality Centre, AQUA
Project Description: Client is one of the largest and oldest providers of Property and Casualty insurance
products in the United States. Business analysts and decision-makers had to rely heavily on manual efforts to
collect data from numerous stand-alone data repositories to produce meaningful reports. Business was not able
to analyze and view detailed billing transaction data (receivables & payments) for all billing types by selected
policy-level dimensions such as line of business, policy term, division (PL/CL), underwriting tier, etc. Business
did not have the capability to utilize the historical billing data available in the data repositories to perform trend
analysis using past payment behaviors to predict future behaviors, to analyze payment cycles to identify
opportunities for improved cash flow, or to analyze the relationship between billing plans and payment patterns for
all billing types. Cognizant built the Billing data mart, which serves as the one-stop information repository
for the strategic decision-making process.
The Billing data mart would contain the core billing data (receivable & payment transaction details)
extracted from three different source systems namely Policy Management System (PMS), Payroll Deduct and
Electronic Fund Transfer (EFT) for Personal and Commercial lines of business.
Responsibilities:
Business Objects:
 Created a Universe to provide a platform for ad-hoc reporting.
 Created Desktop Intelligence reports from the developed universe.
Informatica:
 Created Low Level Design document for Informatica Mappings.
 Created Informatica mappings including most of the transformations like Filter, Source
Qualifier, Joiner, Lookup, Update Strategy, and Normalizer as per the low level design
documents.
Date:
Place: Krishna Kumar Sharma