CURRICULUM VITAE
Sandeep Grandhi
E-Mail: sandeepgrk@gmail.com
Mobile: +91-9790856775 / +91-9985942496
PROFESSIONAL SYNOPSIS:
• 6 years 6 months of total IT experience in analysis, design, development and ETL
implementation using data warehousing technologies.
• Currently associated with Infosys as a Technology Analyst.
• Traveled onsite for project deliverables.
• Holding a USA B1/B2 visa valid for 10 years.
• Previously employed with Tech Mahindra.
• Interacted with clients for requirements gathering, verification and analysis.
• Good knowledge of data warehousing concepts.
• Adept at developing test plans from requirement documents and functional
documents.
• Exceptional communication, collaboration and team-building skills, with proficiency at
grasping new technical concepts quickly and applying them productively.
• Quick learner with the ability to meet tight deadlines.
• Worked on performance improvement and UNIX shell scripting.
SKILLS SET:
Data Warehouse/ETL Tool : Informatica 9.x
Databases : Oracle 10g, Netezza
Languages : SQL, PL/SQL
CRM : Salesforce.com (SFDC)
Tools : TOAD, Autosys scheduler
Operating Systems : Microsoft Windows 95/98/2000/XP/7/8, UNIX
Business Domain : Automobile finance, Banking, Oil and Gas
Achievements:
• Received a positive incident for dedication towards the project.
• Received appreciation from the client for my work.
• Received a POB (Pat on the Back) award from the customer for the hard work put in to
meet deadlines.
• Received a BRAVO award for delivering the project to the customer within the timeline.
WORK EXPERIENCE:
• Worked at Tech Mahindra as a Software Engineer from June 2010 to Nov 2013.
• Working at Infosys as a Technology Analyst since Nov 2013.
PROJECTS COORDINATED:
Project Name : STARS to SFDC Migration.
Client : Automobile finance company
Role : ETL Developer
Duration : Feb 2016 to date
Description:
This project involves the migration of the CRM platform from STARS to Salesforce.com (SFDC).
The STARS system will be completely replaced by the Salesforce.com cloud service.
Roles and Responsibilities:
a) Involved in requirement gathering and business analysis.
b) Created mappings using various tools in Informatica Designer, such as Source Analyzer,
Warehouse Designer, Mapplet Designer and Mapping Designer.
c) Worked with Salesforce objects as sources and targets in mappings.
d) Implemented Change Data Capture (CDC) at session level for extracting incremental
data from Salesforce.
e) Wrote SOQL (Salesforce Object Query Language) queries to validate the data between
Salesforce and the targets.
f) Extracted data from Salesforce.com (SFDC) and loaded it into Oracle for data warehousing.
g) Extensively used mapping variables and parameters for delta loads and file
creation.
h) Developed ETL jobs to generate files for the downstream applications.
i) Wrote UNIX scripts to automatically send mail alerts with job statistics and
success/failure status.
j) Wrote shell scripts to run workflows from the UNIX environment.
k) Troubleshot Salesforce-related issues with the help of the Salesforce support
team.
l) Performed performance tuning at source, target, mapping and session level.
m) Created Autosys JIL (Job Information Language) scripts to schedule jobs through the
Autosys tool.
n) Performed unit testing, integration testing and system testing of Informatica
mappings.
o) Involved in warranty production support and ensured ETL processes worked as
expected.
p) Provided knowledge transfer to the support team on the process and flow.
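A minimal sketch of the kind of UNIX wrapper described above: run an Informatica workflow and mail the outcome. The service, folder, workflow and mailing-list names are illustrative placeholders, not the project's real ones, and the actual `pmcmd`/`mailx` calls are shown as comments so the sketch runs without an Informatica environment.

```shell
#!/bin/sh
# Hedged sketch: run a workflow from UNIX and alert on success/failure.

ALERT_TO="etl-support@example.com"   # hypothetical distribution list
LOG=/tmp/wf_stars_to_sfdc.log

notify() {
    # $1 = subject, $2 = body; in production this would pipe to mailx:
    #   echo "$2" | mailx -s "$1" "$ALERT_TO"
    echo "ALERT: $1 - $2"
}

run_workflow() {
    # $1 = workflow name. The real call would be a pmcmd invocation, e.g.:
    #   pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$PMUSER" \
    #         -p "$PMPASS" -f STARS_TO_SFDC -wait "$1" >"$LOG" 2>&1
    # Stubbed with true so the sketch is runnable stand-alone.
    true
}

WF=wf_load_sfdc_accounts              # hypothetical workflow name
if run_workflow "$WF"; then
    notify "SUCCESS: $WF" "workflow completed; log at $LOG"
else
    notify "FAILURE: $WF" "workflow failed; see $LOG"
fi
```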
Tools & Environment:
Informatica Power Center 9.5.0, Oracle 11g, TOAD, UNIX, Salesforce Workbench and
Autosys.
Project: CSDR (Compliance Surveillance Data Repository)
Client: Bank of America (Nov 2013 to Jan 2016)
CSDR is the single source data repository for all ML and BAC compliance requirements.
Data received from various domain feeds is loaded into CSDR, and multiple downstream
systems consume CSDR for various business requirements such as trade surveillance and
reporting. The data sources are mostly flat files pushed or pulled via an FTP server,
database tables, or message queues, which an ETL job dumps into the staging database.
Staging data is validated, transformed per the business logic, and loaded into fact and
dimension tables accordingly in order to build CSDR.
Responsibilities:
• Involved in understanding business requirements and devising solutions for
existing production issues.
• Designed mappings using dynamic lookups for SCD Type-2 dimensions.
• Used lookup persistent cache properties to improve performance of fact load jobs.
• Worked on session partitioning to improve workflow performance.
• Worked on the Autosys scheduling tool.
• Used the database's exchange-partition feature to improve performance.
• Created UNIX shell scripts to validate and process input flat files.
• Coordinated between onsite and offshore teams.
• Day-to-day task analysis and allocation, supporting the team on technical doubts/issues,
reviews and status reporting.
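A flat-file validation step like the one listed above could look roughly as follows. This is a sketch only: it assumes a pipe-delimited feed with a fixed column count, and the file names, paths and column count are illustrative, not the project's actual layout.

```shell
#!/bin/sh
# Hedged sketch: reject empty feeds and records with a wrong field count.

EXPECTED_COLS=5                      # assumed layout of the feed
validate_feed() {
    f=$1
    [ -s "$f" ] || { echo "REJECT: $f is empty or missing"; return 1; }
    # count records whose delimited field count differs from the layout
    bad=$(awk -F'|' -v n="$EXPECTED_COLS" 'NF != n {c++} END {print c+0}' "$f")
    if [ "$bad" -gt 0 ]; then
        echo "REJECT: $f has $bad malformed record(s)"
        return 1
    fi
    echo "OK: $f"
}

# usage sketch with throwaway sample files:
printf 'a|b|c|d|e\n1|2|3|4|5\n' > /tmp/good.dat
printf 'a|b|c\n' > /tmp/bad.dat
validate_feed /tmp/good.dat
validate_feed /tmp/bad.dat || true   # demonstrate a rejection
```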
b) Project GRID: Global Repairs Information Delivery (One and a Half Years)
The GE Repairs business is implementing an SAP solution to effectively manage the global
repair business for the long-term future. Today's repair business operates in 60 locations
across 13 countries and spans 3 product lines: heavy-duty gas turbines,
steam/generators, and industrial.
Consequently, the new GRID data warehouse must be prepared to accommodate and
manage the new SAP source data where required for reporting.
The GRID data warehouse uses multiple source systems, such as SAP IDOC, SAP ABAP,
COMPAS, JDE, CS DW, CIP, FDM, Global DIMS, RCS, MS Excel, etc. More than 150
Informatica mappings and 100+ Cognos reports need to be built on the GRID DWH, scheduled,
and emailed to users whenever required. The solution also includes the ability to export
report data to PDF, Excel, CSV, etc.
Contribution
As a Team Member, was responsible for
• Extensively involved in understanding the business and functionality of the system, the
source systems, and requirement analysis.
• Developed Informatica mappings to extract data from different sources such as
SAP IDOC, SAP ABAP tables, SQL Server, Oracle, flat files, and Salesforce.
• Developed a number of complex Informatica mappings to implement the business logic.
• Created tables, views, indexes, and SQL scripts to load data into the tables in the
development environment.
• Provided production support by troubleshooting session logs, bad files and error log
tables and debugging the mappings at the time of migration from Informatica 8.1 to 8.6.
• Prepared quality documents for testing purposes.
c) Project SARAI Overview (One Year)
The purpose of this project is to:
• Automate partitioning of Orders, Sales, Receivables & Cost to P&Ls
• Create a structure that supports P&L and regional views
• Support the Parts & Repairs technology view (Gas, Steam & Industrial)
• Provide reporting by job, tied to the G/L
The Informatica implementation supports Project SARAI by performing data
transformation between CIP (Central Invoice Processing) and the Oracle GL. CIP receives
invoice data from the COMPAS ERP system. COMPAS is a job-costing system for the GE
Repairs & Parts Fulfillment Center, used by all US-based repair facilities.
Contribution
• Understood the project requirements and functionality.
• Involved in creation of source-to-target mappings.
• Created mappings using the prepared design specifications.
• Created sessions and workflows using Workflow Manager.
• Optimized/tuned mappings for better performance and efficiency.
• Thoroughly reviewed the mappings, sessions and workflows.
• Responsible for migration of code from the development environment to QA and from QA
to production.
• UAT and implementation.
d) Project SFDC Overview (One Year)
This project, SFDC (salesforce.com), is about storing in a staging area information
whose source is Salesforce.
• SFDC and the staging area are integrated through Informatica mappings.
• The interface runs daily incremental loads to move daily updates from SFDC to the
staging area.
• SFDC and the staging area are always in sync, so the staging area holds an exact
replica of the data.
• Loads from the staging area to the different DWHs are based on individual system needs.
From the staging area, downstream applications pull data according to their requirements.
A mapping is designed to generate files used by the MCS team.
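The daily-incremental bookkeeping behind loads like this is often a watermark file: record the end of the last successful extract and use it as the next CDC start timestamp. The sketch below illustrates that idea only; the path is illustrative and the actual load job (which would receive the window, e.g. via a parameter file) is stubbed out.

```shell
#!/bin/sh
# Hedged sketch: maintain a last-run watermark for daily delta extracts.

WATERMARK=/tmp/sfdc_last_run.txt     # illustrative path

# default to a very old start time on the first run
[ -f "$WATERMARK" ] || echo "1970-01-01T00:00:00Z" > "$WATERMARK"

CDC_START=$(cat "$WATERMARK")
CDC_END=$(date -u +%Y-%m-%dT%H:%M:%SZ)

echo "extract window: $CDC_START .. $CDC_END"
# the real extract/load would run here with CDC_START/CDC_END;
# stubbed with true so the sketch is runnable stand-alone
if true; then
    echo "$CDC_END" > "$WATERMARK"   # advance watermark only on success
fi
```

Advancing the watermark only after a successful run means a failed load is simply re-extracted from the same start time the next day, so no updates are skipped.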
Contribution
• As the source is an application (Salesforce), worked on features such as CDC Start
Timestamp, CDC End Timestamp, SOQL Filter Condition and CDC Time Limit.
• Created mappings using the prepared design specifications.
• Provided production support for production issues.
• Prepared quality documents for testing purposes.
• Thoroughly reviewed the mappings, sessions and workflows.
• Extensively involved in understanding the business and functionality of the system,
the source systems, and requirement analysis.
• Responsible for migration of code from the development environment to QA and from
QA to production.
e) Automation of Rejection Report Preparation (Four Months)
This involves automating the manual process of preparing a report named the Rejection
Report. The manual process involved executing macros written in an Excel sheet, firing a
few queries against the database, extracting the data, and finally consolidating it into
the complete report.
Contribution
• Understood the Excel macros, which contain the logic for automating the manual
process.
• Designed the mappings to implement the same logic used in the macros.
• Coordinated with the reporting team, who designed the report in Cognos based on the
tables loaded by the ETL design.
• Tested the results and presented them to the customer for UAT.
• Prepared the technical design documents and mapping documents.
EDUCATION DETAILS
• MCA (Master of Computer Applications), 2011
• Graduation: B.Sc. (Computers), 2008
PERSONAL DETAILS:
Name : Sandeep Grandhi
Date of Birth : 01-11-1987
Email : sandeepgrk@gmail.com
Mobile : +91-9790856775