Venkata Ramana Sreepathi
Email: ramanasreepathi999@gmail.com
Ph.: +91-7396553280
PROFILE
♦ Result-oriented professional with 14+ years of extensive experience as a Build and Release/DevOps Engineer, spanning design and development, project management and team management.
♦ Working experience with the version control tools Subversion (SVN) and Git.
♦ Experience with Continuous Integration, used to increase productivity.
♦ Good experience with Jenkins, used to schedule jobs as per requirements.
♦ Experience adding plug-ins to Jenkins to extend its functionality.
♦ Good work experience in the end-to-end build, deployment and configuration process, from the local QA environment through UAT and Production.
♦ Performed smoke tests in all environments after a new build was deployed.
♦ Good experience in deploying and troubleshooting files on web servers and application servers.
♦ Working experience with Ant and Maven build scripts.
♦ Knowledge of writing shell scripts to execute batches of commands.
♦ Deployed files to all environments using shell scripts.
♦ Ability to accept challenges, learn and grow in a professional manner.
♦ Knowledge of database deployment, schemas and how to fetch records.
♦ Knowledge of JIRA as a change management tool.
♦ Good knowledge of methodologies such as Agile.
♦ Competencies across: Customer Relationship, Agile Methodologies, Technology Transfer, Business Analysis, Onsite/Offshore Leadership, Project Scheduling & Tracking, Team Management, Process Improvement, Technical Architecting, and Motivation/Feedback.
♦ Superior track record in project management activities encompassing planning, budgeting, resource handling, risk management and team management & mentoring. Has often demonstrated superior technical leadership.
♦ Proven abilities in managing projects, stakeholder interests, business communication;
designing the IT architecture and implementing end-to-end IT solutions in line with core
business objectives.
♦ Functional knowledge of Information Technology and Business Domains.
♦ Experience in people and talent management for highly technical employees, from performance management through career development. Drove highly technical teams into very effective and productive units.
♦ Highly experienced in conducting technology/ product evaluation, technical reviews of
business applications/ under-development products, and identifying risks & problems in the
architecture across the life-cycle.
♦ Domain Experience: Capital Markets, Investment Banking, etc.
♦ An impressive communicator with strong leadership, coordination, relationship management,
analytical and team management skills. Comfort in interacting with people across hierarchical
levels for ensuring smooth project execution as per client specifications.
♦ Expertise in data warehousing using Informatica PowerCenter / PowerMart in UNIX and
Windows Environments for medium to large enterprise data warehouses.
♦ Extensive experience in data conversion principles and concepts using different ETL Tools and
different versions of Oracle Databases.
♦ Hands-on experience in Extraction, Transformation and Loading (ETL) development as a Sr. Informatica Developer using Informatica PowerCenter 7.x/8.x product versions, with good analytical skills.
♦ Gained exposure to the UNIX working environment, using the vi editor, writing shell scripts, etc.
♦ Experience in SQL, PL/SQL Procedures, Functions, Packages and Database Triggers in Oracle,
DB2, SQL Server 2008.
♦ Mapping requirements and coordinating in developing and implementing processes in line
with pre-set guidelines.
♦ Monitoring the overall functioning of processes, identifying improvement areas and
implementing adequate measures to maximize operational effectiveness.
♦ Ensuring continuous interaction with vendors so that areas of concern can be addressed for improved service levels.
♦ Managing various project related activities involving project planning, execution and
management in tune with the core business objectives (including risk management, effort /
time / cost estimation).
♦ Administering progress as per scheduled deadlines for various tasks. Taking necessary steps to
ensure project completion within time, cost and effort parameters. Executing project plans
within pre-set budgets and deadlines.
♦ Maintaining close coordination between Onsite and Offsite teams for ensuring seamless
delivery of the project as per scheduled timelines.
♦ Resolving all support and operational issues in liaison with project managers & business
group.
♦ Achieving customer satisfaction by ensuring compliance with service quality norms and
building the brand image by exceeding customer expectations.
SKILL SET
Operating Systems : Windows 9x/NT/2003/XP, UNIX, Linux (Red Hat)
Version Control System : Subversion (SVN), Git and Perforce.
Build tools/Script language : ANT, Maven and Shell Scripting
Programming Languages : Java, PL/SQL, Perl, Unix Shell Scripting, SAP Script
Databases : SQL Server, MS Access, Oracle 9i/10g/11g, DB2 8.x/9.x
CI Tools : Jenkins
Application Servers : Tomcat, WebSphere
Data Warehouse ETL Tools : Informatica 7.x/8.x/9.x, Erwin, MSBI (SSIS, SSRS, SSAS).
ERP : SAP R/3 3.1H/ 3.1I/ 4.6C/ 4.7 (SD, MM, HR).
Project Management Tools : Microsoft Project, Rational Portfolio Manager
EDUCATION
 Master of Computer Applications (M.C.A), 1998, Osmania University
PROFESSIONAL EXPERIENCE
May ’15 – Till Date    DevOps Engineer    VelagaSoft Solutions Private Limited
Project: Humana, Healthcare & Life Science, USA
Humana lab data integration falls under the Clinical IT area of EDW development, as part of data integration. The objective of this project is to integrate lab data from different lab vendors into the Humana enterprise data warehouse (EDW). This will help Humana better manage lab claims and HEDIS data reporting. The data is used both for HEDIS scoring and as input for modelling, so Humana can contact members in the early stages of disease and help them make better health choices sooner, allowing disease to be stopped or slowed to a manageable level.
Key Highlights
• Worked within the AWS cloud for integration processes.
• Performed DevOps for Linux and Windows platforms.
• Focused on automation and integration.
• Monitored developed applications and fixed bugs.
• Created a continuous delivery pipeline from the ground up, built with Git, Puppet and Jenkins, for Target's Finance Integration Team.
• Deployed and configured WAS and Tomcat applications.
• Each Java application is automatically built, packaged and tested with Git hooks and then deployed to the various environments.
• Maintained Subversion repositories for DevOps environment automation code and configuration.
• Developed Bash scripts to redact sensitive data from Apache access and error logs using a sed expression, and to deploy WAR files to environments in parallel (see the sketch after this list).
• Developed automation and deployment utilities using Bash and Python.
• Wrote custom monitoring and integrated monitoring methods into deployment processes to
develop self-healing solutions.
• Created automation and deployment templates for relational databases, including MS SQL Server and MySQL.
• Scheduled volume snapshots for backup, performed root cause analysis of failures, documented bugs and fixes, and scheduled cluster downtime and maintenance.
• Wrote Puppet manifest files to deploy automated tasks to many servers at once.
• Provided direct server support during deployment and general production operations.
• Worked with developers to ensure new environments both met their requirements and conformed
to industry-standard best practices.
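For illustration, a minimal sketch of the log-redaction script described above, assuming hypothetical log locations and redaction patterns (e-mail addresses and card numbers), not the originals:

#!/usr/bin/env bash
# Minimal sketch: redact sensitive values from Apache access/error logs
# before they are shared. LOG_DIR and the patterns are illustrative
# assumptions.
set -euo pipefail

LOG_DIR=${1:-/var/log/httpd}   # hypothetical log location

for log in "$LOG_DIR"/access_log "$LOG_DIR"/error_log; do
    [ -f "$log" ] || continue
    sed -E \
        -e 's/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+/<REDACTED-EMAIL>/g' \
        -e 's/[0-9]{13,16}/<REDACTED-PAN>/g' \
        "$log" > "$log.redacted"
done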
Nov ’12 – April ’15    Build & Release Engineer    JPMorgan (JPMC Pvt. Ltd.)
Project: JPMorgan VCG IRIS-IVoRI
VCG IRIS-IVoRI is a data warehouse financial project that provides strategic reporting for the Independent Revaluation Infrastructure Solution Program for Valuation Control Group users. The IVoRI project's main purpose is to produce Daily and Monthly IPV/FVA (Independent Price Value/Fair Value Adjustment) model price scenario reporting from all the cross-application source systems, such as Fixed Income, Equities, Rates, Credit and FX Rates, with entitlements for the respective source-system VCG users to view the data and sign off the prices for their respective securities positions. IVoRI is the IB's independent control function, responsible for ensuring inventory is held at fair value. The global group owns the IB's price and parameter verification process, manages fair value adjustments to account for uncertainty in trading books and faces regulators for all valuation matters. JPMorgan is not a market leader with regard to independent
valuation infrastructure capabilities. Current processes are supported by a collection of
fragmented and manually controlled infrastructure solutions that have evolved over time in
response to changing market and regulatory demands. A more sophisticated framework is
required to respond to expected future increases in demand (internal & external) and to position
JPMorgan as the market leader in a competitive environment where 95% of top-tier firms are
planning to invest in their independent valuation infrastructure.
Tools Used: SVN, Informatica Power Center 9.5.0, Athena, Aqua Studio, Oracle, Sybase, TortoiseSVN, XML, Erwin, PL/SQL, Windows 2003/2008, JIRA, Jenkins, Ant, UNIX, PowerShell scripts
Key Highlights
• Performed branching and merging of the code.
• Created release branches.
• Responsible for labeling and building the code to be deployed.
• Performed build-automation scheduling and built service patch builds and scripts.
• Wrote implementation plans and drove the implementation during releases.
• Responsible for generating installation procedures and developing source code control systems.
• Conducted post-implementation reviews and evaluated release quality.
• Tracked artifacts and build and deployment information.
• Drafted and maintained written procedures for build and release management.
• Coordinated with the testing team once a build was deployed to the testing environment.
• Coordinated build errors and issues with the development team.
• Responsible for coordinating with development and quality assurance staff to develop the infrastructure required for automating the release process.
• Responsible for tracking metrics related to each release.
• Planned software deployment releases for in-house applications.
• Maintenance, support and enhancements on Jenkins.
• Participated in daily DevOps and Release Scrum meetings.
• Conducted technical and functional training sessions for new team members. Applied CI/CD and DevOps concepts to automate the build and deployment process for all applications using various tools.
• Involved in building, deploying, maintaining and troubleshooting various applications across testing and production environments.
• Applied performance tuning techniques: as the team's Informatica expert, identified areas to fine-tune some of the IVoRI ETL processes and delivered great results in processing all RMS feed files for both Daily and Monthly workflows.
• Developed FXClose logic for cases where the base currency is not USD, converting the other currencies at the appropriate rate to the base currency value so that the measure balances in the Daily and Monthly reports are accurate.
• Performed testing of the Unix batch, Informatica ETL and Athena reporting tasks to produce release evidence, and worked within the release management life cycle for changes and new programs in the project.
• Migrated the anonymous FTP file transfer (KPI Metrics to Conquest) to SFTP: worked with the Keon UNIX group on creating user IDs and function IDs and their association, established a connection so the remote machine recognizes the IVoRI function ID/user ID, and changed the code from FTP to SFTP calls (a sketch of the key-based transfer follows this list).
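A minimal sketch of what the FTP-to-SFTP switch might look like, assuming key-based authentication and hypothetical host, function-ID and file names:

#!/usr/bin/env bash
# Minimal sketch: push the KPI metrics extract over sftp instead of
# anonymous ftp. Host, function id and paths are illustrative assumptions.
set -euo pipefail

HOST=conquest.example.com          # hypothetical remote host
FUNC_ID=ivori_func                 # hypothetical function id registered with the Keon group
EXTRACT=/data/out/kpi_metrics.dat  # hypothetical extract file

# Batch mode (-b) makes sftp non-interactive so the job can run under a
# scheduler; authentication relies on the ssh key pair set up for the
# function id, replacing the old anonymous ftp login.
sftp -b - "$FUNC_ID@$HOST" <<EOF
put $EXTRACT /inbound/
bye
EOF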
May ’12 – Jul ’12    ETL Tech Lead – Consultant    Zieta Technologies Pvt. Ltd., India
Project: BOI
Key Highlights
• Involved in ETL Informatica design document analysis.
• Modified the source system data flow workflows for loading into ODS.
• Tested and Debugged workflow of the data from Source of Record to Staging area loading.
• Created ETL Loaders for EDW loading.
• Made Informatica mappings PDO (pushdown optimization) compatible for Savings Accounts Teradata warehouse processing.
Tools Used: Informatica Power Center 9.1.0, Toad, Oracle, Teradata, UNIX, XML, Erwin, PL/SQL, Windows
May ’12 – Jul ’12    ETL Tech Lead – Consultant    Trinus/Mannara Technologies Pvt. Ltd., India
Tools Used: Informatica Power Center 9.1.0/IDQ, Toad, Teradata, Oracle, UNIX, XML, Erwin, PL/SQL, Windows 7 (L1/L2/L3 support)
Key Highlights
• Developed ETL estimation guidelines as an application complexity matrix (Very Simple, Simple, Medium, Complex, Very Complex) for determining each module's base hours for assessment, design, build (coding, unit test), test (integration, QA) and implementation, plus packaging from Dev to QA and finally to Production, summing the grand total hours to complete the project.
• Created a daily status document template listing tasks, their status and remarks, along with a log of last-update date, start date, code-complete date and package delivery date, to maintain the project status report.
• Performed impact analysis on change requirements for every coding cycle of the project, making sure we received correct requirements.
• Developed and completed the design of all membership agreement and party agreement Informatica mappings for Wyndham Resorts.
• Created all membership class and membership status Informatica sessions and developed the target Oracle scripts in the database.
• Created ETL test scripts to check that the requirements met in QA match the integration and unit testing from the DEV environment.
• Built the package for final delivery to the client production environment, along with functional and technical documentation.
• Provided production support by monitoring the processes running daily. Also served as an L1/L2/L3 support analyst for multiple Informatica and Marketing and Operations Data Mart ETL systems, Desktop Client and Administrator (DA).
Dec ’09 – Jan ’12    ETL Tech Lead – Consultant    Nomura Structured Finance Service Pvt. Ltd., India
Tools Used: Informatica Power Center 8.6.1/IDQ, PowerExchange 8.6.1, Teradata, Toad, DB2, SQL Server 2008, Oracle, UNIX, XML, Erwin, PL/SQL, Windows 2003/XP (L1/L2/L3 support)
Key Highlights
• Played a major role in creating the Fixed Income Repo Collateral Intraday Informatica ETL loaders for the different credit desks (CMO, CMBS, ARM pools, ABS) responsible for administering transactions, including collateral allocation, marking securities to market and substitution of collateral.
• Efficiently developed a stored procedure to load the cube tables for generating the stale price report.
• Instrumental in creating the Informatica ETL swaps securities load into the Risk Data Warehouse to measure Delta, Gamma and Vega for these reports.
• Developed an Informatica ETL loader that captures options securities into the securitized data warehouse, covering both standardized options traded on an anonymous exchange among the general public and over-the-counter options customized ad hoc to the buyer's requirements.
• Successfully created an Informatica ETL derivatives loader to load all the fixed income derivative products, such as options, swaps, futures contracts and forward contracts, into the securitized data warehouse.
• Developed a Unix shell script to run individual Informatica workflows in a loop over specified start and end dates from the command line.
• Created a Unix shell script to send the Credit Risk Cube load-completion notification e-mail and created an Autosys job to schedule the process.
• Changed the shell script automation of data synchronization (PAFICASH and SAFICASH data) between the production database and the backup database.
• Changed the Apex loader to add new feeds; previously all live EOD repo trades came from VERTIGO.
• Created Informatica Loader, DB2 Table and Autosys Scheduler for loading Totoro NGFP US TFF
CRM by position trade data.
• Changed all BTElls to BIngs in the EOD and intraday Autosys jobs on the Windows server, moving the file paths from one directory to another and updating the Autosys path to point to the new directory.
• Made performance improvements in all the EOD and intraday data-loading jobs: created indexes on date fields, modified the SQL queries inside the Informatica workflows to use those indexes, and applied performance tuning methodologies to all the Informatica mapping transformations and workflows.
• Created a pre-validation shell script to compare the source data header row with the target table, ensuring the number of columns in the target table equals the number of fields in the source data file (see the validation sketch after this list).
• Designed and developed a shell script for post-load data reconciliation, checking source data file row counts against target database table row counts.
• Created a number of Windows batch scripts for reports scheduled through Autosys.
• Created a batch script that loads the Dev folder files used by the Autosys box from the Prod box; the script takes a date parameter and copies the files from the Prod box to the Dev box.
• Provided production support by monitoring the processes running daily. Also served as an L1/L2/L3 support analyst for multiple Informatica ETL systems for the Front Office Fixed Income source system, Unix operations, database operations and the Autosys systems.
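A minimal sketch of the pre-validation and post-load reconciliation checks referenced above, assuming a pipe-delimited source file with a header row; the file, table and Oracle connection names are illustrative assumptions:

#!/usr/bin/env bash
# Minimal sketch: (1) pre-validation - the source header's field count must
# match the target table's column count; (2) post-load reconciliation - the
# file's data row count must match the table's row count. All names are
# illustrative assumptions.
set -euo pipefail

SRC_FILE=/data/in/positions.dat   # hypothetical '|'-delimited source file
TABLE=STG_POSITIONS               # hypothetical target table
CONN="etl_user/etl_pass@RISKDB"   # hypothetical Oracle connect string

count_sql() {  # run a COUNT query through sqlplus and echo the bare number
    sqlplus -s "$CONN" <<EOF | tr -d '[:space:]'
set heading off feedback off pagesize 0
$1;
exit
EOF
}

src_fields=$(head -1 "$SRC_FILE" | awk -F'|' '{print NF}')
tgt_cols=$(count_sql "select count(*) from user_tab_columns where table_name = '$TABLE'")
[ "$src_fields" -eq "$tgt_cols" ] || { echo "Pre-validation failed: $src_fields fields vs $tgt_cols columns"; exit 1; }

# ... the actual load (e.g. the Informatica workflow) would run here ...

src_rows=$(( $(wc -l < "$SRC_FILE") - 1 ))   # minus the header row
tgt_rows=$(count_sql "select count(*) from $TABLE")
[ "$src_rows" -eq "$tgt_rows" ] || { echo "Reconciliation failed: $src_rows file rows vs $tgt_rows table rows"; exit 1; }
echo "Validation and reconciliation passed"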
Dec ’09 – Jan ’12    ETL Tech Lead – Consultant    Nomura Structured Finance Service Pvt. Ltd., India
Tools Used: Informatica Power Center 8.1, PowerExchange 8.1, Oracle 9i/10g, Teradata, Toad, DB2,
UNIX, XML, Erwin, PL/SQL, Windows 2003/XP.
Key Highlights:
• Analyzed system documentation, such as the requirements document, gap analysis and user interface specifications, to develop a technical document of the business flow, and developed and executed the mappings.
• Instrumental in writing PL/SQL procedures for processing business logic in the database, pulling data and exporting it into databases. Performance-tuned SQL queries for better performance.
• Pivotal in checking the data flow, extensively using SQL queries to extract data from the database, and conducted data quality profiling and Oracle explain plans.
• Efficiently worked with different databases such as Oracle, SQL Server and used Informatica to
extract data.
• Successfully developed a UNIX shell script to load the environment variables needed to run PowerMart jobs, using command tasks to invoke PL/SQL stored procedures that extract data into temp tables (a pmcmd-based sketch follows this list).
• Created UNIX shell scripts to load data from flat files into databases and from databases into flat files, and to send status update e-mails.
• Developed a number of complex Informatica mappings, mapplets and reusable transformations using multiple sources, transformations and targets from different databases and flat files, to facilitate one-time and weekly loading of data for 1-800, TROOP and COB.
• Responsible for monitoring all scheduled, running, completed and failed sessions using Workflow Monitor. Involved in debugging failed mappings using the Debugger to validate the mappings and gain troubleshooting information about data and error conditions.
• Modified all the existing Informatica Mappings by applying performance tuning techniques for
better performance.
• Installed, configured, troubleshot and administered Informatica 8.1 Server.
• Created Repository users, groups, folders, and logs and maintained them.
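A minimal sketch of such a wrapper: source the environment the PowerMart/PowerCenter jobs need, then start a workflow with pmcmd for each business date in a range. The env file, service, domain, folder, workflow and parameter names are illustrative assumptions; requires GNU date.

#!/usr/bin/env bash
# Minimal sketch: load the Informatica environment, then run one workflow
# per business date between START and END. Every name below is an
# illustrative assumption.
set -euo pipefail

. /opt/informatica/etl_env.sh   # hypothetical env file (INFA_HOME, PATH, INFA_USER, INFA_PASS)

START=${1:?usage: run_wf.sh <start YYYY-MM-DD> <end YYYY-MM-DD>}
END=${2:?usage: run_wf.sh <start YYYY-MM-DD> <end YYYY-MM-DD>}

d="$START"
while [ "$(date -d "$d" +%s)" -le "$(date -d "$END" +%s)" ]; do
    # Parameter file scoped to the workflow, following Informatica's
    # [folder.WF:workflow] section convention.
    cat > /tmp/wf_params.txt <<EOF
[FIN_DW.WF:wf_daily_load]
\$\$RUN_DATE=$d
EOF
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f FIN_DW -paramfile /tmp/wf_params.txt \
        -wait wf_daily_load
    d=$(date -d "$d + 1 day" +%F)
done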
Mar ’08 – May ’09    Informatica Developer    Relycom Inc., Plainsboro, NJ
Key Highlights:
• Involved with the team in dimensional modelling of the data warehouse, designing the business process, grain, dimensions and measured facts in Erwin 4.0.
• Analyzed system documentation, such as the requirements document, gap analysis and user interface specifications, to develop a technical document of the business flow, and developed and executed the mappings.
• Created a sales workflow for changed data, scheduled every hour for sales rep data, using Oracle CDC with PowerExchange to register the sales dimension table.
• Checked the data flow, extensively using SQL queries to extract data from the database, and conducted data quality profiling and Oracle explain plans.
• Wrote PL/SQL procedures for processing business logic in the database and performance-tuned SQL queries.
• Developed number of Complex Informatica Mappings, Mapplets and Reusable Transformations
using multiple sources, transformations and targets from different databases and flat files to
facilitate one time, Daily, Monthly and Yearly Loading of Data.
• Created mappings to pull data from SAP source tables and load it into the Oracle database.
• Responsible for monitoring all scheduled, running, completed and failed sessions. Involved in debugging failed mappings using the Debugger to validate the mappings and gain troubleshooting information about data and error conditions.
• Modified all the existing Informatica Mappings by applying performance tuning techniques for
better performance.
• Documented processing times for each module, developed test cases and used them to run through each process. Performance-tuned the matching parameters based on test results.
• Involved in FTPing files to and from the server.
• Developed a UNIX shell script to load the environment variables needed to run PowerMart jobs via the pmcmd command line (similar to the pmcmd sketch above).
• Tested the data warehouse after loading, using the Business Intelligence environment reports.
Tools Used: Informatica Power Center 7.1/8.1, PowerMart, PowerExchange/CDC, SAP ALE/IDOC, DIM, UNIX, Erwin, PL/SQL, Toad, Oracle 9i/10g, SQL Server, Cognos, DB2, XML, Windows 2003.
Oct ’05 – Nov ’07    ETL Developer    Logic Bytes, NJ
Key Highlights:
• Analyzed and interpreted the business requirements for a data warehouse system for the
Pharmacy system.
• Used Type 1 and Type 2 slowly changing dimensions to update and insert information in the data warehouse (see the sketch after this list).
• Managed the entire ETL process, involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers in the sales area.
• Used Informatica Designer to create complex mappings with different transformations, mapplets and reusable transformations to cleanse and move data to a data warehouse.
• Developed and tested data conversion for different distribution plants' data marts from Oracle 8 to Oracle 9i, using an XML target to transport files from one version to the other.
• Created contexts, joins and aliases for resolving loops, checked the integrity of the universes and exported the universes to the repository, making resources available to users via Designer.
• Worked extensively with complex mappings using expressions, alerters, aggregators, filters and procedures to develop and feed data marts.
• Improved the performance of the mappings by moving the filter transformations early into the
transformation pipeline, performing the filtering at Source Qualifier for relational databases
and selecting the table with fewer rows as master table in joiner transformations.
• Used Informatica PowerMart connections for extraction and conversion of data from DB2 to Oracle.
• Worked with different kinds of sources, such as XML files.
• Created both sequential and parallel loading of data from Workflow Manager.
• Developed complex queries using different data providers in the same report.
• Wrote stored procedures wrapping business logic away from the user interface layer.
• Created a shell script to copy files from one folder to another for use with the Informatica PowerCenter 8.1.1 tool.
• Proactively involved in fixing invalid Mappings, testing of Stored Procedures and Functions,
Unit and Integration Testing of Informatica Sessions, Batch Jobs and the Target Data.
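As an illustration of the Type 2 technique mentioned in this list, a minimal sketch that expires the current version of a changed row and inserts the new version; the customer dimension, staging table and connection are hypothetical:

#!/usr/bin/env bash
# Minimal sketch of a Type 2 slowly changing dimension load: close out the
# current row when a tracked attribute changes, then insert the new version.
# Table, column and connection names are illustrative assumptions.
set -euo pipefail

CONN="dw_user/dw_pass@DWDB"   # hypothetical Oracle connect string

sqlplus -s "$CONN" <<'EOF'
whenever sqlerror exit failure
-- Expire current rows whose tracked attribute changed in staging.
update dim_customer d
   set d.eff_end_date = sysdate, d.current_flag = 'N'
 where d.current_flag = 'Y'
   and exists (select 1 from stg_customer s
                where s.cust_id = d.cust_id
                  and s.cust_name <> d.cust_name);

-- Insert a new current version for changed and brand-new customers.
insert into dim_customer (cust_id, cust_name, eff_start_date, eff_end_date, current_flag)
select s.cust_id, s.cust_name, sysdate, null, 'Y'
  from stg_customer s
 where not exists (select 1 from dim_customer d
                    where d.cust_id = s.cust_id
                      and d.current_flag = 'Y'
                      and d.cust_name = s.cust_name);
commit;
exit
EOF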
Tools Used: Informatica 6.0/7.1, Windows 2000, Oracle 8/9i, DB2, VB, XML, IIS 4.0.
Feb ’05 – Oct ’05    Data Warehouse Developer    RadiantInfoTech, Tampa, FL
Key Highlights:
• Actively participated in all phases including requirement analysis, client interaction, design,
coding, testing and documentation.
• Pivotal in creating mappings in Informatica to build the input feed, following the standard-format work file, for processing by the accounting engine.
• Assisted in creating various mappings, mapplets, transformations, lookups, etc., to validate and derive fields based on the input, converting raw input data into the standard accounting data format.
• Developed several stored procedures for recycling and other accounting extraction purposes.
• Optimized Query Performance, Session Performance and Reliability.
• Extensively used Transformations for heterogeneous data joins, complex aggregations and
external procedure calls; developed Mapplets using corresponding Source, Targets and
Transformations.
• Played a major role in session creation, batch management and performance tuning, with initial testing followed by volume testing. Resolved technical issues with software consultants and vendors.
• Instrumental in populating the warehouse and documenting existing mappings as per standards.
• Responsible for Dimensional Data Modelling and populating the business rules using mappings
into the Repository for Meta Data management.
Tools Used: Informatica Power Center 7.1/PowerExchange 7.1, Cognos, Unix, Erwin, PL/SQL,
XML, Oracle 7.0, DB2, Windows 2000/NT.
Feb ’05 – Oct ’05    SAP/ABAP Technical Consultant    Engineering Analysis Services (EASI), Dearborn, MI
Key Highlights:
• Efficiently created an ALV report that gives the percentage of total sales overall and by sales rep.
• Played a major role in creating a marketing report that fetches sales orders, places each in the appropriate sales order category, and calculates the percentage of total sales per category for the given period.
• Created ALE using sales order IDOC for International Shipping CLS Application.
• Modified the materials extract report, adding a filtering condition to extract material details into the extract file.
• Modified the material price report, adding a filtering condition to extract material price and grid details into the extract file.
• Created user exits for VA01 and VA02: credit card details for payment term 0001, payee bank details for payment term ECHK, and blank defaults for QUOT and PPAY while creating the order.
• Modified the SAPscript pick-ticket form, adding the old material number and reformatting the form layout.
• Created a BDC program to delete duplicate customers for the OV52 transaction.
• Modified embroidery sales order processing to check the time and date of the credit-card sales order against the embroidery entry in the processing-to-shipping module application.
• Modified the shipping application program to send international shipping details in XML format, adding printer options in XML for the other application.
Tools Used: SAP, SAP Scripts, Smart Forms, BDC, Call transactions, ALE, IDOC.
Mar ’01 – Aug ’04    Software Engineer    Indus Networks, India
Key Highlights: Created websites and created and updated data access layers.
Tools Used: SQL Server 2000, Data reports, Windows API, Visual Source Safe 6.0, Microsoft Visio
2000, MS Access under Windows NT, Visual Basic, Microsoft Office Suite, Visio and HTML.