SUJIT KUMAR JHA 732-579-7618 Lead PL/SQL Developer
er.sujit@gmail.com
Professional Summary:
Oracle Certified Professional (OCP) and Tuning expert with 13 years of experience on medium- to
large-scale global projects in the Finance, Telecom and Insurance domains
Core work experience in Oracle SQL and PL/SQL programming, Unix Shell Scripting, Data
Modeling, Analysis, Scheduling, Monitoring, Batch Processing, and Release & Deployment
Management
Proficient in analyzing business requirements and translating them into functional or technical
requirements and a solution blueprint
Proficient in using the Toad Data Modeler database design tool to visually create, maintain and
document ER diagrams for new or existing database systems (Data Warehouse / Data Mart)
Identify project requirements and develop/deliver high-quality solutions to clients in response
to varying business needs in an Agile environment
Proficient in the entire SDLC, from Business Requirements, FRs, SRs, POC and ADD through
development and implementation, with emphasis on programming in Oracle SQL and PL/SQL for
creating schemas, tables and other DB objects such as Packages, Procedures, Functions, Triggers and Cursors
Expertise in using Oracle 9i/10g/11g/12c Enterprise Edition
Extensive experience in Oracle PL/SQL and Unix Shell Scripting
Proficient in related tools like Toad, PL/SQL Developer, SQL Developer, Toad Data Modeler
Proficient in query optimization and troubleshooting
Experienced in basic BO server administration on Windows/Unix platforms, e.g. user management,
access management and release management
Implemented ETL code and process flows with various PL/SQL blocks and shell scripts to trigger
ETL automatically, scheduled through Crontab and Autosys
Extensive experience working closely with clients at various US client locations; liaison
between the RDC offshore team and the onsite team
Ability to work independently as well as with teams of varying backgrounds on complex
issues; strong verbal and written communication skills
Measured various SLA (Service Level Agreement) metrics for the Environment and Configuration
team and drove corrective action plans during LAMs (Look Ahead Meetings)
Extensive experience with analysis, design, development, customization, build, deployment and
implementation of software applications using Agile methodologies
Effective in meeting with business units (BUs), stakeholders and subject matter experts to elicit
requirements and translate those needs into concise functional and non-functional requirements
Maintain a sense of pride in a job well done; detail-oriented, professional, and self-directed
Thirst for knowledge and for applying new learning to current company project requirements
Problem-solving skills and the ability to think out of the box
Coordinate between development and testing teams to ensure quality gates and correct
methods are followed by the project team
Data Analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)
applications
Sound understanding of the Data modeling concepts like Star-Schema Modeling, Snowflake
Schema Modeling, Fact and Dimension tables
Expert on Data Modeling - Conceptual, Logical, and Physical Data Models
Experience in Extracting, Transforming and Loading (ETL) data from spreadsheets, database
tables and other sources using Informatica
Education:
Master of Computer Applications (MCA)
Master of Science (M.Sc.)
Certification:
Oracle Certified Professional (OCP)
Oracle 10g Tuning Expert
Technical Skills:
Databases: Oracle 12c/11g/10g/9i/8i, MS Access, MDM
Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, C
Development Tools: TOAD, PL/SQL Developer, SQL Developer, SQL*Plus, PuTTY, WinSCP, AFT
Data Modeling Tools: Toad Data Modeler, Software Engineering & Architecture (SE&A), Rational Software Architect (RSA) 9, RPM 6, Rational Team Concert (RTC) 7.1, MS Visio
Oracle Utilities: SQL*LOADER, UTL_FILE, Data Pump (Export/Import)
Operating Systems: Windows 2003/XP/7, HP UNIX, Linux, Sun Solaris
Version Control & Migration Software: Visual Source Safe, SVN, Jira, HP Service Desk, CM Synergy, Change Synergy, SM7, iTrack
Production Job Scheduler: Crontab, Autosys, Net Term/Tivoli Maestro/Console Manager, COMPOSER 7.0
Professional Experiences:
Client: Nationwide, PA
Project: AIME (Affinity Insight Marketing Engine)
Position: Lead PL/SQL Developer Feb’16 – Nov’16
Nationwide offers a full range of insurance and financial services across the US, including car,
motorcycle, boat, homeowners, pet, farm, life and commercial insurance, as well as administrative
services, annuities, mortgages, mutual funds, pensions, long-term savings plans and specialty health
services.
Harte Hanks helps Nationwide support its entire Nationwide Affinity Solutions ("NAS") process. The NAS
process includes populating and creating the Affinity Insight Marketing Engine and matching Affinity
Member records to Policy records. This database includes the Affinity Member Repository, which holds all
the affinity member data, quote and policy data, and campaign selection and history data. Harte Hanks has
implemented a database solution called the Affinity Insight Marketing Engine (AIME) for Nationwide. AIME
allows for new campaign setup and tracking of key marketing activities, and enables comprehensive
reporting and analysis.
Roles & Responsibilities:
Understood end-to-end data flow and data integration rules; standardized and cleansed data
Designed and developed conceptual, logical and physical data models for data marts
POC for the Trillium job stream used to cleanse, parse, geocode and generate window keys
Redesigned the pre-merge process, which handles exact matches, and the merge process, which
handles fuzzy duplicates, by creating Fact and Dimension tables for the AIME data mart
Code changes to fix the split-household issue and introduce a mapping table to keep track of changes
Thorough analysis of many complex packages, which were modified to improve performance
Design solutions to complex problem with an emphasis on efficiency, quality, and simplicity
Ability to lead a diverse team with different skill level and background challenged with complex
task
Abstracts the complexity of a system into a manageable model that describes the essence of a
system by exposing important details and significant constraints
Created Oracle objects like tables with appropriate constraints, views, indexes, sequences etc.
Petabyte-scale data mart tuning through AIME for campaign management / merge management
Additional DBA responsibilities like debugging performance bottlenecks and tuning the SQL using
the latest analytic functions
Developed various anonymous PL/SQL blocks and named PL/SQL blocks like procedures,
functions, triggers, cursors and packages on an as-needed basis, and tuned the underlying SQL
Used Toad Data Modeler to create logical, physical and universal Entity Relationship Diagram
(ERD) models
Reverse-engineered existing databases into Toad Data Modeler models, displayed as Entity
Relationship Diagrams (ERD)
Co-ordinate with DBA team for physical implementation
Designed upgrades/modifications of existing schemas as per business requirements
Used SE&A for architecting new schema from scratch and reverse engineering to connect to DB
Responsible for establishing data modeling best practices
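The Fact and Dimension design described above can be sketched as a minimal star-schema pair; all table and column names here are illustrative assumptions, not the actual AIME model:

```sql
-- Hypothetical sketch of a star-schema pair for a campaign data mart.
-- Every name below is an illustrative assumption.
CREATE TABLE dim_member (
    member_key     NUMBER       PRIMARY KEY,  -- surrogate key
    member_id      VARCHAR2(20) NOT NULL,     -- natural/business key
    first_name     VARCHAR2(50),
    last_name      VARCHAR2(50),
    household_key  NUMBER                     -- supports household grouping
);

CREATE TABLE fact_campaign_response (
    campaign_key   NUMBER NOT NULL,
    member_key     NUMBER NOT NULL REFERENCES dim_member (member_key),
    response_date  DATE   NOT NULL,
    response_cnt   NUMBER DEFAULT 1,
    CONSTRAINT pk_fact_campaign
        PRIMARY KEY (campaign_key, member_key, response_date)
);
```

The dimension carries descriptive attributes keyed by a surrogate; the fact table holds the measures and foreign keys, which is what enables the campaign reporting and analysis mentioned above.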
Client: Bank of America Merrill Lynch, NJ
Project: CDB G/L migration
Position: Lead PL/SQL Developer Jul’15 – Dec’15
The Bank of America Merrill Lynch Central Database (CDB) G/L (General Ledger) migration project is an
initiative to implement single software across all financial systems. This project is intended to migrate the
entire Merrill Lynch codebase from the legacy system to the existing e-ledger Central DB. The initiative
follows the acquisition of Merrill Lynch by Bank of America.
Central Database currently uses the historical Merrill Lynch six-digit expense codes. The Bank moved to
a new 15-digit expense code structure in January 2015. Central Database needs to be updated to include
the new expense codes, company code and related code description fields so that Finance chargeback
processes are not impacted. In addition to the expense code updates, several financial hierarchy data
fields that correspond to the expense codes need to be added to the database to support cost
analysis/research requests. Obsolete data fields need to be removed.
Roles & Responsibilities:
Responsible for data analysis and reconciling gaps for missing data elements between CDB and RDR
Modified DB objects like procedures and tables to fit the Legal Entity Allocation process
Designed new tables with appropriate constraints on an Oracle 11g database
Maintained and created DB objects like tables, procedures, functions, triggers, cursors and
packages on an as-needed basis using PL/SQL
Modified CDB Expense Code table and all areas where expense code data is displayed or entered
(e.g., Statement Templates screens, Chargeback screens, Reports), as well as any internal CDB
batch processes/updates
Modified/migrated entire Merrill Lynch code from old legacy system to new e-ledger Central DB
Analyzed source data file layouts to develop data conversion requirements
Ensured all required data elements are received after conversion
Developed DB reports as per client requirements and helped integrate them with the UI
Regularly worked on various data analysis requests and produced reports
Automated batch-loading reports using tools like Autosys to create .jil files
Prepared the initial Data Conversion File request and schedule for receiving data files using Autosys
Improved the process through tuning, without impacting existing applications
Partnered with the data conversion/platform configuration team to identify specific scenarios in order to
ensure quality
Participated in internal testing and review of data/configuration
Defined data edits to be run to validate the accuracy of received data
Coordinated with the UI team to understand the PL/SQL requirements and develop objects
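The six-digit to 15-digit expense-code conversion described above can be sketched as a mapping table plus a backfill update; the table and column names are illustrative assumptions, not the actual CDB objects:

```sql
-- Hypothetical sketch of the legacy-to-new expense-code conversion.
-- Names are illustrative assumptions only.
CREATE TABLE expense_code_map (
    old_code      CHAR(6)       PRIMARY KEY,  -- legacy Merrill Lynch code
    new_code      CHAR(15)      NOT NULL,     -- post-Jan-2015 structure
    company_code  VARCHAR2(10),
    code_desc     VARCHAR2(100)
);

-- Backfill the new structure wherever a legacy code is still stored.
UPDATE cdb_expense e
   SET e.expense_code_new = (SELECT m.new_code
                               FROM expense_code_map m
                              WHERE m.old_code = e.expense_code)
 WHERE EXISTS (SELECT 1
                 FROM expense_code_map m
                WHERE m.old_code = e.expense_code);
```

Keeping the mapping in its own table also gives the chargeback screens and reports a single place to look up descriptions and company codes during the transition.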
Client: AT & T, NJ
Project: BNCS –IT (LPP)
Position: Lead PL/SQL Developer Jun’13 - Jul’15
AT&T is an American multinational telecommunications corporation headquartered in Whitacre Tower,
downtown Dallas, Texas. AT&T is the largest provider of both mobile telephony and fixed telephony in
the United States, and also provides broadband subscription television services. AT&T is the third-largest
company in Texas (the largest non-oil company, behind only ExxonMobil and ConocoPhillips, and also
the largest Dallas company).
AT&T is recognized as one of the leading worldwide providers of IP-based communications services to
businesses. AT&T also has the nation's fastest and most reliable 4G LTE network, the largest
international coverage of any U.S. wireless carrier (offering the most phones that work in the most
countries), and the nation's largest Wi-Fi network.
Roles & Responsibilities:
Understood business requirements and prepared FRs, SRs, POC and ADD (HLD & LLD) through development
and implementation, with emphasis on programming in Oracle SQL, PL/SQL and Unix Shell Scripting
Created schemas, tables and other DB objects like Packages, Procedures, Functions, Triggers and
Cursors, based on suitability and requirements
Understood end-to-end data flow and data integration rules to implement logic
Daily activities included creating PL/SQL anonymous blocks and objects, integrating them with shell
scripts, and generating reports using SQL queries
Built packages to implement business logic; the entire transformation and load happens through
PL/SQL packages as inbound feeds are received
Coordinated with the Java team to understand the PL/SQL requirements
Wrote various packages for each DML activity (Insert/Update/Delete/Search etc.) to be
performed from the front end
Wrote code to maintain data integration and dependency rules in the above cases, e.g. if a user
deletes information from a main table, all corresponding dependent-table entries should be deleted
programmatically
Understood the various search criteria available on the front end and produced highly tuned
SQL queries and performance improvements
Created highly tuned, complex SQL queries for various reporting requirements that run
daily, weekly, monthly, quarterly, semi-annually and annually
Coordinated between development and testing teams to ensure quality gates and correct
methods were followed
Reviewing Low level design, code, unit test documents produced by team members
Responsible for identifying the need for different data setup required for the testing phase and set
up the same by working closely with DBA team
In addition to these, created various batch jobs to automate report generation and transfer
flat files (reports) to various other locations through FTP/SFTP/SCP etc.
Scheduled the above-mentioned jobs in the Unix crontab scheduler and monitored them periodically
Responsible for continuous Build and continuous install in all production and non-production
environments like IT/ST/JST/STM/PM/DEV/UAT/LSP
Built and installed on all environments and fixed any issues arising on a daily and weekly basis as per
management direction
Built and deployed web applications in a clustered and load-balanced N-tier infrastructure
Managed the release schedule and related processes/approvals to ensure project team adherence
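The dependency rule described above (deleting a row from a main table removes its dependent-table entries programmatically) can be sketched as a PL/SQL trigger; all object names are illustrative assumptions:

```sql
-- Hypothetical sketch of programmatic cascade cleanup.
-- main_entity, dependent_detail and dependent_history are assumed names.
CREATE OR REPLACE TRIGGER trg_main_cascade
BEFORE DELETE ON main_entity
FOR EACH ROW
BEGIN
    -- remove dependent rows first so the parent delete succeeds
    DELETE FROM dependent_detail  WHERE main_id = :OLD.main_id;
    DELETE FROM dependent_history WHERE main_id = :OLD.main_id;
END;
/
```

Where a plain foreign-key relationship exists, Oracle's declarative `ON DELETE CASCADE` option on the constraint is the simpler alternative; a trigger like this is usually needed only when the cleanup logic goes beyond what a constraint can express.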
Client: Bank of America, NC
Project: FSR (Financial Systems Roadmap)
Position: Business Analyst/ Modeler Jun’11 – May’13
The Bank of America Financial Systems Roadmap (FSR) project is an initiative to implement single software
across all financial systems, managing small to mid-sized projects and supporting annual technology
releases. The role drives collaboration and preparation of required technical infrastructure deliverables,
performs all roles in the Master Control Room during software testing and implementation to ensure
successful plan execution and to facilitate communication for troubleshooting and executive status
reporting, and maintains the FSR Release Management Work Breakdown Structure, providing graphical and
statistical summaries for dashboarding purposes.
Roles & Responsibilities:
Worked on entire SDLC from Business Requirement, FRs, SRs, POC, ADD, Development to
implementation with emphasis on programming in Oracle SQL, PL/SQL for schema, table and
other DB objects creations like Package, Procedure, Functions, Trigger, Cursor etc.
Worked on Hyperion DRM (Data Relationship Management), where reporting structures are
maintained, analyzed and validated
Hands on experience in the following aspects of Hyperion DRM:
Version creation / maintenance
Hierarchy creation / maintenance
Nodes creation / maintenance
Modeling the users
Defining properties
Defining formulae for properties
Creating validations and assigning to the respective hierarchies
Updating the property values
Working with exports and imports
Perform impact analysis, configuration design and development of business flow change requests,
addition of new business processes based on client requirements
Propose implementation strategies for new business requirements
Understand changes to business functionality and developing test plans
Write change control plans to implement application changes by creating procedure, functions and
packages
Coordinate with infrastructure support teams during Staging and Production deployment
Maintain the compliant state of the applications by following approved Standard Operation
Procedures (SOP)
Analyze reported defects and fix system bugs using Shell scripting and TOAD
Provide training and support to team members in understanding the critical business processes
Review and ensure that deliverables are prepared to satisfy the project requirements and ensure
that deliverables satisfy the project cost and schedule
Maintain the Oracle Database and MDM application server's objects
Extended support and guidance to client for publishing submission using TFS and nexus
Good understanding of Batch loading processes and tools like Autosys to create .jil file
Improve the process without impacting existing applications by tuning
Troubleshoots application problems and provides temporary as well as permanent solutions to
mitigate risks and take corrective action
Work on onsite - offshore model and coordinate among team members
Client: Sprint – Nextel, USA
Project: Commission
Position: Database Lead Nov’08 - Apr’11
Sprint Nextel’s commissioning project was intended to calculate commissions and pay based on the channel
used for specific lines of business: Long Distance Telephony, Wireless (PCS) Telephony and Marketing
Identities. This project was broadly subdivided into the three major applications below:
G7L – Indirect channel commission calculation
G8L – Corporate Business channel commission calculation
COS – Company Operated Store (Retail channel commission calculation by Truecomp)
Roles & Responsibilities:
Understand end-to-end Data flow and data integration rules
As an Onsite Coordinator, worked on many FRs and passed them to the RDC team as per business
requirements
Designed upgrades/modifications of existing schemas as per business requirements
Captured requirements from the business and provided solutions
Provided high-level designs and got them signed off with the client
Coordinated requirement walkthroughs for the offshore development and testing teams
Reviewed low-level design, code and unit test documents produced by team members
Automated the COS (Commissions) payroll process and cloning process so effectively that the client
provided accolades for this effort as an individual contributor
Used Unix shell scripts (Solaris) to automate the above manual processes and scheduled them in Crontab
Collaborated on and led the deliveries within the onshore-offshore model
Involved in monitoring performance in production and accordingly proposing
upgrades/modifications to the existing design, like adding/deleting indexes on tables, partitioning
tables, and replacing traditional extract logic with materialized views
Regularly worked on various data analysis requests and produced reports
Provided various weekly and monthly reports, creating PL/SQL blocks for each report to execute
periodically
Proficient in using TOAD, Oracle SQL Developer, MS Visio
As a Lead PL/SQL developer, involved in all phases of the SDLC to ensure requirements were met
Understand the End-to-end Architecture of existing Business and each application and data
correlation across systems
Identifying relevant Databases and Entities and providing the end-to-end solution
Created Fact and Dimension tables to meet the Data modeling (Dimensional & Relational)
concepts like Star-Schema Modeling, Snowflake Schema Modeling
Used Visio, Oracle SQL Developer and TOAD Data Modeler
Monitored the production environment and maintained application health 24x7
Staged the code before production deployment
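Replacing traditional extract logic with a materialized view, as mentioned above, can be sketched as follows; the view, table and column names are illustrative assumptions, not the actual commissioning objects:

```sql
-- Hypothetical sketch: a nightly-refreshed summary replacing a hand-rolled
-- extract job. All names are illustrative assumptions.
CREATE MATERIALIZED VIEW mv_commission_summary
BUILD IMMEDIATE
REFRESH COMPLETE
START WITH SYSDATE
NEXT TRUNC(SYSDATE) + 1 + 2/24   -- refresh nightly around 02:00
AS
SELECT channel_code,
       TRUNC(sale_date, 'MM') AS sale_month,
       SUM(commission_amt)    AS total_commission
FROM   commission_detail
GROUP  BY channel_code, TRUNC(sale_date, 'MM');
```

The database then owns the refresh schedule, so the old extract script and its Crontab entry can be retired; reporting queries read the precomputed summary instead of rescanning the detail table.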
Client: Sony Music & Entertainment Inc., USA
Project: GRS US & Canada
Position: Production Support Specialists Feb’07 - Oct’08
Sony Music Entertainment (SME) is a US-based wholly owned subsidiary and the global recorded music arm
of Sony Corporation of America (SCA). The GRS project is intended to make the performing rights
organization most effective at collecting and distributing foreign royalties, especially for the US and
Canada. After royalties are collected, the organization pays the copyright holder (generally the publisher),
and the publisher pays the writer’s percentage of the money earned on a monthly basis and reconciles it.
Roles & Responsibilities:
Scheduled and monitored GRS US and Canada batch schedules with job runs daily/weekly/
monthly/quarterly/semi-annually/recurring, using Tivoli Maestro and Console Manager
Investigated and fixed problems related to batch/service failures
Found the reason for each abend, performed analysis and took further steps toward resolution,
including database-related issues
Modified database objects using TOAD, PL/SQL Developer and Oracle SQL Developer
Converted Oracle reports into SQL*Report
Produced corresponding low-level designs, which included the design of entities, attributes, constraints,
primary keys, foreign keys, indexes etc.
Developed various anonymous PL/SQL blocks, procedures and functions to be called in the process
flow
Understood Global Royalty business logic for the US and Canada and provided end-to-end
business rule implementation
Created UNIX scripts to automatically schedule jobs in the Maestro job scheduler
Set dependencies on jobs using conman in IBM Tivoli Workload Scheduler production processes
Developed various PL/SQL blocks to accommodate changes in existing business logic
Client: Nielsen, USA
Project: eMETA
Position: Database Developer Sep’06 - Feb’07
This project is for Nielsen (earlier VNU-eMedia, USA), to accelerate the growth of paid revenue streams
generated from the audience customer base and replace the current access control technology with a new
third-party software solution, eMeta. The purpose of the project is to ensure that VNU reporting and
analysis needs are supported with data from the new fulfillment system, eMeta. The feasibility and quality
of analysis and reporting depend entirely on the data captured, generated, stored and managed in relation
to this system. This information then needs to be transferred in a way that can be accommodated in the
current GoldRush database.
Roles & Responsibilities:
Involved in the Creation and modification of Procedure, Function, Trigger, Cursor as per business
need
Created and maintained connections to all the Source Databases
Interacted with system specialists to analyze the requirement changes and implemented the
solution
Basic User Administration like User creation, Group Creation, granting proper access etc.
Continuous drive for improved customer experience
Captured requirements from the client and communicated them to the team
Used Toad and Visio to accomplish above assignments
Client: Super Info Soft Pvt. Ltd., India
Project: Integrated Material Management
Position: PL/SQL Developer Jan’04 - Sep’06
Integrated Material Management is a general-purpose application found in all organizations for
planning, controlling and coordinating goods at all stages of a firm’s operations. Super Info Soft Pvt. Ltd. is a
medium-sized organization with several departments that depend on the storage of various materials
necessary for their daily operations. The system is meant for managing different kinds of materials across
departments; for example, the computer department requires computer parts and stationery in store for
procurement and fulfillment, so the scope of material management is very large, with the number of items
amounting to thousands. It involves keeping track of the procurement, storage and consumption of
materials. The objectives of this project are to:
1. Provide easy access to data
2. Support high-level data manipulation facilities
3. Reduce data access time and redundancy
4. Provide security
5. Develop a user-friendly system
6. Update software regularly as per need
Roles & Responsibilities:
Developed data-entry forms for master tables
Module-level testing, writing PL/SQL code for triggers and stored procedures, and report
generation at various stages were also part of my responsibility
Created tables with appropriate constraints/validations specified
Involved in designing the different forms
Generated reports at various stages with different styles for best presentation
Established standard SQL queries for various Data analysis rules across DBs
Creating PL/SQL blocks to update data across systems
Interacted with system specialists to analyze the root cause of mismatched data and derived
solutions for bringing the data back in sync
Provided daily/weekly reports to client to highlight stock gaps etc.
Captured requirements from the client and documented them for the team
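A stock-gap report of the kind described above can be sketched as a single query; the table and column names are illustrative assumptions, not the actual Integrated Material Management schema:

```sql
-- Hypothetical sketch of a daily stock-gap report.
-- item_master and stock_balance are assumed names.
SELECT i.item_code,
       i.item_name,
       s.qty_on_hand,
       i.reorder_level,
       i.reorder_level - s.qty_on_hand AS shortfall
FROM   item_master   i
JOIN   stock_balance s ON s.item_code = i.item_code
WHERE  s.qty_on_hand < i.reorder_level
ORDER  BY shortfall DESC;
```

A query like this, wrapped in a PL/SQL block or report job, is enough to drive the daily/weekly stock-gap reports delivered to the client.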
Trainings:
Oracle Certified Professional (OCP)
Oracle Database 10g: SQL Tuning Ed 2 PRV
GBS Cloud-Enabled Business Transformation
GBS Data Security & Privacy Awareness, Engagement specific DS&P
Cloud based Application Development with Bluemix
Mobile-Enabled Business Transformation
Microsoft Visio Training
ETL Tool (Informatica 5.0) Training
RSA (Rational Software Architect) Training
RTC (Rational Team Concert) 3.0
AT&T Tools and Frameworks (AFT) Software Manager (SWM)
Oracle Database Administration 11g Release 2 Workshop
Oracle 10g Performance Tuning
Telelogic CM Synergy & Change Synergy
HP Quality Control
SM7 Program Training – Configuration Management, Change Management, Incident
Management, Paging