Prabhakar KR
Smart_kpr@yahoo.com
9972374519
Summary • 8+ years of experience in SQL and PL/SQL.
• Extensive experience with development tools such as SQL*Plus, TOAD, SQL Developer, and DB Designer, and in building software applications using SQL and PL/SQL.
• Good knowledge of key Oracle performance-related features such as execution plans, hints, bitmap indexes, B-tree indexes, and DBMS_STATS.
• More than 3 years of onsite experience in the US, Sydney, and Singapore.
• Extensive experience in areas such as table partitioning.
• Good experience with bulk collections for maintaining large table transactions.
• Worked extensively on loading data from flat files and CSV files using utilities such as SQL*Loader, UTL_FILE, and external tables.
• Export/Import and Data Pump utilities for data backup.
• Worked extensively on large databases.
• Performance tuning using tools such as TKPROF, EXPLAIN PLAN, and execution plans.
• Experience with version-control tools such as Visual SourceSafe (VSS) and SVN.
• Worked on Windows and Unix platforms.
• Excellent communication skills; creative problem solver and team player.
• Very good experience with DBMS_PARALLEL_EXECUTE.
• Very good experience with DBMS_AQ, DBMS_CRYPTO, DBMS_FLASHBACK, DBMS_PROFILER, and DBMS_SCHEDULER.
• Worked on Oracle Forms 4.5 & Reports 2.5.
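The bulk-collection approach listed above can be sketched as follows; this is a minimal illustration, and the table and column names (big_orders, order_id, order_status) are hypothetical:

```sql
-- BULK COLLECT with a LIMIT clause bounds PGA memory; FORALL issues the
-- DML for each batch in a single context switch.
DECLARE
  CURSOR c_orders IS
    SELECT order_id FROM big_orders WHERE order_status = 'PENDING';
  TYPE t_ids IS TABLE OF big_orders.order_id%TYPE;
  l_ids t_ids;
BEGIN
  OPEN c_orders;
  LOOP
    FETCH c_orders BULK COLLECT INTO l_ids LIMIT 1000;  -- batch of 1000 rows
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT
      UPDATE big_orders
         SET order_status = 'PROCESSED'
       WHERE order_id = l_ids(i);
    COMMIT;                                             -- commit per batch
  END LOOP;
  CLOSE c_orders;
END;
/
```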
Experience System Analyst, Mahindra Satyam/Tech Mahindra [08/06/2007 – 23/05/2014].
Senior Developer, Sonata Software [25/07/2014 – to date].
Education Master of Business Administration (M.B.A)(Information Technology) from Mahatma
Gandhi University.
Certifications
Oracle PL/SQL Developer Certified Associate (9i)
Oracle Advanced SQL Developer Certified Professional (11g), scored 98%
Languages SQL, PL/SQL
Databases Oracle 9i, Oracle 10g, Oracle 11g
Tools & Utilities SQL*Loader, UTL_FILE, TKPROF, External Tables
Operating Systems Windows XP/7, MS-DOS, UNIX
Expertise/Interest Areas SQL tuning, PL/SQL tuning, Import, Export, Data Pump
Experience Summary
Project 1:
Title Search Optimization
Client DELL International
Period 4 August 2014 to date
Role Individual Contributor
Team size 4
Technologies
SQL, PL/SQL , Oracle 11g
Description
Dell customers can log in to the SLM portal with their user name and password and view their entitlements covering the past 7 years of data. After a successful login, the customer can enter any hardware- or software-related keyword to retrieve entitlement details. The entitlement details must be displayed within 3 seconds, which was the biggest challenge. We initially tried materialized views without success, then evaluated the result cache, table partitioning, Oracle Text, and denormalized tables (DNT), and finally succeeded with Dell Fluid Cache for SAN. The major work was performance tuning: we rewrote the code from top to bottom (global temporary tables, ANSI joins, TKPROF, AWR reports).
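The Oracle Text keyword search evaluated here can be sketched as follows; the table, column, and index names (entitlements, description, entitlement_txt_idx) are hypothetical:

```sql
-- A CONTEXT index enables CONTAINS() full-text keyword queries.
CREATE INDEX entitlement_txt_idx
  ON entitlements (description)
  INDEXTYPE IS CTXSYS.CONTEXT;

-- Keyword search ranked by relevance; the label 1 ties CONTAINS to SCORE.
SELECT entitlement_id, description
  FROM entitlements
 WHERE CONTAINS(description, 'laptop OR battery', 1) > 0
 ORDER BY SCORE(1) DESC;
```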
Responsibilities
• Interact with the onsite team daily over phone and video calls.
• Understand the requirements thoroughly and prepare action plans.
• Prepare coding standards for the development team.
• Lead the team professionally and in a friendly manner.
• Administer database objects such as tables, indexes, sequences, and views.
• Create materialized views based on requirements.
• Oracle Text for fast searching.
• Query optimization.
• DBMS_PROFILER for PL/SQL tuning.
• Exception handling.
• Write complex queries in an optimized manner.
• Table partitions.
Project 2:
Title Data Deduplication – Securities
Client Sony India Software Centre Pvt.Ltd
Period 27-02-2013 to 10-05-2014
Role Individual Contributor
Team size 3
Technologies
SQL, PL/SQL , Oracle 11g
Description
Customer de-duplication is a project for Sony Latin America. The goal of this project is to remove direct access to the customer data and bring in a synchronized way of accessing data using web services. There are 2 phases in this project.
1) Make sure that all applications access the customer database through web services, so that there is no discrepancy in the data inserted by different applications. Then clean the customer records according to the rules defined in the requirements.
2) De-duplicate the customer database by removing the duplicate records identified by the rules defined in the requirements. Then develop an internal tool to recover the removed duplicate records from the DB if they are required by the functional analysts.
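The DBMS_PARALLEL_EXECUTE usage mentioned below can be sketched as follows; the task name, table, and update statement are hypothetical:

```sql
-- Split a large table into ROWID chunks and run the update across them
-- in parallel, avoiding one huge transaction.
BEGIN
  DBMS_PARALLEL_EXECUTE.CREATE_TASK('dedup_customers');

  DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID(
    task_name   => 'dedup_customers',
    table_owner => USER,
    table_name  => 'CUSTOMERS',
    by_row      => TRUE,
    chunk_size  => 10000);

  -- :start_id and :end_id are bound to each chunk's ROWID range.
  DBMS_PARALLEL_EXECUTE.RUN_TASK(
    task_name      => 'dedup_customers',
    sql_stmt       => 'UPDATE customers SET status = ''CLEAN''
                        WHERE ROWID BETWEEN :start_id AND :end_id',
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 4);

  DBMS_PARALLEL_EXECUTE.DROP_TASK('dedup_customers');
END;
/
```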
Responsibilities
• Daily conference calls to understand the client requirements.
• Distribute tasks to team members based on the requirements.
• Create database objects based on the criteria: tables, views, sequences, indexes, stored functions, stored procedures, triggers, packages.
• DBMS_PARALLEL_EXECUTE package usage.
• DBMS_CRYPTO package for password encryption and decryption.
• TKPROF.
• SQL*Loader, UTL_FILE.
• Exception handling.
• Bulk collections & DBMS_FLASHBACK queries.
Project 3:
Title Kaname2 Data Transformation
Client Bridgestone Belgium, Europe
Period 17-01-2012 to 23-01-2013
Role Module Leader
Team size 6
Technologies SQL, PL/SQL, SQL Developer, Oracle 10g, Windows XP
Description
The Kaname2 project is a data transformation project in which data is transformed from various database schemas into a single database called OMP+, the software that will be used for planning and scheduling in the Kaname2 project. This transformation is done for four Bridgestone EU PSR plants: Poznan, Burgos, Bethune, and Bari.
Responsibilities
• Administer database objects, including tables, synonyms, indexes, views, sequences, functions, stored procedures, triggers, and packages.
• Create database objects as per requirements.
• Handle data using utilities such as SQL*Loader.
• Handle data using UTL_FILE.
• Exception handling.
• Bulk collections for manipulating large volumes of data and improving performance.
• Performance tuning (query level).
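The flat-file loading listed above can be sketched with an external table; the directory object, file, table, and column names are hypothetical:

```sql
-- External table over a CSV file in an existing DATA_DIR directory object;
-- the file is then queryable like an ordinary table.
CREATE TABLE plant_stage_ext (
  plant_code  VARCHAR2(10),
  part_no     VARCHAR2(20),
  qty         NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('plant_data.csv')
)
REJECT LIMIT UNLIMITED;

-- Transform and load into the target schema in a single set-based step.
INSERT INTO plant_stage
SELECT plant_code, part_no, qty FROM plant_stage_ext;
```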
Project 4:
Title SCP (Sony Corp. Project – Japan - Data Integration)
Client i2 Technologies (JDA)
Period 27-02-2010 to 10-12-2011
Role System Analyst
Team size 5
Technologies Oracle, SQL, PL/SQL, PL/SQL Developer
Description
This project covers integration between individual JDA modules, the specification of data across these modules, and the details of extracting this data from external systems (e.g. extracting data from a staging database to Demand Planning), together with the logic needed to transform data from source to destination format.
The Project also includes the details of data workflows (e.g. an overall system diagram
detailing the individual modules and their interaction) and details of timing (e.g. frequency
of data exchange and transformation) and performance (e.g. expected duration of each
exchange or transformation). The data conversion and mapping logic, procedures for
validating data, trapping and handling errors are also dealt with in this project.
Responsibilities
• Create database objects as per requirements.
• Carry out performance tuning at the query level.
• Exception handling.
• Bulk collections for manipulating large volumes of data and improving performance.
• Job scheduling using DBMS_SCHEDULER.
• SQL*Loader/External Tables, UTL_FILE.
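The DBMS_SCHEDULER job scheduling listed above can be sketched as follows; the job name and the stored procedure it invokes (stage_pkg.extract_demand) are hypothetical:

```sql
-- Create an enabled job that runs a stored procedure every night at 02:00.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_EXTRACT',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'stage_pkg.extract_demand',   -- hypothetical procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',       -- calendar syntax
    enabled         => TRUE,
    comments        => 'Nightly demand-planning extract');
END;
/
```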
Project 5:
Title LOMM (Loss of Mitigation Management)
Client GENWORTH FINANCIAL (AUS - Sydney)
Period 09-07-2009 to 12-02-2010
Role Individual Contributor
Team size 3
Technologies Oracle 10g, SQL, PL/SQL, TOAD, Guidewire
Description
Genworth Australia currently administers all policies across two applications. The primary policy administration system, eSolve, houses the vast majority of Genworth policies and is Genworth's strategic solution.
A small subset of Genworth policies is administered in the Legacy solution; current estimates put this number at approximately 230,000, although it includes policies that are considered declined, spoilt, or withdrawn. To support the defined strategic plan of having a single policy administration system, these policies will need to be migrated from Legacy to eSolve. This work is not in scope for the LOMM project but does impact the overall LOMM delivery.
The current Genworth asset management system, LEGACY, is labour-intensive: there are no data interfaces, and manual workarounds are required to fulfil the business needs. Its limitations include longer processing times, an increased risk of error or omission, lower staff and customer satisfaction, and exposure to possible contractual and regulatory compliance issues. To address these problems, Genworth is looking for a built or packaged solution to improve the operational efficiency of the claims process. The current database design could not support, or be supported by, any new solution, so a migration of the current data is inevitable.
Responsibilities
• Create DB objects based on the requirements: tables, views, sequences, synonyms, indexes, procedures, functions, packages, triggers.
• Query optimization.
• Exception handling.
• Job creation using DBMS_JOB.
• Oracle utilities: SQL*Loader, UTL_FILE, TKPROF.
• Complex query writing.
• Analytical functions.
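A common use of the analytical functions listed above is removing duplicate rows while keeping the most recent one; the table and column names (policies, policy_no, updated_on) are hypothetical:

```sql
-- ROW_NUMBER() numbers rows per policy, newest first; every row after
-- the first in its partition is a duplicate and gets deleted by ROWID.
DELETE FROM policies
 WHERE ROWID IN (
   SELECT rid
     FROM (SELECT ROWID AS rid,
                  ROW_NUMBER() OVER (PARTITION BY policy_no
                                     ORDER BY updated_on DESC) AS rn
             FROM policies)
    WHERE rn > 1);
```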
Project 6:
Title WMS (WAREHOUSE MANAGEMENT SYSTEM)
Client TARGET INDIA CORP
Period 10-03-2008 to 24-05-2009
Role System Analyst
Team size 6
Technologies Oracle 9i, SQL, PL/SQL, WebFOCUS, HTML, JavaScript
Description
This project is all about database integration. The integration database/schema will be created at HQ, and all MA FCs will send transactional data (PIX: Perpetual Inventory Transactions) to the integration database, where it gets transformed and summarized and is then published to various host systems such as PRISM, POET, TRB, and EDW. The Target.com host systems eDW (e-commerce data warehouse) and RedBox (guest order / pick ticket host) require WMS to send inventory transaction data in summarized form. The integration schema will store all the transactional and summarized data and will own the process of summarizing and transforming transactional data.
The Target.com item host system is PRISM. Product item setup and maintenance will be performed on the PRISM host system, and WMS will send updated items back to it. The Purchase Order Execution Tool (POET) is used to create or maintain purchase-order information in the purchase order application of the Network Strategy program. WMS will send ASN verification as inventory adjustments for the inbound purchase orders and transfer orders received at the FCs in the Target.com network.
The Target.com order management host system is OMS, and the middleware application is TRB. An updated Amazon OMS (Order Management System) allocates guest orders across multiple FC locations and sends guest orders to the WMS system via a Target.com middleware application called Target RedBox (TRB). The Target.com e-commerce data warehouse system (eDW) is capable of tracking inventory and guest orders across multiple FC locations; WMS sends inventory adjustment/sync transactions and invoice updates to eDW.
Responsibilities
• Create DB objects based on the requirements: tables, views, sequences, synonyms, indexes, procedures, functions, packages, triggers.
• Query optimization.
• Exception handling.
• Job creation using DBMS_JOB.
• Oracle utilities: SQL*Loader, UTL_FILE, TKPROF.
• Complex query writing.
• Analytical functions.
Project 7:
Title SMB-MM (SMALL MEDIUM BUSINESS)
Client CISCO
Period 08-06-2007 to 21-02-2008
Role Team Member
Team size 6
Technologies Oracle 9i, PL/SQL, TOAD, Kintana
Description
This project is intended to create a new party attribute, the Commercial Business Council Segment (CBCS), to classify parties; to modify the attribute manager to compute CBCS attributes for all parties in CR; and to perform a one-time data migration to EDWTD (enterprise data warehouse, Teradata). The Commercial Marketing team uses sub-segmentations such as Small and Medium Business (SMB) and Mid-Market (MM), which help in understanding how commercial investment is performing. AM (Attribute Manager) will compute CBCS values for all existing organization parties in CR. The CBCS attribute will depend on the GU sales channel code, GU employee counts, GU Cisco booking counts, and partner certification.
Responsibilities
• Administer database objects, including tables, synonyms, indexes, views, sequences, functions, stored procedures, triggers, and packages.
• Create database objects as per requirements.
• Handle data using utilities such as SQL*Loader.
• Handle data using UTL_FILE.
• Exception handling.
• Bulk collections for manipulating large volumes of data and improving performance.
• Performance tuning (query level).
Personal Particulars
Name Prabhakar KR
Passport Number F5713562; holds a valid USA B-1 visa through 2016
Passport Valid through 31/12/2015
Mobile Phone 9972374519
Email Id Smart_kpr@yahoo.com
Other Interests Training people in DB related areas