Bank of America, NY/NJ Aug 2015 – Present
Sr. Java Developer
Project: IB Custom Process onboarding for Clients
Data Transformation Engine (DTE) is the core application that handles data coming from clients and, after transformation, moves it on to the downstream systems per their requirements. Client data arrives as plain text in formats such as flat file, fixed length, CSV, Excel, or XML. DTE plays the key role in moving it downstream in the format and encoding each system expects per the business logic, because the downstream includes mainframe systems as well as data-keeping vendors with their own restrictions and defined layouts. The IB layouts are the largest, with 135+ fields.
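Note: the snippet below is only a minimal, hypothetical illustration of the kind of fixed-width layout mapping DTE performs; the class name, field names, and three-field layout are invented for the example (real IB layouts define 135+ fields).

import java.util.LinkedHashMap;
import java.util.Map;

public class LayoutSketch {

    // Downstream layouts are fixed-width: each field has a name and a padded width.
    // The three fields and widths below are invented; real IB layouts define 135+ fields.
    private static final Map<String, Integer> DOWNSTREAM_LAYOUT = new LinkedHashMap<>();
    static {
        DOWNSTREAM_LAYOUT.put("acctNo", 10);
        DOWNSTREAM_LAYOUT.put("amount", 12);
        DOWNSTREAM_LAYOUT.put("currency", 3);
    }

    // Turns one parsed client record (for example, a CSV line already split into fields)
    // into a single fixed-width downstream record, padding each value to its field width.
    public static String toDownstream(Map<String, String> clientRecord) {
        StringBuilder out = new StringBuilder();
        for (Map.Entry<String, Integer> field : DOWNSTREAM_LAYOUT.entrySet()) {
            String value = clientRecord.getOrDefault(field.getKey(), "");
            out.append(String.format("%-" + field.getValue() + "s", value));
        }
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> csvRecord = new LinkedHashMap<>();
        csvRecord.put("acctNo", "12345");
        csvRecord.put("amount", "100.25");
        csvRecord.put("currency", "USD");
        System.out.println("[" + toDownstream(csvRecord) + "]");
    }
}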
• Worked as the main developer to set up the generic layout for new clients.
• Coordinated with the business team on the continuously changing layouts in which clients send their data.
• Worked as the main developer to add multi-threading to the existing process so it can handle multiple files arriving from the client (an illustrative sketch follows this list).
• Worked extensively with Core Java, concurrency, collections, and data structures.
• Worked extensively on the UNIX scripting setup for layout changes and for preprocessing such as sort and split before a file is processed.
• Worked with UNIX scripting to match and verify the header and trailer of files coming from the client.
• Set up the payment-info layouts for clients.
• Created and deployed template layout setups using the TOAD Oracle client and SQL*Plus.
• Worked on bat files and Ant to generate the JAR for application delivery.
• Worked with other developers to run regression tests against all files existing in production.
• Worked on scripts to make the regression tests faster and reduce manual intervention.
• Developed and modified several ksh scripts to add new client custom setups.
• Set up Autosys jobs to update the crontab setup for new clients.
• Coordinated with the project manager and business team to report status and keep layout changes aligned with client requirements while still fitting our layouts.
• Coordinated with the offshore team on day-to-day work.
• Coordinated with other module teams to move code through the delivery pipeline alongside other ongoing releases.
• Helped define training material and workshops for other developers.
• Conducted multiple test-run sessions to demonstrate processing to the client and obtain sign-off.
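Note: a minimal sketch of the multi-threaded file handling referenced above, assuming a fixed thread pool driving the existing per-file transform; the FileTransformer hook, pool size, and inbound directory are illustrative assumptions, not the production design.

import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class ParallelFileProcessor {

    // Hypothetical hook for the existing single-file transformation logic.
    interface FileTransformer {
        void transform(Path clientFile) throws Exception;
    }

    public static void processAll(Path inboundDir, FileTransformer transformer) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4); // pool size is illustrative
        List<Future<?>> results = new ArrayList<>();
        try (DirectoryStream<Path> files = Files.newDirectoryStream(inboundDir, "*.txt")) {
            for (Path file : files) {
                // each client file is transformed on its own worker thread
                results.add(pool.submit(() -> {
                    transformer.transform(file);
                    return null;
                }));
            }
        }
        for (Future<?> result : results) {
            result.get(); // surface any per-file failure to the caller
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical inbound directory; the real path came from the client setup scripts.
        processAll(Paths.get("/data/inbound"), file -> System.out.println("processing " + file));
    }
}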
Environment: Java, Multi-threading, Collections, Ant, JUnit, UNIX, shell scripting, Eclipse, Visual Studio TFS, TOAD Oracle client, SQL*Plus, Oracle 10g/11g.
Bank of America, NY July 2015 – Sep 2015
Sr. Java Developer
Project: IB Platform update to the Spring Batch framework
Data Transformation Engine (DTE) is the core application that handles data coming from clients and, after transformation, moves it on to the downstream systems per their requirements. Client data arrives as plain text in formats such as flat file, fixed length, CSV, Excel, or XML. DTE plays the key role in moving it downstream in the format and encoding each system expects per the business logic, because the downstream includes mainframe systems as well as data-keeping vendors with their own restrictions and defined layouts. The IB layouts are the largest, with 135+ fields.
• Worked extensively on defining the application batch context setup.
• Worked on batch-context job, step, and tasklet setups, and broke the context into separate files for readers, writers, and processors (an illustrative configuration sketch follows this list).
• Set up the database context for the jobs to run.
• Collected the requirements for the JARs needed to run the IB setup on Spring Batch and worked with other team leads to get those JARs approved.
• Worked on transforming the plain file-reading logic (FileReader) to the Spring Batch way of reading with the standard readers.
• Worked on transforming the large JDBC SQL calls to Spring Batch database readers.
• Worked on step execution to make the reading and processing of files recursive.
• Applied log4j to the new Spring Batch application to gather the application logs.
• Set up the log4j configuration in Java, since log file names always need to be built from the transmission ID (an illustrative sketch follows the Environment line below).
• Worked on the custom reader, using jumpTo() to apply header/footer validation of the client file by reading the complete file first.
• Worked on the custom writer to accommodate the logic of writing records to plan-based files even though we were using the ItemWriter.
• Set up the Spring Batch project and ran it in the UNIX environment to process files.
• Created and provided the reference/training document for the Spring Batch project.
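Note: a minimal Spring Batch configuration sketch of the reader/processor/writer split referenced above; the bean names, field names, column ranges, chunk size, and file paths are assumptions for illustration, not the actual IB context.

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.mapping.PassThroughFieldSetMapper;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.batch.item.file.transform.FixedLengthTokenizer;
import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
import org.springframework.batch.item.file.transform.Range;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
@EnableBatchProcessing
public class IbBatchConfig {

    // Reads a fixed-length client file; field names and column ranges are illustrative only.
    @Bean
    public FlatFileItemReader<FieldSet> clientFileReader() {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setNames("acctNo", "amount", "currency");
        tokenizer.setColumns(new Range(1, 10), new Range(11, 22), new Range(23, 25));

        DefaultLineMapper<FieldSet> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(new PassThroughFieldSetMapper());

        FlatFileItemReader<FieldSet> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource("/data/in/client.txt")); // hypothetical path
        reader.setLineMapper(lineMapper);
        return reader;
    }

    // Maps each client record to the downstream layout; the real mapping covers 135+ fields.
    @Bean
    public ItemProcessor<FieldSet, String> layoutProcessor() {
        return fieldSet -> String.format("%-10s%-12s%-3s",
                fieldSet.readString("acctNo"),
                fieldSet.readString("amount"),
                fieldSet.readString("currency"));
    }

    @Bean
    public FlatFileItemWriter<String> downstreamWriter() {
        FlatFileItemWriter<String> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource("/data/out/downstream.txt")); // hypothetical path
        writer.setLineAggregator(new PassThroughLineAggregator<>());
        return writer;
    }

    @Bean
    public Step transformStep(StepBuilderFactory steps) {
        return steps.get("transformStep")
                .<FieldSet, String>chunk(500)
                .reader(clientFileReader())
                .processor(layoutProcessor())
                .writer(downstreamWriter())
                .build();
    }

    @Bean
    public Job ibJob(JobBuilderFactory jobs, Step transformStep) {
        return jobs.get("ibJob").start(transformStep).build();
    }
}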
Environment: Java, Spring Batch framework, Ant, UNIX, shell scripting, Eclipse, Visual Studio TFS, TOAD Oracle client, SQL*Plus, Oracle 10g/11g.
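Note: a minimal log4j 1.x sketch of building the per-transmission log file name in Java, as referenced above; the logger name, pattern, and log directory are assumptions for illustration.

import java.io.IOException;

import org.apache.log4j.FileAppender;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class TransmissionLogging {

    // Builds a logger whose file name carries the transmission ID, so each run's
    // output lands in its own log file. Directory and pattern are illustrative.
    public static Logger forTransmission(String transmissionId) throws IOException {
        Logger logger = Logger.getLogger("dte." + transmissionId);
        PatternLayout layout = new PatternLayout("%d{ISO8601} %-5p %c - %m%n");
        FileAppender appender =
                new FileAppender(layout, "/var/log/dte/dte_" + transmissionId + ".log", true);
        logger.addAppender(appender);
        return logger;
    }

    public static void main(String[] args) throws IOException {
        Logger log = forTransmission("TX12345"); // hypothetical transmission ID
        log.info("processing started");
    }
}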