
RESUME
Anil Kumar Thyagarajan
#203, 3rd Main, 6th Cross, HAL 3rd Stage, Bangalore 560075
Phone: +91 9845403599
Email: anil.thyagarajan@gmail.com
URL: http://in.linkedin.com/pub/anil-thyagarajan/2/ab2/29a/
Software professional with 15+ years of IT experience, primarily in software
development across domains such as Big Data analytics, structured data, AWS/Azure
cloud computing, payment gateways, search advertising, systems/networking/ISP,
capital-market financial products, and supply-chain products. Articulate and
professional, highly productive in both team and individual projects, with strong
research, time-management, and programming skills, and committed to maintaining
industry knowledge and technical skills through formal and independent training.
Objective:
Seeking a technical lead role in Big Data analytics, cloud computing, and
infrastructure development.
Career Profile:
Senior SDE, Big Data Platform, Microsoft R&D India, Oct 2014 – till date
Technical Specialist, R&D Cloud Computing, Nokia, Aug 2011 – Oct 2014
Principal Engineer, Yahoo India, Oct 2009 – Aug 2011
Sr. Systems Programmer, AOL Inc., Aug 2005 – Oct 2009
Consultant, Capco IT Services, India, Mar 2004 – Aug 2005
Software Engineer, HP India, Jun 2003 – Mar 2004
Product Engineer, i2 Technologies Inc., Bangalore, India, Jan 2000 – May 2003
Professional Strengths:
• Leading teams in technical direction, delivery, and deployment.
• Bridging the gap between architecture and development teams by transforming
designs into products.
• Involved in architectural design, contributing new ideas.
• Effective team player with clear communication.
• Cloud computing on Amazon Web Services for building Big Data analytics platforms.
• Distributed computing – Hadoop, NoSQL.
• Expertise in systems/network tools development and large-scale infrastructure
monitoring.
• Advanced Perl programming skills.
• Intermediate skills in Java & Python.
• Experience with Amazon Web Services such as EC2, S3, and EMR.
• Analytics tools experience – MapReduce, Hive.
• Worked across multiple horizontal domains.
• Diverse functions – development, service engineering, infrastructure tools, and
products.
• Practical thinker and quick learner.
Education:
Bachelor's in Electrical & Electronics Engineering (First Class)
(B M S College of Engineering, Bangalore, 1995–1999)
Technical Skills:
Operating Systems: Unix, Linux & Windows
Databases: PostgreSQL, MySQL, Oracle 10g, Sybase
Languages: Advanced Perl, Python, C#, Java
Cloud: Azure, AWS
Distributed Systems: Virtualization, Hadoop ecosystem, NoSQL
Training & Courses:
Technical:
• Perl – POE, CGI, Catalyst
• C#, .NET
• Java Spring, RESTEasy
• Products on Amazon Web Services
• BGP, OSPF & IS-IS configuration for routers
• Java, ExtJS, HTML, Servlets, JSP
• Web servers – Apache, JBoss, Tomcat
• Hadoop Developer Training (Cloudera)
• Shell & awk programming (System Logic, Bangalore)
• PL/SQL with Oracle 9i (i2 Technologies)
• Data warehousing tools (Erwin, Informatica, BI) – ETL
• C, C++ (LearnSoft, Bangalore)
• Linux Kernel Internals (Linux Training Center, Bangalore)
• JavaScript, CSS (Yahoo internal training)
Management & Functional:
• License to Lead (Nokia Internal)
• PMPC classes – Project Management Practitioners Conference, PMI Bangalore
• Fundamentals of Capital Markets (Capco, Sunnyvale, US)
• PRM (Partner Relationship Management) portal study and analysis (HP, Cupertino,
USA)
• Quality System Awareness (Zandig TQM Solutions)
WORK ASSIGNMENTS
• Oct 2014 – till date (Microsoft)
Working on the Azure HDInsight Big Data platform – a managed cloud service for
Apache Hadoop, Spark, R, HBase, and Storm.
- Enabled Hadoop ResourceManager High Availability (RM-HA) in HDInsight
Hadoop clusters (a configuration sketch follows this list).
- Enabled Azure Data Lake Store for HDInsight Linux clusters.
- Enhanced the Linux Hadoop cluster offerings by introducing new workflows.
- Working on tools to improve monitoring and deployment productivity.
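
As a hedged illustration of what RM-HA enablement involves, the sketch below lists
the core open-source yarn-site.xml properties for ResourceManager HA as a Python
dict and renders them as XML. These are standard Apache Hadoop property names, not
HDInsight's internal deployment tooling; the hostnames, cluster id, and ZooKeeper
quorum are placeholders.

```python
# Core yarn-site.xml properties for ResourceManager HA, as a Python dict.
# Standard Apache Hadoop 2.x property names; hostnames, cluster id, and
# the ZooKeeper quorum below are placeholders.
RM_HA_PROPERTIES = {
    "yarn.resourcemanager.ha.enabled": "true",
    "yarn.resourcemanager.ha.rm-ids": "rm1,rm2",
    "yarn.resourcemanager.hostname.rm1": "headnode0.example.internal",
    "yarn.resourcemanager.hostname.rm2": "headnode1.example.internal",
    "yarn.resourcemanager.cluster-id": "my-cluster",
    # RM state is checkpointed to ZooKeeper so the standby can take over.
    "yarn.resourcemanager.recovery.enabled": "true",
    "yarn.resourcemanager.store.class":
        "org.apache.hadoop.yarn.server.resourcemanager.recovery.ZKRMStateStore",
    "yarn.resourcemanager.zk-address": "zk0:2181,zk1:2181,zk2:2181",
}

def to_yarn_site(props: dict) -> str:
    """Render the dict as yarn-site.xml <property> entries."""
    entries = "\n".join(
        f"  <property><name>{k}</name><value>{v}</value></property>"
        for k, v in props.items())
    return f"<configuration>\n{entries}\n</configuration>"

print(to_yarn_site(RM_HA_PROPERTIES))
```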
• Aug 2011 – Oct 2014 (Nokia)
Worked as a tech lead in Big Data analytics. As part of a cloud platform team, I
developed platform tools and integrated functionality around NoSQL and Hadoop
systems. My most recent role was building a platform for Analytics as a Service on
the Amazon cloud (AWS).
Work Details:
- Part of the architecture team that designed the Hadoop ecosystem on Amazon
(AWS) as an analytics service; also served as lead and mentor for the team.
Responsible for end-to-end execution of the project, from inception to
deployment in AWS.
o Provisioning users/groups and datasets with access control using IAM
and S3 bucket policies.
o Building several smaller services for data ingestion, extraction, and user
authentication.
o Job management abstracting AWS EMR as part of the analytics service
platform.
o A query service for real-time execution of Hive queries, in both
synchronous and asynchronous modes, over EMR (a sketch follows this
list).
o Designing the orchestration of all the components that make up the
Analytics Service platform for Nokia.
- Developed tools for, and executed, the migration of 2 petabytes of data from
Hadoop to Amazon S3. This was challenging in terms of handling millions of
files and validating data integrity after transfer.
- Part of the team that integrated Qubole (http://www.qubole.com – Big Data as
a Service) into our analytics platform on AWS.
- Designed and developed a fully functional visual performance tool for structured
data (NoSQL), with a user interface in ExtJS, a middle layer in Perl POE, and a
load driver in Python. The tool generates distributed load at high request rates
(req/sec) over a time series.
- Experience working with structured NoSQL key-value stores.
- Part of the design and development of a distributed system deployment tool.
- End-to-end design and development of a REST API layer for Hadoop, including
the client authentication strategy, REST-layer authorization against HDFS using
a KDC (Kerberos), and exposing different functionalities as resources. The REST
system was built with the Java Spring/RESTEasy framework.
- Experience working with MapReduce and tools in the Hadoop ecosystem.
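
As a minimal sketch of the job-management and query-service pattern described
above (submitting a Hive query to EMR, serving both asynchronous and synchronous
callers), here is an illustration using today's boto3; the original platform
predates boto3, and the cluster id, S3 paths, and polling interval are
placeholders, not values from the real system.

```python
# Submit a Hive script as an EMR step (asynchronous), then optionally
# poll it to completion (synchronous mode). Uses boto3 for illustration;
# cluster id and S3 paths are placeholders.
import time
import boto3

emr = boto3.client("emr", region_name="us-east-1")

def submit_hive_step(cluster_id: str, script_s3_uri: str) -> str:
    """Submit the Hive script; return the step id immediately (async)."""
    resp = emr.add_job_flow_steps(
        JobFlowId=cluster_id,
        Steps=[{
            "Name": "hive-query",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["hive-script", "--run-hive-script",
                         "--args", "-f", script_s3_uri],
            },
        }])
    return resp["StepIds"][0]

def wait_for_step(cluster_id: str, step_id: str, poll_secs: int = 30) -> str:
    """Synchronous mode: block until the step reaches a terminal state."""
    while True:
        state = emr.describe_step(
            ClusterId=cluster_id, StepId=step_id)["Step"]["Status"]["State"]
        if state in ("COMPLETED", "FAILED", "CANCELLED"):
            return state
        time.sleep(poll_secs)

step_id = submit_hive_step("j-XXXXXXXXXXXXX", "s3://my-bucket/queries/q1.hql")
print(wait_for_step("j-XXXXXXXXXXXXX", step_id))
```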
• Oct 2009 – Aug 2011 (Yahoo India)
Served as Principal Engineer in Service Engineering for Yahoo's search advertising
product, Panama (http://searchmarketing.yahoo.com). I also worked as a lead
engineer for the Yahoo payment gateway (wallet.yahoo.com).
Activities included:
- Writing software in Perl to monitor large-scale, diverse applications; enhancing
the framework for the metric collection system.
- Writing plugins for monitoring large infrastructure through Nagios using Yahoo's
built-in wrappers (a minimal plugin sketch follows this list).
- Deployment operations related to the product development cycle and SCM.
- Managing the complete operations of the search advertising applications,
including technical issues and escalations. Contributing to system architecture
and design.
- Complete ownership of hardware systems for sustenance and capacity
planning.
- Tool development for system infrastructure, such as VIP viewers and network
topology and resource tools.
- New launches and release planning.
- 12x7 support and escalation for critical on-call activities.
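
Yahoo's internal wrappers are not public, but a Nagios plugin is simply a program
that prints one status line and exits with the standard codes. A minimal Python
sketch of that contract follows; the path and thresholds are illustrative.

```python
#!/usr/bin/env python3
# Minimal Nagios-style plugin: print one status line and exit with the
# standard codes (0=OK, 1=WARNING, 2=CRITICAL, 3=UNKNOWN).
# The checked path and thresholds are illustrative placeholders.
import shutil
import sys

WARN, CRIT = 80.0, 90.0  # percent-used thresholds

def main() -> int:
    try:
        usage = shutil.disk_usage("/")
    except OSError as exc:
        print(f"DISK UNKNOWN - {exc}")
        return 3
    pct = 100.0 * usage.used / usage.total
    perf = f"| used={pct:.1f}%;{WARN};{CRIT}"  # Nagios perfdata suffix
    if pct >= CRIT:
        print(f"DISK CRITICAL - {pct:.1f}% used {perf}")
        return 2
    if pct >= WARN:
        print(f"DISK WARNING - {pct:.1f}% used {perf}")
        return 1
    print(f"DISK OK - {pct:.1f}% used {perf}")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```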
• Aug 2005 – Oct 2009 (AOL India)
Details of Work Involved
I worked in the systems & network infrastructure team, building and designing
solutions for a range of infrastructure products that directly impact the ISP
backbone and Internet services. Perl was used extensively to build network
products that monitor and collect metrics from thousands of hosts and network
devices such as switches and routers.
• MEMOSYS – a metrics & monitoring system developed using the POE
networking CPAN module in Perl. This complex event-driven framework
gathers statistics; sockets, pipes, and Tcl are heavily used for inter-process
communication (an event-loop sketch in Python follows this list).
• ATDN (AOL Transit Data Network) – writing frameworks and software for the
network engineering team, involving communication with Cisco/Juniper
routers and switches for configuration reads and writes. The software, written
in Perl, is used by network engineers to manage network devices for various
topology-related and maintenance activities.
• IRR – Internet Routing Registry (whois.aoltw.net), a daemon fronting a
database of worldwide prefixes, commonly known as CIDRs. We managed this
critical data, which gets pushed to all ATDN POP and backbone routers. The
software was written in Perl, using POE for network data collection, with GUI
interfaces for customers.
• KPI – Key Performance Indices, a system to statistically determine network
usage for parameters such as device uplink utilization and POP-to-backbone
traffic.
• Billing application – the AOL ISP business bills its networking peers via this
application. Bandwidth usage information, such as the megabytes transferred
in/out through a router interface, is stored in a large Berkeley DB; a Perl
framework pulls the data and builds reports used by finance analysts.
• NMS – Network Management System, the central point for all metadata about
network devices. It is built around a complex Perl regex engine that parses
network device command output, and it serves as the common point of
network data access for all dependent applications. It exposes a GUI via Ruby
on Rails.
• IVY (Install, View & Yawn) – an automated Linux installation framework for all
varieties of chip architecture, such as i386 and x86_64. Written in Perl (POE
based), it supports remote installation and upgrades of servers across
versions.
• Standalone monitors – AOL provides thousands of services, and they all need
to be monitored. Perl is used extensively to monitor these processes, with the
POE (Perl Object Environment) module, other CPAN modules, and homegrown
modules for router communication, DNS queries, the IRR (routing registry),
database connections, etc.
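
The MEMOSYS-style work above was Perl/POE; as a hedged sketch of the same
event-driven polling pattern, here is the analogous idea in Python asyncio, where
one event loop multiplexes many concurrent probes the way a single POE kernel
multiplexes sessions. Hosts, ports, and the interval are placeholders.

```python
# Event-driven poller: one loop drives all probes concurrently and
# records reachability plus connect latency as a simple metric.
import asyncio
import time

HOSTS = [("router1.example.net", 22), ("switch1.example.net", 22)]

async def probe(host: str, port: int) -> None:
    """Open a TCP connection and report latency, or mark the host down."""
    start = time.monotonic()
    try:
        reader, writer = await asyncio.wait_for(
            asyncio.open_connection(host, port), timeout=5.0)
        writer.close()
        await writer.wait_closed()
        print(f"{host}:{port} up, rtt={time.monotonic() - start:.3f}s")
    except (OSError, asyncio.TimeoutError):
        print(f"{host}:{port} DOWN")

async def main() -> None:
    while True:
        await asyncio.gather(*(probe(h, p) for h, p in HOSTS))
        await asyncio.sleep(60)  # polling interval

if __name__ == "__main__":
    asyncio.run(main())
```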
• Mar 2004 – Aug 2005 (Capco)
Capco is the first services and technology solutions provider exclusively focused on
shaping the future of the financial services industry, uniting thought leadership and
practical application to improve efficiency and profitability for its clients.
Details of Work Involved
• Detailed system study of different products in the capital markets domain
(San Jose, CA, USA).
• Products such as GIM (Global Index Monitor) and SW (Sector Watch),
programmed using Perl and Sybase.
• Enhancement and new application development.
• Designed the enterprise framework using AppConfig, SPOPS, and OpenInteract
to build a complete Perl-based engine for global dividend forecasting.
• Jun 2003 – Mar 2004 (Digital, an HP company)
Digital, as a subsidiary of HP, executes many of HP's IT requirements. One of these
is the portal for the IPG group, called PWeb: a two-way communication vehicle
between HP and its distributors/retailers. Many applications are hosted on this
portal, through which partners transact most of their business with HP.
Details of Work Involved
• Detailed system study and analysis for changing business requirements.
• Enhancement and new application development.
• 24/7 L3/L4 support of the system, with escalations based on SLA.
• System topology – HP-UX (requires Unix admin skills), Oracle 9i, Perl &
mod_perl (for CGI programming), Apache (web server).
• System architecture – an audiencing technique generates separate views for
every user (a toy sketch follows this list); the application interacts with
multiple remote databases.
• Working with the above technologies on complex business requirements.
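
The original PWeb audiencing logic is not described further; as a purely
illustrative sketch of the idea, the same portal page is assembled from only the
modules a given partner role is entitled to see. Roles and module names below are
hypothetical.

```python
# Toy "audiencing": filter portal modules by the viewer's partner role.
# Roles and modules are illustrative, not from the original PWeb system.
MODULES = {
    "orders":  {"distributor", "retailer"},
    "pricing": {"distributor"},
    "claims":  {"retailer"},
    "reports": {"distributor", "retailer"},
}

def audience_view(role: str) -> list:
    """Return the portal modules visible to this audience (role)."""
    return sorted(m for m, roles in MODULES.items() if role in roles)

print(audience_view("distributor"))  # ['orders', 'pricing', 'reports']
print(audience_view("retailer"))     # ['claims', 'orders', 'reports']
```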
• Jan 2000 – May 2003 (i2 Technologies)
i2 is a leading provider of value chain management solutions, which help companies
plan and execute the activities involved in managing supply and demand. These
solutions span the entire scope of value chain interactions, including supplier
relationship management, supply chain management, and demand chain management. As
a Product Engineer in Content Operations, the lifeblood of other i2 solutions, I
was given responsibility for vendor management. Most of the database development
took place at vendor locations (outsourced work), and I was actively involved in
vendor development, process building, and development of the required tools. i2 is
a pioneer in the MRO (Maintenance & Repair Operations) vertical, which demands
considerable expertise and technical skill in the core electrical & electronics
component segment. A complex parent-class hierarchy was developed to accommodate
part information for technical products across thousands of manufacturers and
suppliers. The parametric data created is integrated or migrated into multiple
customer databases, which are then linked to many of the i2 products to reap the
benefits of an overall supply chain management solution.
Details of Work Involved
• Support and maintenance of i2 Enterprise Solution products (SRM), and
deployment of product enhancements for licensed clients.
• Shell programming on Unix (SunOS)/Linux.
• sed & awk programming for processing large amounts of data before release to
customer databases, legal validation of content, file handling, and report
generation.
• Perl scripting for data insertion into the Oracle database with the required
modules, report generation, and process handling. Updating data with current and
complete content, with cross-verification across databases. Pattern
searching/mapping for critical technical/commerce data across MRO manufacturers'
technical catalogs (a validate-then-insert sketch follows this list).
• Designing the database schema/model best suited to the activities involved in
implementing the business logic.
• Writing Oracle procedures for data hopping between databases over a VPN.
Report generation and updates with PL/SQL and J2EE. Implementing Oracle triggers
to construct module-based actions.
• Vendor management and development, as most content development in the MRO
vertical was outsourced. Building processes and guidelines to respond proactively
to customers' ever-changing requirements. Actively involved in recruiting
personnel at vendor premises.
• Technical consultant for MRO electrical content between i2 and clients/vendors,
which included detailed understanding and analysis of components across the
electrical/electronics vertical for building a mission-critical parametric
content database.
• Integration & migration of data using knowledge-based tools such as i2 Explore
and Discovery.
• Developed and deployed the complete online catalog tracking system for the i2
Input Management team. Business components included JSP, JavaBeans, and an
Oracle database on the Tomcat web server.
• Quality control, with documentation and corrective & preventive actions for tool
runs and data correctness.
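
The original validation-and-insertion work was Perl against Oracle; as a hedged
sketch of the same validate-then-insert pattern, here is a self-contained Python
DB-API example using the stdlib sqlite3 module. The table, columns, regex, and
sample rows are illustrative, not from the original system.

```python
# Validate rows with a regex, then bulk-insert only the clean ones via a
# parameterized executemany (the same DB-API pattern applies to Oracle).
import re
import sqlite3

PART_NO = re.compile(r"^[A-Z0-9][A-Z0-9\-/.]{1,30}$")  # plausible part-number shape

def clean_rows(raw_rows):
    """Yield normalized rows whose part number matches the expected pattern."""
    for part_no, manufacturer, description in raw_rows:
        part_no = part_no.strip().upper()
        if PART_NO.match(part_no):
            yield (part_no, manufacturer.strip(), description.strip())

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE parts (part_no TEXT, manufacturer TEXT, description TEXT)")
raw = [
    (" 1n4148 ", "Fairchild", "Switching diode"),
    ("???", "Unknown", "Bad part number, rejected"),
]
conn.executemany("INSERT INTO parts VALUES (?, ?, ?)", clean_rows(raw))
print(conn.execute("SELECT * FROM parts").fetchall())
```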
References
Available on Request.
Last updated: 14 April 2016
