Sunday 27 September 2015

Sample Resume of Hadoop Developer with 3 Years' Experience


Overview

• 3 years of experience in the software development life cycle: design, development, and support of systems application architecture. 
• More than two years of experience in Hadoop development/administration, built on prior experience in Java application development. 
• Good knowledge of the Hadoop ecosystem, HDFS, Big Data, and RDBMS. 
• Experienced in working with Big Data and the Hadoop Distributed File System (HDFS). 
• Hands-on experience with ecosystem components such as Hive, Pig, Sqoop, MapReduce, Flume, and Oozie. 
• Strong knowledge of Hadoop, Hive, and Hive's analytical functions. 
• Capturing data from existing databases that provide SQL interfaces using Sqoop (see the Sqoop sketch after this list). 
• Efficient in building Hive, Pig, and MapReduce scripts. 
• Implemented proofs of concept on the Hadoop stack and various big data analytic tools, including migration from different databases (e.g., Teradata, Oracle, MySQL) to Hadoop. 
• Successfully loaded files into Hive and HDFS from MongoDB, Cassandra, and HBase. 
• Loaded datasets into Hive for ETL operations. 
• Good knowledge of Hadoop cluster architecture and cluster monitoring. 
• Experience in using DbVisualizer, ZooKeeper, and Cloudera Manager. 
• Hands-on experience with IDE tools such as Eclipse and Visual Studio. 
• Experience in database design using stored procedures, functions, and triggers, and strong experience in writing complex queries for DB2 and SQL Server. 
• Experience with Business Objects and SSRS; created Universes and developed many Crystal Reports and Web Intelligence (WebI) reports. 
• Excellent problem-solving, analytical, communication, and interpersonal skills.
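
For illustration, a minimal Sqoop import sketch along the lines of the Sqoop bullet above; the hostname, credentials, table, and paths below are placeholders, not details from this resume:

    # Pull a table from an RDBMS into HDFS over its JDBC/SQL interface.
    sqoop import \
      --connect jdbc:mysql://dbhost.example.com:3306/sales \
      --username etl_user \
      --password-file /user/etl_user/.sqoop.pwd \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4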

Work Experience

Hadoop Developer

XYZ Company
February 2013 to Present
Install raw Hadoop and NoSQL applications and develop programs for sorting and analyzing data. 

Responsibilities: 

• Replaced Hive's default Derby metadata store with a MySQL-backed metastore (a configuration sketch follows this list). 
• Executed queries using Hive and developed MapReduce jobs to analyze data. 
• Developed Pig Latin scripts to extract data from web server output files and load it into HDFS (see the Pig sketch after the Environment line below). 
• Developed Pig UDFs to preprocess the data for analysis. 
• Developed Hive queries for the analysts. 
• Worked in an Apache Hadoop environment distributed by Hortonworks. 
• Involved in loading data from Linux and UNIX file systems into HDFS. 
• Supported setting up the QA environment and updating configurations for implementing Pig scripts. 
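
As a rough sketch of the Derby-to-MySQL metastore swap mentioned above (the property names are the standard Hive ones; the host, database, and credentials are placeholders):

    # Point the Hive metastore at MySQL instead of the embedded Derby store.
    cat > /etc/hive/conf/hive-site.xml <<'EOF'
    <configuration>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://metastore-host:3306/metastore?createDatabaseIfNotExist=true</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive_password</value>
      </property>
    </configuration>
    EOF

The MySQL JDBC driver jar also has to be on Hive's classpath for this to take effect.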

Environment: Core Java, Apache Hadoop (Hortonworks), HDFS, Pig, Hive, Cassandra, Shell Scripting, MySQL, Linux, UNIX
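
A minimal sketch of the kind of Pig Latin pass over web server logs described in the responsibilities above; the field layout, filter condition, and paths are assumptions for illustration only:

    # Write a small Pig script and run it against logs already copied to HDFS.
    cat > clean_logs.pig <<'EOF'
    -- load each raw access-log line as a single chararray field
    raw = LOAD '/data/raw/access_logs' USING TextLoader() AS (line:chararray);
    -- keep only lines mentioning a 404 status, as a trivial preprocessing step
    errors = FILTER raw BY line MATCHES '.* 404 .*';
    STORE errors INTO '/data/clean/error_lines' USING PigStorage();
    EOF
    pig -f clean_logs.pig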

Hadoop Developer

ABC Company, Bangalore
March 2012 to January 2013
Imported and exported data into HDFS, analyzed Big Data using the Hadoop environment, and developed UDFs using Hive, Pig Latin, and Java. 

Responsibilities: 

• Worked on analyzing the Hadoop cluster and various big data analytic tools, including Pig, the HBase NoSQL database, and Sqoop. 
• Imported and exported data between HDFS and Hive using Sqoop. 
• Extracted files from MongoDB through Sqoop, placed them in HDFS, and processed them. 
• Experience with NoSQL databases. 
• Wrote Hive UDFs to extract data from staging tables. 
• Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs (see the Hive sketch after this list). 
• Familiar with job scheduling using the Fair Scheduler so that CPU time is distributed evenly among jobs. 
• Involved in regular Hadoop cluster maintenance such as patching security holes and updating system packages. 
• Managed Hadoop log files. 
• Analyzed web log data using HiveQL. 
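
As a sketch of the Hive work above (the table name, columns, and locations are placeholders): creating an external table over data in HDFS and running an aggregate query, which Hive compiles into a MapReduce job behind the scenes:

    hive -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
      ip STRING, ts STRING, url STRING, status INT
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/clean/web_logs';

    -- this aggregation runs internally as a MapReduce job
    SELECT status, COUNT(*) AS hits FROM web_logs GROUP BY status;
    "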
Environment: Java 6, Eclipse, Hadoop, Hive, HBase, MongoDB, Linux, MapReduce, HDFS, Shell Scripting, MySQL

Education:
Master of Engineering
JNTU, 2011

Technical Skills: 

Programming Languages: Java, C++, C, SQL, Python 
Java Technologies: JDBC, JSP, Servlets 
RDBMS/NoSQL: SQL Server, DB2, HBase, Cassandra, MongoDB 
Scripting: Shell Scripting 
IDEs: Eclipse, NetBeans 
Operating Systems: Linux, UNIX, Windows 98/2000/XP 
Hadoop Ecosystem: MapReduce, Sqoop, Hive, Pig, HBase, Cassandra, HDFS, ZooKeeper


Tuesday 15 September 2015

Hadoop Admin Resume with 2 Years' Experience


Overview:

• Around 2 years of experience in Hadoop administration and Big Data technologies. 
• Experience with the complete software design lifecycle, including design, development, testing, and implementation of moderately to highly complex systems. 
• Hands-on experience installing, configuring, supporting, and managing Hadoop clusters using Apache, Cloudera (CDH3, CDH4), and YARN distributions. 
• Hadoop cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting. 
• Designed Big Data solutions for traditional enterprise businesses. 
• Used network monitoring daemons like Ganglia and service monitoring tools like Nagios. 
• Added and removed nodes on existing Hadoop clusters. 
• Backup configuration and recovery from NameNode failures. 
• Commissioning and decommissioning nodes on a running Hadoop cluster (see the sketch after this list). 
• Installation of various Hadoop ecosystem components and Hadoop daemons. 
• Installation and configuration of Sqoop and Flume. 
• Involved in benchmarking Hadoop/HBase cluster file systems with various batch jobs and workloads. 
• Experience monitoring and troubleshooting issues with Linux memory, CPU, OS, storage, and network. 
• Good experience designing, configuring, and managing backup and disaster recovery for Hadoop data. 
• Hands-on experience analyzing log files for Hadoop and ecosystem services and finding root causes. 
• Experience in commissioning, decommissioning, balancing, and managing nodes, and in tuning servers for optimal cluster performance. 
• As an admin, involved in cluster maintenance, troubleshooting, and monitoring, and followed proper backup and recovery strategies. 
• Experience in HDFS data storage and support for running MapReduce jobs. 
• Installed and configured Hadoop ecosystem components such as Sqoop, Pig, and Hive. 
• Knowledge of HBase and ZooKeeper. 
• Experience importing and exporting data using Sqoop between HDFS and relational database systems/mainframes. 
• Optimized performance of HBase/Hive/Pig jobs. 
• Hands-on experience with the Nagios and Ganglia tools. 
• Scheduled Hadoop/Hive/Sqoop/HBase jobs using Oozie. 
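
As a sketch of the decommissioning bullet above, assuming hdfs-site.xml already points dfs.hosts.exclude at the exclude file (the file path and hostname are placeholders):

    # Add the node to the exclude list, then ask the NameNode to re-read it.
    echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes
    # Watch replication drain off the node until it reports Decommissioned.
    hdfs dfsadmin -report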

Work Experience

Hadoop Admin
Company Name, Mumbai (2 years 0 months)
Responsibilities: 
• Involved in the end-to-end process of Hadoop cluster setup: installation, configuration, and monitoring of the Hadoop cluster. 
• Responsible for cluster maintenance, commissioning and decommissioning data nodes, cluster monitoring, troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files. 
• Monitoring systems and services, architecture design and implementation of Hadoop deployments, configuration management, backup, and disaster recovery systems and procedures. 
• Configured various property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml based on job requirements (see the sketch after this list). 
• Imported and exported data into HDFS using Sqoop. 
• Experienced in defining job flows with Oozie. 
• Loaded log data directly into HDFS using Flume (see the Flume sketch after the Environment line below). 
• Experienced in managing and reviewing Hadoop log files. 
• Installation of various Hadoop ecosystem components and Hadoop daemons. 
• Installation and configuration of Sqoop, Flume, and HBase. 
• Managed and reviewed Hadoop log files as part of administration for troubleshooting purposes; communicated and escalated issues appropriately. 
• As an admin, followed standard backup policies to ensure high availability of the cluster. 
• Involved in analyzing system failures, identifying root causes, and recommending courses of action; documented system processes and procedures for future reference. 
• Worked with the systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters. 
• Monitored multiple Hadoop cluster environments using Ganglia and Nagios; monitored workload, job performance, and capacity planning using Cloudera Manager. 
• Installed and configured Flume, Hive, Pig, Sqoop, and Oozie on the Hadoop cluster. 
• Involved in installing and configuring Kerberos for authentication of users and Hadoop daemons. 
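
Minimal sketches of the three property files named in the list above; the hostname, port, and replication factor are placeholder values, not settings from this cluster:

    cat > core-site.xml <<'EOF'
    <configuration>
      <property>
        <name>fs.defaultFS</name>           <!-- URI of the NameNode -->
        <value>hdfs://namenode.example.com:8020</value>
      </property>
    </configuration>
    EOF

    cat > hdfs-site.xml <<'EOF'
    <configuration>
      <property>
        <name>dfs.replication</name>        <!-- default block replication -->
        <value>3</value>
      </property>
    </configuration>
    EOF

    cat > mapred-site.xml <<'EOF'
    <configuration>
      <property>
        <name>mapreduce.framework.name</name>  <!-- run MR jobs on YARN -->
        <value>yarn</value>
      </property>
    </configuration>
    EOF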

Environment: Hadoop, HDFS, Hive, Sqoop, Flume, ZooKeeper, HBase, Oracle 9i/10g/11g RAC with Solaris/Red Hat, Exadata Machines X2/X3, Big Data Cloudera CDH Apache Hadoop, Toad, SQL*Plus, Oracle Enterprise Manager (OEM), RMAN, Shell Scripting, GoldenGate, Red Hat/SUSE Linux, EM Cloud Control
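
And a rough sketch of loading log data into HDFS with Flume, as in the responsibilities above; the agent name, tailed file, and HDFS path are placeholders:

    # Define a one-agent pipeline: tail a web log into an HDFS sink.
    cat > flume-weblog.conf <<'EOF'
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    a1.sources.r1.type = exec
    a1.sources.r1.command = tail -F /var/log/httpd/access_log
    a1.sources.r1.channels = c1

    a1.channels.c1.type = memory

    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = /data/flume/weblogs
    a1.sinks.k1.channel = c1
    EOF
    flume-ng agent --conf ./ --conf-file flume-weblog.conf --name a1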

Education:

Bachelor of Science, Computer Technology
JNTU, 2011

Technical Skills: 

• Hadoop Framework: HDFS, MapReduce, Pig, Hive, HBase, Sqoop, ZooKeeper, Oozie, Flume, Hue 
• Microsoft: MS Office, MS Project, MS Visio, MS Visual Studio […] 
• Databases: Oracle 8i/9i/10g, SQL Server, PL/SQL Developer 
• Operating Systems: Linux, Mac OS, Windows 98/2000/NT/XP 
• Scripting: Shell Scripting, HTML 
• Programming: C, C++, Core Java, PL/SQL


Certifications

Mention your certificates here.