Sujitha H

Developer

Occupation:

Developer

Education Level:

Master

Will Relocate:

YES

Description

PROFESSIONAL SUMMARY

* Around 9 years of experience in Information Technology, with a major concentration on Big Data tools and technologies, relational and NoSQL databases, the Java programming language, and J2EE technologies, following highly recommended software practices.
* 4+ years of experience in software development, building enterprise and web-based applications using Java and J2EE technologies.
* 4 years of experience as a Big Data Engineer with a good understanding of the Hadoop framework and Big Data tools and technologies for implementing data analytics.
* Hadoop developer: excellent hands-on experience with Hadoop tools such as HDFS, Hive, Pig, Apache Spark, Apache Sqoop, Flume, Oozie, Apache Kafka, Apache Storm, YARN, Impala, ZooKeeper, and Hue. Experience in analyzing data using HiveQL, Pig Latin, and MapReduce programs.
* Experienced in ingesting data into HDFS from relational databases such as MySQL, Oracle, DB2, and Teradata using Sqoop.
* Experienced in importing real-time streaming logs and aggregating the data into HDFS using Kafka and Flume.
* Excellent knowledge of creating real-time data streaming solutions using Apache Storm and Spark Streaming, and of building Spark applications in Scala (see the Kafka/Spark Streaming sketch after this list).
* Well versed in Hadoop distributions including Apache Hadoop, Cloudera, and Hortonworks, with knowledge of the MapR distribution.
* Experienced in creating managed and external tables in Hive and loading data into Hive from HDFS.
* Implemented bucketing and partitioning concepts in Hive and wrote UDFs to provide user-defined functionality (see the Hive bucketing sketch after this list).
* Implemented Pig scripts for analyzing large data sets in HDFS by performing various transformations.
* Experience in analyzing data using HiveQL, Pig Latin, and HBase.
* Capable of processing large sets of structured, semi-structured, and unstructured data and supporting system application architecture.
* Experience working with NoSQL databases such as HBase, Cassandra, and MongoDB.
* Experience in Python, Scala, and shell scripting.
* Experience in creating Oozie jobs to manage processing workflows.
* Experience using Amazon cloud components: S3, EC2, Elastic Beanstalk, and DynamoDB.
* Experience with file formats including XML, JSON, CSV, text, sequence files, Avro, ORC, and Parquet, using compression techniques such as Snappy and LZO.
* Experience testing MapReduce programs with MRUnit, JUnit, and EasyMock.
* Knowledge of machine learning and predictive analysis.
* Worked with the Tableau data visualization tool and integrated data using Talend.
* Worked with relational databases such as PostgreSQL, MySQL, Oracle 10g, and DB2.
* Created Java applications that connect to databases using JDBC, JSP, Spring, and Hibernate.
* Knowledge of designing and developing web-based applications using HTML, DHTML, CSS, JavaScript, jQuery, JSP, and Servlets.
* Experience with build tools such as Ant, Maven, Gradle, and SBT.
* Knowledge of creating dashboards/reports with reporting tools such as Tableau and QlikView.
* Development experience with the Eclipse, NetBeans, and IntelliJ IDEs and the SVN, Git, and CVS repositories.
* Good experience with different software methodologies, including waterfall and agile approaches.
* Knowledge of writing YARN applications.
* Passionate about working with the most cutting-edge Big Data technologies.
* Ability to adapt to evolving technology, with a strong sense of responsibility and accomplishment.
* Willing to update my knowledge and learn new skills according to business requirements.
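Kafka/Spark Streaming sketch: a minimal illustration of the kind of real-time ingestion described above, consuming a Kafka topic with Spark Streaming in Scala and writing per-batch counts to HDFS. The broker address, topic name, consumer group, log format, and output path are hypothetical placeholders, not details from any specific project.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Hypothetical sketch: read log lines from a Kafka topic in one-minute
// batches, count them per log level, and save the results to HDFS.
object LogStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("log-stream-sketch")
    val ssc = new StreamingContext(conf, Seconds(60))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "kafka-broker:9092",   // placeholder broker
      "key.deserializer"  -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"          -> "log-stream-sketch",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent,
      Subscribe[String, String](Seq("app-logs"), kafkaParams))

    // Assume each log line starts with its level ("ERROR", "WARN", ...);
    // count occurrences per level in every batch and persist to HDFS.
    stream.map(_.value)
      .map(line => (line.split(" ").headOption.getOrElse("UNKNOWN"), 1L))
      .reduceByKey(_ + _)
      .saveAsTextFiles("hdfs:///data/logs/level-counts")

    ssc.start()
    ssc.awaitTermination()
  }
}
```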
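Hive bucketing sketch: a minimal illustration of the partitioning, bucketing, and UDF work mentioned above, written against a Hive-enabled Spark session in Scala. The table name, columns, sample rows, and the city-normalizing UDF are illustrative assumptions only.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical sketch: write a partitioned, bucketed table into the Hive
// metastore and query it through a registered UDF.
object HivePartitionBucketSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-partition-bucket-sketch")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    // Toy rows standing in for data already landed in HDFS.
    val sales = Seq(
      (1001L, 25.0, " hyderabad ", "2016-01-01"),
      (1002L, 40.0, "Chennai",     "2016-01-01")
    ).toDF("customer_id", "amount", "city", "load_date")

    // Partition by load date and bucket by customer id when saving
    // the table (stored as ORC).
    sales.write
      .partitionBy("load_date")
      .bucketBy(16, "customer_id")
      .sortBy("customer_id")
      .format("orc")
      .mode("overwrite")
      .saveAsTable("sales_events")

    // A simple user-defined function usable from HiveQL-style queries.
    spark.udf.register("normalize_city",
      (c: String) => if (c == null) null else c.trim.toUpperCase)

    spark.sql(
      """SELECT normalize_city(city) AS city, SUM(amount) AS total
        |FROM sales_events
        |WHERE load_date = '2016-01-01'
        |GROUP BY normalize_city(city)""".stripMargin).show()

    spark.stop()
  }
}
```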
