Description
PROFESSIONAL SUMMARY:
* 8+ years of experience in various IT technologies, including 4 years of hands-on experience in Big Data technologies.
* Proficient in installing, configuring, and using Apache Hadoop ecosystem components such as MapReduce, Hive, Pig, Flume, YARN, HBase, Sqoop, Spark, Storm, Kafka, Oozie, and ZooKeeper.
* Strong comprehension of Hadoop daemons and MapReduce concepts.
* Used Informatica PowerCenter for Extraction, Transformation, and Loading (ETL) of data from numerous sources such as flat files, XML documents, and databases.
* Experienced in developing UDFs for Pig and Hive using Java.
* Strong knowledge of Spark with Scala for large-scale streaming data processing.
* Hands-on experience developing UDFs, DataFrames, and SQL queries in Spark SQL.
* Highly skilled in integrating Kafka with Spark Streaming for high-speed data processing.
* Worked with NoSQL databases such as HBase, Cassandra, and MongoDB to extract and store large volumes of data.
* Understanding of data storage and retrieval techniques, ETL, and databases, including graph stores, relational databases, and tuple stores.
* Experienced in writing Storm topologies that accept events from a Kafka producer and emit them into Cassandra.
* Able to develop MapReduce programs using Java and Python.
* Good understanding of and exposure to Python programming.
* Exported and imported data to and from Oracle using SQL Developer for analysis.
* Developed PL/SQL programs (functions, procedures, packages, and triggers).
* Good experience using Sqoop for traditional RDBMS data pulls.
* Worked with different Hadoop distributions, including Hortonworks and Cloudera.
* Strong database skills in IBM DB2 and Oracle; proficient in database development, including constraints, indexes, views, stored procedures, triggers, and cursors.
* Extensive experience in shell scripting.
* Extensive use of open-source software and web/application servers such as Eclipse 3.x IDE and Apache Tomcat 6.0.
* Experience designing components with UML: use case, class, sequence, deployment, and component diagrams for the requirements.
* Strong skills in J2EE, J2SE, Servlets, Spring, Hibernate, JUnit, JSP, JDBC, Java multithreading, object-oriented design patterns, exception handling, garbage collection, HTML, Struts, Enterprise JavaBeans, RMI, JNDI, and XML-related technologies.
* Involved in report development using reporting tools such as Tableau; used Excel sheets, flat files, and CSV files to generate Tableau ad hoc reports.
* Broad design, development, and testing experience with Talend Integration Suite, including performance tuning of mappings.
* Experience in understanding security requirements for Hadoop and integrating with Kerberos authentication and authorization infrastructure.
* Experience with cluster monitoring tools such as Ambari and Apache Hue.
* Solid technical foundation, strong analytical ability, team player, and goal-oriented, with a commitment to excellence.
* Outstanding communication and presentation skills; willing to learn and adapt to new technologies and third-party products.
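The MapReduce experience listed above can be illustrated with a minimal word-count sketch in plain Python (a hypothetical, self-contained example; an actual job would run on the Hadoop cluster, e.g. via Hadoop Streaming, with the mapper and reducer distributed across nodes):

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the input line.
    for word in line.lower().split():
        yield word, 1

def reducer(pairs):
    # Reduce phase: sum the counts emitted for each distinct word.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

if __name__ == "__main__":
    lines = ["big data big pipelines", "data flows"]
    pairs = [pair for line in lines for pair in mapper(line)]
    print(reducer(pairs))  # {'big': 2, 'data': 2, 'pipelines': 1, 'flows': 1}
```

In a real Hadoop deployment the framework handles the shuffle between the two phases; here the list comprehension plays that role for illustration only.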
Work Experience
COMPANY | POSITION HELD | DATES WORKED |
---|---|---|
(Confidential) | Big Data Engineer | 3/2017 - Present |
Visa Inc, CA | Hadoop Developer | 1/2016 - 3/2017 |
Bank of America, NJ | Data Analyst | 11/2014 - 12/2015 |
Dell International Services | Big Data Developer | 5/2012 - 10/2014 |
Accomplishments
Highlights:
Companies I like:
Apple Computers