Description
A passionate DevOps Engineer with experience in Linux administration and applying DevOps principles to the Hadoop Distributed File System (HDFS), Spark, big data, and web applications in on-premises and cloud environments, including the AWS and Azure platforms, with tools such as Jenkins, Terraform, Python, Docker, and Kubernetes. Deep cross-functional knowledge from working as both a DevOps Engineer and a Data Engineer to solve data problems. Involved in designing and deploying large applications on cloud infrastructure and managing mission-critical, high-availability environments and ETL pipelines.

* Configured and worked extensively with AWS services including EC2, ELB, Auto Scaling, S3, Route 53, IAM, VPC, RDS, DynamoDB, CloudTrail, CloudWatch, ElastiCache, SNS, SQS, CloudFormation templates, CloudFront, Lambda, EMR, OpsWorks, and Elastic Beanstalk.
* Hands-on experience with Azure web application services such as Azure App Service and Cloud Services, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, Web API, VM creation, ARM templates, lift-and-shift migrations, Azure Storage, Fabric Controller, Azure AD, Azure Search, and Notification Hubs.
* Unique experience with Pivotal Cloud Foundry (PCF) architecture and design, troubleshooting issues with platform components, and developing global/multi-regional deployment models and patterns for large-scale development and deployments on Cloud Foundry.
* 4+ years of experience working with big data technologies on systems comprising massive amounts of data running in highly distributed Hadoop environments.
* Designed and developed continuous deployment (CI/CD) pipelines in Azure DevOps and Jenkins, and maintained pipelines to manage infrastructure as code (IaC) for all applications.
* Experience designing and implementing CI/CD processes using CloudFormation and Terraform templates, and containerizing applications using Docker.
* Converted existing AWS infrastructure to a serverless architecture using AWS Lambda, automated with Python scripts deployed via Terraform and AWS CloudFormation templates.
* Good understanding of Docker containers, Kubernetes clusters, and Mesos; implemented a production-ready, load-balanced, highly available, fault-tolerant Kubernetes infrastructure.
* Managed Kubernetes (K8s) charts using Helm and created reproducible builds of Kubernetes applications.
* Implemented microservices application deployment and migration to AWS/Azure services using tools such as Azure DevOps, Azure Kubernetes Service, Container Registry, Cosmos DB, Grafana, Azure Pipelines, Azure Monitor, RBAC, and the Kubernetes API to run workloads on EKS clusters.
* Experience building and deploying code to Kubernetes using Jenkins.
* Implemented Docker to containerize applications with auto-scaling and zero-downtime rolling updates; experience with Docker components such as Docker Engine; created Docker images, set up Docker Hub, and handled multiple images deployed in ECS and EKS.
* Worked with Terraform key features such as IaC, execution plans, resource graphs, and change automation, and extensively used auto scaling to launch cloud instances when deploying microservices.
* Used Terraform resources to automate infrastructure and manage versions in production infrastructure.
* Experienced in provisioning IaaS and PaaS VMs and web/worker roles on Microsoft Azure Classic and ARM.
* Experienced in migrating on-premises instances and Azure Classic instances to an Azure ARM subscription with Azure Site Recovery.
* Experience with virtualization technologies such as VMware and Vagrant; worked on containerizing applications with Docker and on cluster orchestration services such as EKS, ECS, and AKS.
* Experience with build tools such as Bamboo, Hudson/Jenkins, TeamCity, and uBuild.
* Proficient in Python, Ruby, PowerShell, JSON, YAML, and Groovy scripting.
* Installed, configured, and administered the Jenkins CI tool on Linux machines, including adding and updating plugins such as SVN, Maven, and Ant.
* Implemented CI/CD for J2EE, SOA, Java, Node.js, .NET Core, and microservices architecture environments using Jenkins and uDeploy.
* Automated the front-end platform into highly scalable, consistent, repeatable infrastructure with a high degree of automation using Chef, Ansible, Vagrant, Jenkins, and CloudFormation templates.
* Experience with Ansible setup, managing host files using YAML, and authoring various playbooks and custom Ansible modules.
* Strong knowledge of the architecture and components of Spark, and efficient in working with Spark Core, Spark SQL, and Spark Streaming.
* Experience configuring Spark Streaming to receive real-time data from Apache Kafka and store the streamed data to HDFS using Scala.
* Hands-on experience using Hadoop ecosystem components such as Hive, Sqoop, Cassandra, Spark, Spark Streaming, Spark SQL, Kafka, and YARN.
* Strong understanding of various Hadoop services, big data applications, and the YARN architecture.
* Integrated Hadoop with Kafka to upload streaming data from Kafka to HDFS.
* Good knowledge of configuring Apache NiFi to automate data movement between different Hadoop systems.
Work Experience
| COMPANY | POSITION HELD | DATES WORKED |
|---|---|---|
| (Confidential) | Sr. Cloud/AWS Engineer | 2/2020 - Present |
| Kivyo Inc | Sr. Cloud/Azure Infrastructure Engineer | 9/2018 - 12/2019 |
| 3M | Sr. Build and Release Engineer, J Hancock Financial | 5/2016 - 6/2018 |
| Symbiosis Technologies | System/Linux Engineer | 8/2013 - 10/2014 |