We're seeking a remote Senior Data DevOps Engineer to join our dynamic team for a new project focused on developing and managing data infrastructure in the cloud, primarily using AWS, Azure, or GCP.
In this role, you will design, deploy, and manage data systems; develop automation scripts and workflows for infrastructure provisioning, deployment, and monitoring; and optimize the performance, scalability, and reliability of data platforms and systems.
You will work closely with the data engineering team to ensure efficient data pipelines and processes, automating data workflows using Python. You will also be responsible for setting up and maintaining continuous integration and delivery (CI/CD) pipelines using tools such as Jenkins, GitHub Actions, or similar cloud-based CI/CD tools.
We accept CVs in English only.
Responsibilities
- Design, deploy, and manage data infrastructure in the cloud, primarily using AWS, Azure, or GCP
- Develop and implement automation scripts and workflows for infrastructure provisioning, deployment, and monitoring using tools like Terraform or similar Infrastructure as Code (IaC) tools
- Work closely with the data engineering team to ensure efficient data pipelines and processes, automating data workflows using Python
- Set up and maintain continuous integration and delivery (CI/CD) pipelines using tools such as Jenkins, GitHub Actions, or similar cloud-based CI/CD tools
- Collaborate with cross-functional teams to optimize the performance, scalability, and reliability of data platforms and systems
- Install, configure, and maintain data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools in both on-premises and cloud environments
- Monitor and troubleshoot data systems, proactively identifying and resolving performance, scalability, and reliability issues
Requirements
- Minimum of 3 years of experience in data infrastructure management and DevOps
- Strong proficiency in Python and experience with batch scripting
- Professional mastery of the Linux operating system
- Strong knowledge of Cloud technologies (AWS, GCP or Azure)
- Solid understanding of network protocols and mechanisms such as TCP, UDP, ICMP, DHCP, DNS, and NAT
- Hands-on experience using or setting up data tools such as Spark, Airflow, R
- Proficiency with SQL
- Experience with Infrastructure as Code (IaC) tools
- Proficiency with setting up and managing CI/CD pipelines using tools like Jenkins, Bamboo, TeamCity, GitLab CI, GitHub Actions, or similar cloud-based CI/CD tools
- Experience installing and configuring data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools
- Good verbal and written communication skills in English at a B2+ level
Nice to Have
- Expertise in AWS CloudFormation
- Knowledge of Terraform and Ansible
- Azure DevOps skills
We Offer
- Learning Culture - We want you to be the best version of yourself; that is why we offer unlimited access to learning platforms, a wide range of internal courses, and all the knowledge you need to grow professionally
- Health Coverage - Health and wellness are important, which is why we cover you and up to four family members in a premier health plan. We have a couple of options, so you can choose what is best for you and your family
- Visual Benefit - Seeing your work for us would be a sight for sore eyes. We want your vision to always be at 100%, which is why we offer up to $200.000 COP for any visual health expenses
- Life Insurance Plan - We have partnered with MetLife to offer a full-coverage life insurance plan, so your family is covered, even if you are gone
- Medical Leave Coverage - We are one of the few companies that cover 100% of your medical leave, for up to 90 days. Your health is the most important thing to us
- Professional Growth Opportunities - We have designed a highly competitive and complete development process, where you will have all the tools to get where you have always wanted to be, personally and professionally
- Stock Option Purchase Plan - As an EPAMer you can be more than just an employee, you will also have the opportunity to purchase stock at a reduced price and become a part owner of our organization
- Additional Income - Besides your regular salary, you will also have the chance to earn extra income by referring talent, being a technical interviewer, and many more ways
- Community Benefit - You will be part of a worldwide community of over 50,000 employees, where you can learn, challenge yourself, stand out, and share your knowledge and experience with multicultural teams!
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.