We are seeking a Senior Machine Learning Engineer to join our remote team and contribute to ML pipeline design, development, and the operating lifecycle based on best practices.
In this role, you will design, create, maintain, troubleshoot, and optimize ML pipeline steps, as well as own and contribute to the design and implementation of ML prediction endpoints. Your collaboration with System Engineers to configure the ML lifecycle management environment and to support the improvement of coding practices will be crucial.
If you're passionate about innovation, we invite you to apply and become part of our team!
We accept CVs in English only.
Responsibilities
- Contribute to ML pipeline design, development, and operating lifecycle based on best practices
- Design, create, maintain, troubleshoot, and optimize ML pipeline steps
- Own and contribute to the ML prediction endpoints design and implementation
- Cooperate with System Engineers to configure the ML lifecycle management environment
- Write specifications, documentation, and user guides for developed applications
- Support improvement of coding practices and repository organization in the science work cycle
- Establish and configure pipelines for projects
- Continuously identify technical risks and gaps, and devise mitigation strategies
- Work with data scientists to productionize predictive models, understanding the scope and purpose of each model and creating scalable data preparation pipelines
Requirements
- 3+ years of programming experience, ideally in Python, with strong SQL knowledge
- Strong MLOps experience (SageMaker, Vertex AI, or Azure ML)
- Intermediate level in Data Science, Data Engineering, and DevOps Engineering
- Experience with at least one project delivered to production in an MLE role
- Expert level in Engineering Best Practices
- Practical experience in the implementation of Data Products using the Apache Spark Ecosystem (Spark SQL, MLlib/SparkML) or alternative technologies
- Experience with Big Data technologies (e.g., Hadoop, Spark, Kafka, Cassandra, GCP BigQuery, AWS Redshift, Apache Beam, etc.)
- Experience with automated data pipeline and workflow management tools such as Airflow and Argo Workflows
- Experience in different data processing paradigms (batch, micro-batch, streaming)
- Practical experience working with at least one of the major cloud providers: AWS, GCP, or Azure
- Production experience in integrating ML models into complex data-driven systems
- Data science experience with TensorFlow/PyTorch/XGBoost, NumPy, SciPy, scikit-learn, pandas, Keras, spaCy, and Hugging Face Transformers
- Experience with different types of databases (Relational, NoSQL, Graph, Document, Columnar, Time Series, etc.)
- Practical experience with Databricks and MLOps-related tools/technologies such as MLflow, Kubeflow, and TensorFlow Extended (TFX)
- Experience with performance testing tools like JMeter or LoadRunner
- Knowledge of containerization technologies like Docker
We Offer
- Learning Culture - We want you to be the best version of yourself, which is why we offer unlimited access to learning platforms, a wide range of internal courses, and all the knowledge you need to grow professionally
- Health Coverage - Health and wellness are important, which is why we include you and up to four family members in a premier health plan. We have a couple of options, so you can choose what is best for you and your family
- Visual Benefit - Seeing your work for us would be a sight for sore eyes. We want your vision to always be at 100%, which is why we offer up to $200.000 COP for any visual health expenses
- Life Insurance Plan - We have partnered with MetLife to offer a full-coverage life insurance plan, so your family is covered even if you are gone
- Medical Leave Coverage - We are one of the few companies that cover 100% of your medical leave, for up to 90 days. Your health is the most important thing to us
- Professional Growth Opportunities - We have designed a highly competitive and complete development process, where you will have all the tools to get where you have always wanted to be, personally and professionally
- Stock Option Purchase Plan - As an EPAMer, you can be more than just an employee: you will also have the opportunity to purchase stock at a reduced price and become a part owner of our organization
- Additional Income - Besides your regular salary, you will also have the chance to earn extra income by referring talent, being a technical interviewer, and many more ways
- Community Benefit - You will be part of a worldwide community of over 50,000 employees, where you can learn, challenge yourself, stand out, and share your knowledge and experience with multicultural teams!
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.