We are seeking a Lead Data Software Engineer to join our remote team.
The successful candidate will play a crucial role in developing innovative analytical solutions using Spark/PySpark, NoSQL, and other Big Data technologies, and will implement new features in cloud solutions (AWS, GCP, Azure). They will work closely with product and engineering teams to understand requirements and influence decisions. This position offers an exciting opportunity to contribute significantly to the growth and success of our company.
Apply to leverage your expertise!
Responsibilities
- Development and execution of innovative analytical solutions using Spark/PySpark, NoSQL, and other Big Data-related technologies
- Collaboration with product and engineering teams to understand requirements and influence decisions
- Regular interaction with architects, technical leads, and other key individuals within different functional groups
- Analysis of business problems and technical environments for the implementation of quality technical solutions
- Participation in code review and testing solutions to ensure adherence to best practice specifications
- Documentation of project details
Requirements
- Minimum of 5 years of engineering experience in Data Management, Data Storage, Data Visualization, Operations, and Security
- Proficiency in at least one programming language: Python, Java, Scala, or Kotlin
- Experience with SQL and strong familiarity with Big Data tools, primarily Spark/PySpark; Hadoop, Hive, and Flink are a plus
- Comprehensive understanding of Cloud Solutions for Data environments, such as AWS, Azure, and GCP
- Understanding of CI/CD and software development using Big Data technologies
- Experience developing software with Big Data technologies, including administration, configuration management, monitoring, debugging, and performance tuning
- Familiarity with data ingestion pipelines, Data Warehousing, and Data Lakes
- Experience with data modeling and development of scalable, available, and fault-tolerant systems
- An analytical approach to problem-solving, with excellent interpersonal and communication skills
- Motivation, independence, and efficiency, with the ability to work under pressure and a strong sense of priorities
- Knowledge of container and resource management systems such as Docker, Kubernetes, and YARN
- Experience in direct customer communications
- Experience with event streaming tools such as Kafka, Pub/Sub, Kinesis, and Event Hubs
- Familiarity with orchestration tools such as Airflow and Databricks
- Knowledge of Snowflake is advantageous
Benefits
- Improved medical coverage - EPAMers are eligible to participate in a supplementary health insurance program with industry-standard coverage, with the Company funding 100% of the monthly premium
- Lunch Allowance - You will receive an allowance of CLP $7.000 per working day. Enjoy a nice meal on us
- Internet and electricity allowance - You will receive an allowance of CLP $15.000 per month to cover internet and electricity expenses
- National Holiday Bonus - To celebrate our entry into the Chilean market, all employees will receive a bonus of CLP $86.646 in September
- Christmas Bonus - You will receive an end-of-year bonus of CLP $170.539, paid in December to ensure you have a happy holiday!
- Learning Culture - We want you to be the best version of yourself, which is why we offer unlimited access to learning platforms, a wide range of internal courses, and all the knowledge you need to grow professionally
- Additional Income - Besides your regular salary, you will also have the chance to earn extra income by referring talent, being a technical interviewer, and many more ways
- Are you open to relocation? - If you want to relocate to another country and we have the right project, we will assist you and your family every step of the way to your new home