Do you have a software engineering background and strong knowledge of Big Data? Are you an open-minded professional with good English skills? If so, this could be the perfect opportunity to join EPAM as a Senior Big Data Engineer.
Our teams work in highly agile environments for Fortune 1000 clients, following XP and CI/CD best practices. We are looking for a Senior Big Data Engineer with an open-minded personality who can join our friendly environment and become a core contributor to our team of experts.
Responsibilities
- Develop and implement innovative analytical solutions using Cloud Native, Big Data, and NoSQL related technologies
- Develop and implement Cloud/On-Premise/Hybrid solutions using best-in-class data frameworks
- Work with product and engineering teams to understand requirements, evaluate new features and architecture to help drive decisions
- Build collaborative partnerships with architects, technical leads and key individuals within other functional groups
- Perform detailed analysis of business problems and technical environments, and apply it to the design of quality technical solutions
- Actively participate in code reviews and test solutions to ensure they meet best-practice specifications
- Build and foster a high-performance engineering culture, mentor team members, and provide the team with the tools and motivation they need
- Write project documentation
Requirements
- Coding experience with one of the following programming languages: Python, Java, or Scala
- Experience with Linux OS: ability to configure services and write basic shell scripts; understanding of network fundamentals
- Good knowledge of SQL and relational algebra
- Advanced experience in software development with Data technologies (e.g. administration, configuration management, monitoring, debugging and performance tuning)
- Engineering experience and practice in Data Management, Data Storage, Data Visualization, Disaster Recovery, Integration, Operation, Security
- Experience building data ingestion pipelines and designing Data Warehouse or Database architectures
- Experience with data modeling; hands-on development experience with modern Big Data components
- Cloud: experience in designing, deploying, and administering scalable, highly available, and fault-tolerant systems
- Good understanding of CI/CD principles and best practices
- Analytical approach to problem solving; excellent interpersonal, mentoring, and communication skills
- Data-oriented mindset and compliance awareness, e.g. PI, GDPR, HIPAA
- Motivated, independent, efficient, and able to work under pressure, with a solid sense for setting priorities
- Ability to work in a fast-paced (startup-like) agile development environment
- Experience in high load and IoT Data Platform architectures and infrastructures
- Extensive experience with containers and resource management systems: Docker and Kubernetes
- Experience in direct customer communications
- Solid infrastructure troubleshooting and support skills, with practical experience in performance tuning, optimization, and bottleneck analysis
- Experience in different business domains
- English proficiency
Technologies
- Advanced understanding of distributed computing principles
- Python/Java/Scala/Kotlin and SQL
- Cloud-Native stack: Databricks, Azure Data Factory, AWS Glue, AWS EMR, Athena, GCP Dataproc, GCP Dataflow
- Big Data stack: Spark Core, Spark SQL, Spark ML, Kafka, Kafka Connect, Airflow, NiFi, StreamSets
- NoSQL: CosmosDB, DynamoDB, Cassandra, HBase, MongoDB
- Queues and stream processing: Kafka Streams, Flink, Spark Streaming
- Data Visualization: Tableau, Power BI, Looker
- Operation: Cluster operation, Cluster planning
- Elasticsearch/ELK
- Solid Cloud experience with one of the leading cloud providers (AWS/Azure/GCP): Storage; Compute; Networking; Identity and Security; NoSQL; RDBMS and Cubes; Big Data Processing; Queues and Stream Processing; Serverless; Data Analysis and Visualization; ML as a service (SageMaker, TensorFlow)
- Enterprise Design Patterns (Secure Inversion of Control, etc.)
- Development Methods (TDD, BDD, DDD)
- Version Control Systems (Git)
- Testing: Component/Integration testing, Unit testing (JUnit)
- Deep understanding of SQL queries, joins, stored procedures, relational schemas, and SQL optimization
- Experience in various messaging systems, such as Kafka, RabbitMQ, Event Hub, Pub/Sub
- REST, Thrift, gRPC
- Build Systems: Maven, SBT, Ant, Gradle
- Docker, Kubernetes
We offer
- Private health insurance
- EPAM Employees Stock Purchase Plan
- 100% paid sick leave
- Referral Program
- Professional certification
- Language courses
- WORK & LIFE BALANCE. Enjoy more of your personal time with flexible work options, 24 working days of annual leave and paid time off for numerous public holidays.
- CONTINUOUS LEARNING CULTURE. Craft your personal Career Development Plan to align with your learning objectives. Take advantage of internal training, mentorship, sponsored certifications and LinkedIn courses.
- CLEAR & DIFFERENT CAREER PATHS. Grow in engineering or managerial direction to become a People Manager, in-depth technical specialist, Solution Architect, or Project/Delivery Manager.
- STRONG PROFESSIONAL COMMUNITY. Join a global EPAM community of highly skilled experts and connect with them to solve challenges, exchange ideas, share expertise and make friends.