
Senior Data Engineer

at Exadel

Sofia, Bulgaria

We are seeking an experienced Senior Data Engineer to play an essential role in implementing and maintaining a data warehouse. A career at Exadel means you will work alongside exceptional colleagues and be empowered to reach your professional and personal goals.

Work at Exadel - Who We Are 

We don’t just follow trends—we help define them. For 25+ years, Exadel has transformed global enterprises. Now, we’re leading the charge in AI-driven solutions that scale with impact. And it’s our people who make it happen—driven, collaborative, and always learning.

About the Customer

The world's largest publisher of investment research has connected top asset and wealth managers with nearly 1,000 research firms across 50 countries for over two decades. With offices in Durham (HQ), New York, London, Edinburgh, and Timisoara, the client enhances the efficient exchange of investment insights, improving collaboration and security throughout the information lifecycle. Its ecosystem addresses users' specific needs, streamlining the publication and application of investment research content.

Requirements

  • 5+ years of background in Data Engineering, working with large-scale databases and data pipelines
  • Proficiency in Snowflake, including data modeling, performance tuning, and query optimization
  • Expertise in PostgreSQL, with experience in scaling, tuning, and optimizing queries
  • Hands-on experience with ETL/ELT tools and data pipeline orchestration
  • Competency in Python for data processing and automation
  • Understanding of data governance, security, and compliance best practices

Nice to Have

  • Knowledge of dbt
  • Experience with cloud-based data solutions (AWS Redshift, Google BigQuery, Azure Synapse, etc.)
  • Understanding of CI/CD pipelines for data deployment and automation
  • Skills in streaming technologies like Kafka or Apache Spark
  • Exposure to machine learning pipelines and data science workflows
  • Familiarity with containerization (Docker, Kubernetes) for data engineering workloads

English level

Upper-Intermediate

Responsibilities

  • Design and implement scalable data pipelines in Snowflake, ensuring efficiency, reliability, and performance
  • Optimize and scale PostgreSQL databases, improving query performance and overall system efficiency
  • Develop and maintain ETL/ELT processes, transforming and loading large datasets for analytical and operational use
  • Implement best practices for data modeling, governance, and security within Snowflake
  • Work closely with cross-functional teams to support data needs for analytics, reporting, and business intelligence
  • Automate data ingestion and transformation processes to ensure seamless data flow
  • Monitor, troubleshoot, and optimize database performance and data pipeline efficiency
  • Leverage AWS cloud services for scalable data processing and storage solutions
  • Ensure data integrity, compliance, and security in alignment with business and regulatory requirements
  • Collaborate with Data Analysts, Data Scientists, and Software Engineers to support data-driven decision-making
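For a rough sense of the ingestion-and-transformation work described above, here is a minimal, purely illustrative Python sketch of an extract-transform-load step. All function names and data are hypothetical and are not taken from the role or the client's systems:

```python
# Hypothetical ETL sketch: extract rows from a source, normalize them,
# and load them into a target collection. Everything here is illustrative.

def extract():
    # Stand-in for pulling rows from a source such as PostgreSQL
    return [
        {"symbol": "ABC", "price": "12.50"},
        {"symbol": "XYZ", "price": "7.25"},
    ]

def transform(rows):
    # Normalize types before loading into a warehouse such as Snowflake
    return [{"symbol": r["symbol"], "price": float(r["price"])} for r in rows]

def load(rows, target):
    # Stand-in for a bulk insert into the warehouse
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2
```

In a real pipeline these steps would be orchestrated by an ETL/ELT tool rather than called inline, but the extract/transform/load separation is the same.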
Client-provided location(s): Sofia, Bulgaria; Georgia; Hungary; Lithuania; Poland; Romania; Uzbekistan
Job ID: 5498893004
Employment Type: Other