
Senior Data Engineer (Snowflake)

At Exadel

Sofia, Bulgaria

We are seeking an experienced Senior Data Engineer to play an essential role in implementing and maintaining a data warehouse. A career in Exadel means you will work alongside exceptional colleagues and be empowered to reach your professional and personal goals.

Work at Exadel - Who We Are 

Since 1998, Exadel has been engineering its products and custom software for clients of all sizes. Headquartered in Walnut Creek, California, Exadel has 2,000+ employees in development centers across America, Europe, and Asia. Our people drive Exadel’s success and are at the core of our values.

About the Customer

The customer is the world's largest publisher of investment research. For more than two decades, it has connected the world's leading asset and wealth managers with nearly 1,000 research firms in more than 50 countries, and it serves internal teams across multinational corporations from its offices in Durham (HQ), New York, London, Edinburgh, and Timisoara.


The client facilitates the equitable exchange of critical investment insights by improving efficiency, collaboration, and security across the complete information lifecycle. The ecosystem is designed to meet users' bespoke needs, from compliance tracking to interactive publishing, by removing friction from the publication, dissemination, consumption, and application of investment research content.

Requirements

  • 5+ years of experience in Data Engineering, working with large-scale databases and data pipelines
  • Proven experience with Snowflake, including data modeling, performance tuning, and query optimization
  • Strong expertise in PostgreSQL, with experience in scaling, tuning, and optimizing queries
  • Hands-on experience with ETL/ELT tools and data pipeline orchestration
  • Proficiency in Python for data processing and automation
  • Strong understanding of data governance, security, and compliance best practices
  • Ability to work independently, problem-solve, and collaborate with cross-functional teams

Nice to Have

  • Experience with cloud-based data solutions (AWS Redshift, Google BigQuery, Azure Synapse, etc.)
  • Knowledge of CI/CD pipelines for data deployment and automation
  • Experience with streaming technologies like Kafka or Apache Spark
  • Knowledge of Airflow, dbt, or similar workflow orchestration tools
  • Exposure to machine learning pipelines and data science workflows
  • Familiarity with containerization (Docker, Kubernetes) for data engineering workloads

English level

Upper-Intermediate

Responsibilities

  • Design and implement scalable data pipelines in Snowflake, ensuring efficiency, reliability, and performance
  • Optimize and scale PostgreSQL databases, improving query performance and overall system efficiency
  • Develop and maintain ETL/ELT processes, transforming and loading large datasets for analytical and operational use
  • Implement best practices for data modeling, governance, and security within Snowflake
  • Work closely with cross-functional teams to support data needs for analytics, reporting, and business intelligence
  • Automate data ingestion and transformation processes to ensure seamless data flow
  • Monitor, troubleshoot, and optimize database performance and data pipeline efficiency
  • Leverage AWS cloud services for scalable data processing and storage solutions
  • Ensure data integrity, compliance, and security in alignment with business and regulatory requirements
  • Collaborate with Data Analysts, Data Scientists, and Software Engineers to support data-driven decision-making
Client-provided location(s): Sofia, Bulgaria; Georgia; Hungary; Lithuania; Poland; Romania; Uzbekistan
Job ID: 5432167004
Employment Type: Other