Responsibilities
About the Team
The Data Management Suite team builds products that cover the whole lifecycle of data pipelines, including data ingestion and integration, data development, data catalog, data security, and data governance. These products support a wide range of businesses, enabling data engineers and data scientists to greatly boost their productivity.
As a software engineer on the Data Management Suite team, you will have the opportunity to build, optimize, and grow one of the largest data platforms in the world, and to gain hands-on experience with core systems in the data platform ecosystem. Your work will have a direct and significant impact on the company's core products as well as on hundreds of millions of users.
Responsibilities:
- Design, develop, and optimize the architecture of large-scale data ingestion systems to support real-time data pipelines (e.g., streaming events from Kafka into Flink, as sketched after this list), ensuring high throughput, low latency, and fault tolerance.
- Enhance the performance, scalability, and reliability of data ingestion pipelines.
- Develop and implement automated, intelligent operations and maintenance systems that monitor and diagnose data ingestion pipelines and ensure their stability and reliability.
- Collaborate with cross-functional teams to deliver event data ingestion, transformation, and storage solutions that meet diverse business requirements.
- Troubleshoot and resolve complex issues in production systems, ensuring minimal downtime and optimal performance.
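For illustration only, the following is a minimal sketch of the kind of real-time ingestion pipeline described above, written against Apache Flink's DataStream API with the Kafka connector; the broker address, topic name, and consumer group are placeholder assumptions, not details of the team's actual systems.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EventIngestionSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Checkpoint every 60s so the job can recover from failures (fault tolerance).
        env.enableCheckpointing(60_000);

        // Placeholder broker, topic, and group values -- assumptions for this sketch.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("tracking-events")
                .setGroupId("ingestion-sketch")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events")
                // Stand-in transformation; a real pipeline would parse, validate,
                // and enrich events before writing them to storage.
                .map(String::trim)
                .print();

        env.execute("event-ingestion-sketch");
    }
}
```

In a production job, the print sink would be replaced with a storage or lakehouse connector, and checkpointing plus offset management would provide the fault tolerance and reliability guarantees mentioned above.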
Qualifications
Minimum Qualifications:
- Bachelor's degree in Computer Science or equivalent practical experience.
- 2+ years of experience in software development with proficiency in one or more programming languages, such as Java, Scala, Python, or Go.
- Strong understanding of data structures, algorithms, and distributed systems principles.
- Hands-on experience with big data technologies, such as Hadoop, Flink, Kafka, or similar frameworks.
Preferred Qualifications:
- 2+ years of experience in real-time data processing frameworks like Flink. Contributions to open-source projects are a plus.
- Experience in the event tracking domain, including event collection, real-time processing, governance, quality assurance, and cost optimization.
- Deep understanding of data lakehouse architectures and the integration of stream and batch processing.
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.