Minimum qualifications:
- Master's degree in Statistics, Data Science, Mathematics, Physics, Economics, Operations Research, Engineering, a related quantitative field, or equivalent practical experience.
- 5 years of work experience using analytics to solve product or business problems, coding (e.g., Python, R, SQL), querying databases, or statistical analysis; or 3 years of work experience with a PhD degree.
Preferred qualifications:
- 8 years of work experience using analytics to solve product or business problems, coding (e.g., Python, R, SQL), querying databases, or statistical analysis; or 6 years of work experience with a PhD degree.
About the job
The Karmel team focuses on climate AI initiatives. Our mission is to deliver actionable information that helps mitigate and adapt to climate change at scale. We aim to make communities more resilient and improve global welfare.
Our projects span climate mitigation, such as GreenLight (optimizing traffic lights to reduce road transport emissions) and methane leak detection, as well as climate adaptation efforts including flood forecasting, wildfire boundary tracking, post-disaster damage assessment (SKAI), food security, and weather prediction (Oya).
Additionally, we support horizontal efforts such as geospatial foundation models and Mmeka (globally mapping buildings and fields).
Responsibilities
- Collaborate with stakeholders in cross-project and team settings to identify and clarify business or product questions to answer. Provide feedback to translate and refine business questions into tractable analyses, evaluation metrics, or mathematical models.
- Apply specialized knowledge to work with custom data infrastructure or existing data models as appropriate. Design and evaluate models to mathematically express and solve defined problems with limited precedent.
- Gather information, business goals, priorities, and organizational context around the questions to answer, as well as the existing and upcoming data infrastructure.
- Own the process of gathering, extracting, and compiling data across sources via relevant tools (e.g., SQL, R, Python). Format, restructure, and/or validate data to ensure quality, and review the dataset to confirm it is ready for analysis.
- Be responsible for bridging research models and their capabilities with end-user needs.