Sr Data Engineer - GE07BE
We're determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals - and to help others accomplish theirs, too. Join our team as we help shape the future.
The Hartford is seeking a Senior Machine Learning Engineer within Actuarial to design, develop, and implement a modern and sustainable MLOps framework to fuel machine learning and artificial intelligence solutions across a wide range of strategic initiatives.
This role will be part of a dedicated hybrid actuarial/data science team designing and delivering powerful analytical tools utilizing statistical modeling, machine learning, cloud computing, and big data platforms to enhance or overhaul core actuarial processes. For Commercial Lines, this team is responsible for all modeling that pertains to pricing, class plans, and profitability. For Actuarial Strategic Modeling, this team is responsible for a subset of models used to support Actuarial analytics. The individual will work closely with our data science and data engineering teams to develop training and deployment pipelines for core actuarial models.
As a Senior Machine Learning Engineer, you will participate in the entire software development lifecycle in support of continuous data delivery, while growing your knowledge of emerging technologies. We use the latest data technologies, software engineering practices, MLOps, and Agile delivery frameworks, and we are passionate about building well-architected and innovative solutions that drive business value. This cutting-edge and forward-focused organization presents the opportunity for collaboration, self-organization within the team, and visibility as we focus on continuous business data delivery.
Responsibilities:
- Work closely with tech leads, the Product Manager, and the Product Owner to deliver an MLOps platform solution in AWS using Python and other tools for the Actuarial community.
- Work with data engineers and data scientists to tackle challenging AIOps problems.
- Maintain and manage the current CI/CD ecosystem and tools.
- Find ways to automate and continually improve current CI/CD and release processes.
- Help innovate and standardize machine learning development practices.
- Prototype high-impact innovations that cater to changing business needs by leveraging new technologies.
- Consult with cross-functional stakeholders in the analysis of short and long-range business requirements and recommend innovations which anticipate the future impact of changing business needs.
- Formulate logical statements of business problems and devise, test, and implement efficient, cost-effective application program solutions.
- Establish data pipeline guidelines that align with modern software development principles for downstream analytical consumption.
- Develop designs that enable real-time modeling solutions to be ingested into front-end systems.
- Produce code artifacts and documentation using GitHub for reproducible results and hand-off to other data science teams.
Qualifications:
- Bachelor's in Computer Science, Engineering, Physics, MIS, or a related discipline
- 3+ years of experience in an MLOps, cloud engineering, or technical data science role, or comparable experience on a distributed or public cloud platform.
- 5+ years of hands-on experience integrating, deploying, and operationalizing ML models at speed and scale, including integration with enterprise applications and APIs.
- Experience with AWS services (e.g., SAM, EMR).
- Experience developing with SQL, NoSQL (Elasticsearch, MongoDB), Spark, Python, and PySpark for model development and MLOps.
- Expertise in ingesting data from a variety of sources and structures, including relational databases, Hadoop/Spark, cloud data sources, and JSON.
- Expertise in ETL, including metadata management and data validation.
- Expertise in Unix and Git.
- Experience with automation tools (Autosys, cron, Airflow, etc.).
- Experience with cloud data warehouses, automation, and data pipelines (e.g., Snowflake) is a plus.
- Able to communicate effectively with both technical and non-technical teams.
- Able to translate complex technical topics into business solutions and strategies, and to turn business requirements into technical solutions.
Candidate must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford's total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$113,360 - $170,040
Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age