Design and implement data products and features in collaboration with product owners, data analysts, and business partners using Agile/Scrum methodology. Duties include:
- Contribute to overall architecture, frameworks, and patterns for processing and storing large data volumes;
- evaluate and utilize new technologies, tools, and frameworks centered around high-volume data processing;
- drive the implementation of new data projects and the optimization of existing solutions;
- translate product backlog items into engineering designs and logical units of work;
- profile and analyze data for the purpose of designing scalable solutions;
- define and apply appropriate data acquisition and consumption strategies for given technical scenarios;
- design and implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem;
- build utilities, user-defined functions, libraries, and frameworks to better enable data flow patterns;
- implement complex automated routines using workflow orchestration tools;
- drive collaborative reviews of designs, code, and test plans;
- work with architecture, engineering leads, and other teams to ensure quality solutions are implemented and engineering standard methodologies are defined and followed;
- anticipate, identify, and tackle issues concerning data management to improve data quality;
- build and incorporate automated unit tests and participate in integration testing efforts;
- utilize and advance continuous integration and deployment frameworks;
- solve complex data issues and perform root cause analysis;
- collaborate closely with Product team counterparts;
- work across teams to resolve operational and performance issues;
- provide work estimates and represent work progress and challenges;
- identify and remove technical bottlenecks for your engineering squad; and
- provide leadership, guidance, and mentorship to other data engineers.
Telecommuting is available from anywhere in the U.S., except from SD, VT, and WV.
Employer will accept a Master's degree in Computer Science, Information Technology, or Information Systems and two (2) years of experience in the job offered or in an engineering-related occupation.
Experience must include:
- Python;
- SQL;
- Spark;
- AWS;
- Big Data;
- Airflow;
- Data Warehousing;
- Data Modeling;
- Scala;
- Docker;
- Data Transformation and Integration; and
- Data analysis.
Apply at www.jobs.nike.com (Job #R-42284).