Co-op Opportunity
Looking for a PAWsome co-op opportunity? The Chewy Campus Recruiting Team is seeking a motivated Data Engineer Co-op to join our growing team in our Bellevue, WA office!
As part of the Data Engineering team at Chewy, you will gain hands-on experience on projects that build data pipelines, develop ETL processes, and manage our data warehouse. The ideal candidate has a strong curiosity about building and maintaining cloud databases, ingesting data through a variety of methods (including non-SQL interfaces such as SOAP and REST APIs), and joining datasets from different cloud-based source systems into a centralized database.
Co-op Timeframe: June 2, 2025 – December 5, 2025 (must be available for the full duration)
Qualified Students: Rising Seniors graduating in Spring 2026
What You'll Do
- Assist in the development and maintenance of data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse
- Configure custom data pipelines within Snowflake, AWS, and Databricks for ingestion into the Data Mart
- Design and implement solutions on a cloud platform using Infrastructure as Code (Terraform)
- Maintain, support, and develop within the Supply Chain - Transportation Data Mart Snowflake instance, including code build/review, auditing, performance tuning, and security
- Build and maintain documentation and models for the Data Mart
What You'll Need
- Enrolled full-time in a Bachelor’s or Master’s degree program in Data Engineering, Data Analytics, Machine Learning, Mathematics, Engineering, or a related field, with an anticipated graduation date of Spring 2026
- Excellent verbal and written communication skills, including the ability to explain complex concepts to non-expert partners in simple terms
- Current permanent U.S. work authorization required
- Ability to work 40 hours a week, Monday through Friday
Bonus
- Proficiency in coding and data analysis using Python, PySpark, Airflow, SQL, and Snowflake
- Knowledge of AWS data toolset (Glue, Athena, EMR, EKS, etc.) and other data platforms like Databricks
- Experience translating ambiguous customer requirements into clear problem definitions and delivering on them
- Proven experience in the design and execution of analytical projects
Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members. If you have a disability under the Americans with Disabilities Act or similar law, and you need an accommodation during the application process or to perform these job requirements, or if you need a religious accommodation, please contact CAAR@chewy.com.
If you have a question regarding your application, please contact HR@chewy.com.
To access Chewy's Customer Privacy Policy, please click here. To access Chewy's California CPRA Job Applicant Privacy Policy, please click here.