Job Summary
At NetApp, our mission is to help our customers bring AI to their data - wherever and however they want - in a way that is agile, achievable, and secure. Our AI tools help customers seamlessly deploy AI on data in place: on-prem, hybrid, or cloud. By providing AI-ready infrastructure, NetApp enables confident innovation and effective data management for our customers, while maintaining the highest standards of security and regulatory compliance.
Job Requirements
- Provide technical direction for AI projects, ensuring the application of best practices and cutting-edge technologies.
- Collaborate with cross-functional teams to align AI initiatives with business objectives.
- Spearhead multiple AI projects, providing technical direction to ensure timely, high-quality delivery.
- Stay current with the latest advancements in AI/ML and integrate new technologies into the team's work.
- Proven expertise in AI and machine learning, including supervised and unsupervised learning, neural networks, natural language processing, computer vision, or reinforcement learning.
- Experience deploying AI/ML models in production environments at scale.
- Proficiency in programming languages such as Python, Scala, Java, or C++.
- Strong familiarity with AI/ML frameworks and tools such as TensorFlow, PyTorch, Scikit-learn, etc.
- Experience working in Linux and AWS/Azure/GCP; Kubernetes (control plane, autoscaling, orchestration, containerization) is a must.
- Proficiency with NoSQL document databases (e.g., MongoDB, Cassandra, Cosmos DB, DocumentDB).
- Experience building microservices, REST APIs, and related API frameworks.
- A strong understanding of and experience with computer architecture, data structures, and programming practices.
- Storage domain experience is a plus.
Education
- 8-10+ years of proficiency in programming languages such as Python, Scala, or Java.
- 6+ years of experience with machine learning libraries and frameworks: PyTorch, TensorFlow, Keras, OpenAI, open-source LLMs, LangChain, etc.
- 6+ years of experience working in Linux and AWS/Azure/GCP; Kubernetes (control plane, autoscaling, orchestration, containerization) is a must.
- 6+ years of experience with NoSQL document databases (e.g., MongoDB, Cassandra, Cosmos DB, DocumentDB).
- 6+ years of experience building microservices, REST APIs, and related API frameworks.
- 8+ years of experience with big data technologies and platforms such as Spark, Hadoop, and distributed storage systems for handling large-scale datasets and parallel processing.
Job Segment: Developer, Java, Linux, Open Source, Database, Technology