Job Summary
Member of a software engineering team involved in the design and development of an AI Data Platform built on NetApp's flagship storage operating system, ONTAP.

Data is the currency of business in the digital era. NetApp is the data authority, helping customers leverage and manage their data wherever it resides – in the cloud, in their data centers, or anywhere data flows. Engineers at NetApp help transform the way customers utilize their dynamic, diverse, and distributed information: allowing doctors to save lives with deep data analytics shared with medical experts around the world, helping automotive engineers improve autonomous vehicle navigation with artificial intelligence, enabling scientists to monitor and identify environmental hazards through deep image analysis, and providing companies the ability to expand their businesses in yet-unimagined ways. By joining NetApp, you can take part in transforming how data is changing the world.

ONTAP is the #1 storage operating system in the world, managing hundreds of exabytes of customer information. More than 30,000 customers today rely on us to be the data authority. Take part in the transformation that is changing how we work and play every day.
An ideal candidate for this position will be driven to produce results, collaborative, curious, and creative. They strive to excel in their day-to-day work and will continue to produce high-quality results without anyone looking over their shoulder. They will bring broad experience across multiple domains, including big data processing, AI/ML workflows, MLOps, Kubernetes, operating systems, storage technologies, and distributed systems.
Job Requirements
Proficiency in programming languages such as Python, Go, Java/C#, and C/C++
Experience with machine learning libraries and frameworks: PyTorch, TensorFlow, Keras, OpenAI, open-source LLMs, LangChain, etc.
Experience working with Linux, AWS/Azure/GCP, and Kubernetes (control plane, autoscaling, orchestration, containerization) is a must.
Experience with NoSQL document databases (e.g., MongoDB, Cassandra, Cosmos DB, DocumentDB) is a must.
Experience building microservices, REST APIs, and related API frameworks.
Experience with big data technologies and platforms such as Spark, Hadoop, and distributed storage systems for handling large-scale datasets and parallel processing.
Experience with AI/ML frameworks like PyTorch or TensorFlow is a must.
Storage domain experience is a plus.
Experience with filesystems, networking, or file/cloud protocols is a plus.
Proven track record of leading mid- to large-sized projects.
This position requires an individual to be creative, team-oriented, a quick learner and driven to produce results.
Responsible for supporting the development and testing activities of other engineers that involve several interdependencies
Participate in technical discussions within the team and with other groups within Business Units associated with specified projects
Willing to take on additional tasks and responsibilities that contribute toward team, department, and company goals
A strong understanding of, and experience with, computer architecture, data structures, and programming practices
Education
Typically requires a minimum of 12 years of related experience with a Bachelor's degree; or 12 years and a Master's degree; or a PhD with 10 years of experience; or equivalent experience.