NetApp's New Steps to Accelerate GenAI Innovation Through Industry Collaboration

JAKARTA - NetApp announced expanded collaborations with industry leaders to accelerate AI innovation by providing the intelligent data infrastructure needed to run generative AI (GenAI).

The first of these collaboration updates is NVIDIA DGX SuperPOD storage certification for NetApp ONTAP, which addresses data management challenges for large language models (LLMs) and eliminates the need to compromise on data management in AI training workloads.

In addition, NetApp announced a global metadata namespace for securely exploring and managing data, an integrated AI data pipeline, a disaggregated storage architecture that enables full sharing of backend storage, and new capabilities in its cloud-native services.

"By providing an integrated, secure, scalable, and high-performance intelligent data infrastructure, NetApp helps customers overcome barriers to implementing GenAI," said Krish Vitaldevara, Senior Vice President, Platform at NetApp, in a statement quoted Thursday, October 10.

NetApp also revealed a collaboration with Domino Data Lab to improve machine learning operations (MLOps), the availability of AIPod with Lenovo for NVIDIA OVX, and new features for FlexPod AI.

Through these latest offerings and platform integrations, NetApp aims to help organizations overcome data challenges and better leverage AI and GenAI to create added value.

"Through collaboration with the industry's leading AI infrastructure vendors, NetApp customers can be assured that their compute, network, storage, and AI software solutions will integrate seamlessly to drive AI innovation," said Mike Leone, Practice Director, Data Analytics & AI, Enterprise Strategy Group, part of TechTarget.