NetApp announced NetApp AIPod with Lenovo ThinkSystem servers for NVIDIA OVX, a converged infrastructure optimised for generative AI (GenAI) that the company says enables the use of private, proprietary data for AI without large-scale model training.

"The NetApp AIPod with Lenovo ThinkSystem servers for NVIDIA OVX transforms enterprise AI by delivering a pre-integrated, high-performance solution that accelerates the deployment and scaling of generative AI workloads," said Sandeep Singh, senior vice president and general manager of Enterprise Storage at NetApp.
Converged infrastructure solution
NetApp AIPod combines NetApp storage systems and hybrid cloud data management with Lenovo's high-performance ThinkSystem SR675 V3 servers to provide an infrastructure solution that helps organisations unlock the full potential of AI.
NetApp says the solution enables customers to run retrieval-augmented generation (RAG) and inferencing operations that power chatbots, knowledge management, and object recognition.
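For context, RAG is how such a system answers questions from private data without retraining a model: relevant documents are retrieved and passed to the model as part of the prompt at inference time. The sketch below is a minimal, generic illustration of that pattern, not NetApp's implementation; the embed() and generate() functions are hypothetical stand-ins for a real embedding model and LLM endpoint.

    # Minimal RAG sketch: answer questions from private documents
    # without retraining a model. embed() and generate() are placeholders.
    import math
    from collections import Counter

    def embed(text: str) -> Counter:
        # Toy bag-of-words "embedding"; a real deployment would call an embedding model.
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
        # Rank private documents by similarity to the query; keep the top k.
        q = embed(query)
        ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
        return ranked[:k]

    def generate(prompt: str) -> str:
        # Placeholder for an inference call to a hosted or on-premises LLM.
        return f"[model response to prompt of {len(prompt)} characters]"

    def answer(query: str, documents: list[str]) -> str:
        # Augment the prompt with retrieved context, then run inference.
        context = "\n".join(retrieve(query, documents))
        prompt = f"Use only this context:\n{context}\n\nQuestion: {query}\nAnswer:"
        return generate(prompt)

    if __name__ == "__main__":
        docs = [
            "Our warranty covers hardware failures for three years.",
            "Employees accrue 20 days of paid leave per year.",
        ]
        print(answer("How long is the hardware warranty?", docs))

Because the proprietary data only enters at retrieval time, it never has to be baked into the model through training, which is the point the announcement makes about using private data without large-scale model training.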

Working with Lenovo and NVIDIA, NetApp says the solution simplifies AI deployment by integrating into existing ecosystems, streamlining operations, accelerating implementation, and protecting AI infrastructure.
“As customers deploy AI, they demand business-critical availability, ease of management, and infrastructure efficiency. The NetApp AIPod with Lenovo ThinkSystem servers for NVIDIA OVX delivers optimised and validated solutions to make generative AI more accessible for businesses of every size,” said Kirk Skaugen, president of Lenovo Infrastructure Solutions Group.