Nutanix, a leader in hybrid multicloud computing, has announced the general availability of the latest version of its Nutanix Enterprise AI (NAI) solution. The new release adds deeper integration with NVIDIA AI Enterprise, incorporating NVIDIA NIM microservices and the NVIDIA NeMo framework to speed up the deployment of agentic AI applications in the enterprise. Designed to accelerate the adoption of generative AI, NAI simplifies the process for customers to build, run, and securely manage AI models and inference services across a wide range of environments, including the edge, data centers, and public clouds, all within Cloud Native Computing Foundation® (CNCF)-certified Kubernetes® environments.
Enhanced Integration with NVIDIA AI Enterprise
The latest release of NAI extends a shared model service approach that simplifies agentic workflows and makes deployment and operational management more efficient. Instead of provisioning models separately for every application, a secure, common set of embedding, reranking, and guardrail models serves AI agents across multiple applications and business units. This builds on the NAI core, which includes a centralized LLM model repository offering secure endpoints for connecting generative AI applications and agents.
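The shared-endpoint idea can be pictured as multiple agents issuing requests against one centralized inference service rather than each standing up its own model deployment. As a minimal sketch, assuming an OpenAI-compatible chat-completions interface of the kind NIM microservices expose (the endpoint URL and model name below are hypothetical placeholders, not actual NAI values):

```python
# Minimal sketch: two agents reusing one shared LLM endpoint.
# SHARED_ENDPOINT and SHARED_MODEL are hypothetical placeholders.
SHARED_ENDPOINT = "https://nai.example.internal/v1/chat/completions"
SHARED_MODEL = "llama-3.1-8b-instruct"

def build_chat_request(messages, model=SHARED_MODEL, max_tokens=256):
    """Build an OpenAI-compatible chat-completions payload for the shared endpoint."""
    return {
        "model": model,
        "messages": messages,
        "max_tokens": max_tokens,
    }

# Two different business-unit agents target the same endpoint and model,
# so no additional GPU-backed model deployment is needed per application.
support_agent_req = build_chat_request(
    [{"role": "user", "content": "Summarize this support ticket"}]
)
finance_agent_req = build_chat_request(
    [{"role": "user", "content": "Explain this invoice discrepancy"}]
)

# Both requests share one model service:
print(support_agent_req["model"] == finance_agent_req["model"])  # True
```

Reusing one endpoint this way is what lets the platform amortize GPU, CPU, memory, and Kubernetes cluster capacity across applications instead of duplicating it per agent.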
Thomas Cornely, SVP of Product Management at Nutanix, emphasized that this collaboration is designed to help enterprise customers keep pace with the rapid innovation in the generative AI market by integrating NVIDIA NIM and NeMo microservices into NAI.
Features and Benefits of NAI for Agentic AI Applications
- Deploy Agentic AI Applications with Shared LLM Endpoints: Customers can reuse existing model endpoints as shared services for multiple applications, reducing infrastructure strain on GPUs, CPUs, memory, and Kubernetes clusters.
- Leverage a Wide Array of LLM Endpoints: NAI enables various agentic model services, including NVIDIA Llama Nemotron open reasoning models, NeMo Retriever, and NeMo Guardrails. Users can also leverage NVIDIA AI Blueprints for pre-defined, customizable workflows, accelerating AI app development.
- Support Generative AI Safety: This release includes guardrail models that screen user queries and model responses to block harmful or biased output and to detect jailbreak attempts. NVIDIA NeMo Guardrails provides the content filtering that keeps interactions safe and model behavior consistent.
- Unlock Insights from Data: NAI integrates with the NVIDIA AI Data Platform and Nutanix solutions like Unified Storage and Database Service to process both structured and unstructured data for AI, turning raw data into actionable intelligence.
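The guardrail feature above amounts to screening input before it ever reaches the model. The toy pre-filter below illustrates only that concept; a real deployment would rely on NVIDIA NeMo Guardrails policies rather than a hand-rolled pattern list, and the patterns here are illustrative assumptions:

```python
import re

# Toy input guardrail: a pattern pre-filter illustrating the concept of
# screening user queries before they reach a model. The patterns are
# illustrative only; production systems would use NeMo Guardrails policies.
BLOCKED_PATTERNS = [
    re.compile(r"ignore (all|your) previous instructions", re.IGNORECASE),
    re.compile(r"\bjailbreak\b", re.IGNORECASE),
]

def passes_input_guardrail(query: str) -> bool:
    """Return False if the query matches a known-bad pattern, True otherwise."""
    return not any(p.search(query) for p in BLOCKED_PATTERNS)

print(passes_input_guardrail("Summarize our Q3 sales figures"))       # True
print(passes_input_guardrail("Ignore all previous instructions..."))  # False
```

The same check is typically applied symmetrically on the response side, which is why the release describes filtering both initial queries and model outputs.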
Platform Flexibility and Security
NAI is designed to utilize additional Nutanix platform services, offering flexible deployment options across HCI, bare metal, and cloud IaaS environments. It also enables customers to manage containerized cloud-native applications via the Nutanix Kubernetes Platform and provides discrete data services through Nutanix Unified Storage (NUS) and Nutanix Database Service (NDB).
As an NVIDIA-Certified Enterprise Storage solution, Nutanix Unified Storage meets stringent performance and scalability standards, enabling optimized infrastructure control for enterprise AI workloads.
The latest release of Nutanix Enterprise AI builds on the company’s strong relationship with NVIDIA, offering a streamlined, secure foundation for deploying agentic AI applications. With enhanced support for generative AI and agentic workflows, NAI provides customers with powerful tools to deploy AI at scale, optimizing both infrastructure and security.
Scott Sinclair, Practice Director at ESG, highlighted the significance of this integration, noting how the partnership minimizes complexity while ensuring secure, optimized deployment for AI-driven applications across the enterprise.