AI infrastructure solution accelerates customers’ path to GPT and LLMs while keeping organizations in control of their data
Nutanix (NASDAQ: NTNX), a leader in hybrid multi-cloud computing, today announced the Nutanix GPT-in-a-Box™ solution for customers looking to jump-start their artificial intelligence (AI) and machine learning (ML) innovation, while maintaining control over their data.
The new offering, available today, combines a full-stack, software-defined, AI-ready platform with services that help organizations size and configure the hardware and software infrastructure needed to deploy a curated set of large language models (LLMs) using leading open-source AI and MLOps frameworks on the Nutanix Cloud Platform™.
It allows customers to easily procure AI-ready infrastructure to fine-tune and run generative pre-trained transformer (GPT) models, including LLMs, at the edge or in their data centre.
Privacy Concerns about AI/ML Applications
Many enterprises are grappling with how to quickly, efficiently and securely take advantage of the power of generative AI and AI/ML applications, especially for use cases that cannot be run in the public cloud because of data sovereignty, governance and privacy concerns. New use cases emerge every day as organizations look to leverage generative AI to improve customer service, developer productivity, operational efficiency and more.
From automated transcription of internal documents to high-speed search and automated analysis of multimedia content, many organizations see the opportunity AI presents.
Leakage of Intellectual Property
Still, they are wrestling with growing concerns about intellectual property leakage, compliance, and privacy. Organizations looking to build an AI-ready stack also struggle with how best to support ML administrators and data scientists, and the prospect of large AI investments has left many enterprises stalled in their AI and ML strategy.
“As customers look to design and deploy generative AI solutions, they find themselves struggling with balancing the deep expertise required to install, configure, and run these workloads with concerns around their data security and protecting company IP – all while controlling costs,” said Greg Macatee, Senior Research Analyst, Infrastructure Systems, Platforms and Technologies Group at IDC.
“With GPT-in-a-Box, Nutanix offers customers a turnkey, easy-to-use solution for their AI use cases, offering enterprises struggling with generative AI adoption an easier on-ramp to deployment.”
Ready-to-use Customer-controlled AI
The Nutanix GPT-in-a-Box solution delivers ready-to-use, customer-controlled AI infrastructure for the edge or the core data centre, allowing customers to run and fine-tune AI and GPT models while maintaining control over their data. Nutanix complements this with a full set of security and data protection offerings well suited to protecting AI data.
This new solution includes:
- The industry-leading Nutanix Cloud Infrastructure™ platform, with the Nutanix Files Storage™ and Objects Storage™ solutions, the Nutanix AHV® hypervisor and Kubernetes, along with NVIDIA GPU acceleration, sized for small- to large-scale deployments.
- Nutanix services to help customers size their cluster and deploy an opinionated stack with leading open-source deep learning and MLOps frameworks, an inference server, and a curated set of large language models such as Llama2, Falcon and MPT.
- The ability for data scientists and ML administrators to immediately consume these models with their choice of applications, an enhanced terminal UI, or a standard CLI.
- The ability to run other GPT models, as well as fine-tune these models using internal data hosted on the included Nutanix Files or Objects Storage services (see the illustrative sketch below).
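To make that workflow concrete, here is a minimal, illustrative sketch of how a data scientist might consume one of the curated open-source models on a GPU-enabled node and stage internal fine-tuning data from an S3-compatible object store. It uses the open-source Hugging Face Transformers library and boto3; the model ID, endpoint URL, bucket, object names and credentials are hypothetical placeholders, not part of the GPT-in-a-Box product interface.

```python
# Illustrative sketch only: the model ID, endpoint URL, bucket names and
# credentials below are placeholders, not the GPT-in-a-Box interface.
import boto3
from transformers import pipeline

# Load one of the curated open-source LLMs (here, a Llama 2 chat variant)
# with Hugging Face Transformers; device_map="auto" places the weights on
# the available NVIDIA GPU(s).
generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-7b-chat-hf",  # hypothetical curated model
    device_map="auto",
)
result = generator("Summarize this support ticket: ...", max_new_tokens=128)
print(result[0]["generated_text"])

# Stage internal fine-tuning data from an S3-compatible object store such as
# Nutanix Objects Storage, so the data never leaves the customer's environment.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.internal.example",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",                   # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)
s3.download_file("finetune-data", "tickets/corpus.jsonl", "corpus.jsonl")
```

Because the object store exposes an S3-compatible API, existing data tooling can be pointed at an on-premises bucket, which is what keeps fine-tuning data under the organization's control.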
“Leveraging AI to more efficiently and effectively help our customers is a top priority for us but, as a regulated financial services organization, maintaining full control over our data is necessary,” said Jon Cosson, CISO at JM Finn. “The Nutanix Cloud Platform delivers the performance, flexibility and security required to safely deploy AI workloads.”
Nutanix’s expertise and involvement in the open-source AI community provide customers with a strong foundation on which to build their AI strategy. Key contributions include: participation in the MLCommons (AI standards) advisory board; co-founding and technical leadership in defining the ML Storage Benchmarks and Medicine Benchmarks; serving as a co-chair of the Kubeflow (MLOps) Training and AutoML working groups at the Cloud Native Computing Foundation (CNCF).
By Thomas Cornely, SVP, Product Management at Nutanix