By Ken Kaplan
Many companies are on the path to implementing artificial intelligence, while others are still unsure how they will use it to run their businesses. Either way, CIOs and IT teams have many choices to make as AI continues to evolve at lightning speed.
Some feel like the train is leaving the station and they should get on board now, according to Sean Donahue, senior solutions manager at Nutanix.
“AI is not an option,” he said. “It is not a speculative market. Companies know they have to turn to AI; they just haven’t figured out which use cases they will address first.” Donahue said this is likely because many do not fully understand how their organizations could benefit from AI. He compared it to when Thomas Edison introduced the light bulb in 1879.
“The demonstration was astonishing at first sight, but people did not understand how to use electricity, especially since there was no infrastructure to bring it into their homes.”
The adoption of artificial intelligence is a challenge many CIOs face as they look to the future. Before diving in, their teams must have the practical knowledge, skills, and resources to implement AI effectively.
AI challenges and infrastructure needs
Forward-thinking, AI-ready leaders are re-evaluating their entire IT ecosystem to build the right infrastructure to handle both existing and future AI-driven functions.
“You need data scientists, AI engineers, and machine learning operational engineers, and then you need people with good infrastructure, along with the developers who build the applications,” said Rajiv Ramaswami, president and CEO of Nutanix. “The set of tools needed to create AI applications and bring them to market is also not easy. Additionally, there is a shortage of hardware.”
AI is expensive to implement and training AI models requires a substantial investment. “To realize the potential, you have to pay attention to what it will take to achieve it, how much it will cost and make sure you make a profit,” Ramaswami said. “And then you have to go do it.”
GenAI has rapidly transformed from an experimental technology into an essential business tool, with adoption rates more than doubling in 2024, according to a recent study by AI at Wharton, a research center at the Wharton School of the University of Pennsylvania. Weekly AI usage among business leaders increased from 37% to 72%, and organizations reported a 130% increase in AI spending over 2023, the report found.
Traditional IT infrastructure is not equipped to handle high-intensity AI requirements, such as training large language models (LLMs) or processing high-volume data streams in real time. IT professionals indicated that running AI applications on their current IT infrastructure would be a “significant” challenge, according to the Enterprise Cloud Index report, published by Nutanix in early 2024.
Donahue used a practical automobile metaphor that demonstrates this challenge. “My 1949 car is not up to today’s performance demands,” he said. “I’m happy driving it, but I know it will never compete on the road. In fact, you shouldn’t drive it on the road because it’s already so outdated.” In other words, most of the existing IT infrastructure maintains the established order, but will not be able to efficiently meet the intense demands of AI workloads.
As newer automobile designs have evolved to meet higher safety, fuel efficiency and performance standards, enterprise IT infrastructures must evolve to provide greater computing power, flexibility and efficiency to handle AI applications, Donahue said.
It is essential to improve security measures and governance frameworks as companies seek to protect intellectual property and customer data within AI models. This is driving CIOs to look for infrastructure that can manage AI strategically and securely, and that is agile enough to handle future innovations and challenges.
Manage IT infrastructure that runs AI
According to Donahue, IT teams are exploring three key elements: choosing language models, leveraging AI from cloud services, and building a hybrid multicloud operating model to get the best of private and public cloud services.
“We are finding that very, very, very few people will build their own language model,” he said. “This is because building a language model internally is like building a car in the garage with spare parts.”
Enterprises are looking at cloud-based language models, but must examine security and governance capabilities while controlling costs over time. “If those things don’t scare me away from using it with my IP and corporate data, then I’ll realize at the end of the month that I’m paying hyperscalers because my AI inference application, that little lookup box my employees use to ask questions, runs on cloud GPUs, and those are not cheap,” he said.
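The end-of-month surprise Donahue describes comes down to simple arithmetic. As a rough sketch, the comparison below weighs an always-on cloud GPU inference bill against amortized on-prem hardware; every price, GPU count, and amortization period here is an illustrative assumption, not a vendor quote.

```python
# Hedged sketch: rough monthly cost of an internal AI "lookup box" served on
# cloud GPUs vs. amortized on-prem servers. All figures are assumptions for
# illustration only.

CLOUD_GPU_RATE = 2.50        # assumed $/GPU-hour for an inference-class GPU
GPUS = 4                     # assumed instances kept warm for low latency
HOURS_PER_MONTH = 730        # average hours in a month (always-on service)

ONPREM_SERVER_COST = 30_000.0  # assumed purchase price per GPU server
AMORTIZATION_MONTHS = 36       # assumed 3-year depreciation
ONPREM_OPS_MONTHLY = 800.0     # assumed power/cooling/ops per server per month

def cloud_monthly(rate=CLOUD_GPU_RATE, gpus=GPUS, hours=HOURS_PER_MONTH):
    """Pay-as-you-go bill for keeping the GPUs running all month."""
    return rate * gpus * hours

def onprem_monthly(servers=GPUS):
    """Amortized hardware cost plus operating cost per month."""
    return servers * (ONPREM_SERVER_COST / AMORTIZATION_MONTHS + ONPREM_OPS_MONTHLY)

print(f"cloud:   ${cloud_monthly():,.0f}/month")
print(f"on-prem: ${onprem_monthly():,.0f}/month")
```

The point is not the specific numbers but the shape of the curve: cloud GPU costs scale linearly with hours kept warm, which is why a modest internal tool can quietly accumulate a large hyperscaler bill.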
This leads IT teams to a third step: thinking beyond cloud-based models and considering solutions designed intentionally and specifically to handle AI functionality.
Donahue pointed to Nutanix GPT-in-a-Box, a comprehensive, pre-configured solution that combines hardware and software to support deploying AI models on-premises, in the cloud, or at the edge. The configuration is designed to streamline the deployment and operation of GPT models by providing all necessary components in a single, integrated package, bringing generative AI and AI/ML applications into IT infrastructures while keeping data and applications under the IT team’s management.
Donahue explained that GPT-in-a-Box lets existing IT teams streamline the processes needed to incorporate AI capabilities. It reduces the complexity of selecting compatible components, configuring software, and optimizing performance.
By controlling the entire stack, including hardware, software, and AI layers, IT teams can implement robust security measures designed to safeguard AI environments, including data encryption, secure access controls, and intrusion detection systems. GPT-in-a-Box also allows teams to manage performance by using the optimal resources to access data efficiently in the right location.
Application and data management in hybrid multicloud systems
According to Donahue, infrastructure must be at the center of the AI adoption strategy and there is a cloud model poised to be particularly successful: hybrid multicloud.
“Hybrid multicloud is where it’s at,” Donahue said. “AI practically demands hybrid multicloud because its data sets will be everywhere. You will need a solution like unified storage to collect and manage them under one roof.”
Hybrid multicloud environments integrate diverse computing resources and types of data storage. They facilitate efficient data management and processing, which is critical to the performance of AI systems, especially when handling large and varied data sets distributed across multiple locations.
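The "one roof" idea above can be sketched as a thin catalog layer that presents a single namespace while routing each dataset to whichever location actually holds it. This is a minimal illustration of the pattern, not any vendor's implementation: the backends are stubbed in-memory dicts, and all names (`UnifiedStore`, the location labels, the dataset keys) are hypothetical; a real unified storage layer would sit in front of object storage clients.

```python
# Hedged sketch: a catalog that routes dataset reads across on-prem, public
# cloud, and edge locations through one namespace, the way a unified storage
# layer might. Backends are stubbed dicts for illustration.

from typing import Dict

class UnifiedStore:
    def __init__(self) -> None:
        self._catalog: Dict[str, str] = {}                 # dataset -> location
        self._backends: Dict[str, Dict[str, bytes]] = {}   # location -> store

    def register(self, location: str) -> None:
        """Attach a storage location (stubbed as an empty dict here)."""
        self._backends[location] = {}

    def put(self, dataset: str, location: str, data: bytes) -> None:
        """Write data to a specific location and record it in the catalog."""
        self._catalog[dataset] = location
        self._backends[location][dataset] = data

    def get(self, dataset: str) -> bytes:
        """Read by name alone; the catalog resolves where the data lives."""
        location = self._catalog[dataset]
        return self._backends[location][dataset]

store = UnifiedStore()
store.register("on-prem")
store.register("public-cloud")
store.put("customer-embeddings", "on-prem", b"vectors")
store.put("clickstream", "public-cloud", b"events")
print(store.get("clickstream"))   # caller never names the cloud
```

The design choice worth noting is that consumers address data by name, not by location, which is what lets data sets "be everywhere" without every AI application hard-coding where each one lives.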
“People already using hybrid multicloud will probably have an easier time getting started with their AI efforts,” Donahue said.
Prioritizing infrastructure modernization is essential. Adopting AI effectively requires businesses to re-evaluate and revitalize their underlying IT systems, focusing on the future and achieving the scalability, capacity, efficiency and key analytical capabilities needed to keep up in a rapidly changing IT world.
Learn more about Nutanix Enterprise AI capabilities in this blog post and in video coverage of its launch in November 2024.
Ken Kaplan is Editor in Chief of The Forecast by Nutanix. Find him on X @kenekaplan.