GenAI in 2025: Accelerating the Adoption Curve
Organizations across every industry increasingly recognize the transformative potential of generative AI (GenAI) and the need to invest in this technology. According to a recent MeriTalk survey, 83 percent of IT decision-makers say their organization will fall behind their competition if they do not embrace GenAI.
The research, conducted in partnership with Dell Technologies and Intel, surveyed 300 IT decision-makers familiar with their organization’s GenAI plans to explore how they are working to support their organizations through GenAI-driven transformation in 2025 and beyond.
The shift toward an AI-first mindset in many organizations is happening organically because employees are being exposed to GenAI functionality embedded in the applications they use every day, such as Microsoft Copilot, according to Matt Allard, Director of Alliances and Solutions at Dell Technologies.
“The companies that are most ahead are ones that are looking at opportunities to drive productivity, efficiency, and insights strategically. They often have some experts in the organization who know enough about what’s possible with AI to help guide proofs of concept or pilots, which then can bloom into bigger implementations,” Allard said.
The research found that half of IT decision-makers are considering or plan to create a custom GenAI application in 2025.
As an entry point for many GenAI projects, Allard said he works with organizations to combine pre-trained models with retrieval-augmented generation (RAG) over the customer’s own data, quickly producing accurate, data-driven insights. Developers can pilot and deploy GenAI models securely on-premises with Dell Precision workstations and then build larger custom projects with an AI factory approach. An AI factory is a portfolio of products, solutions, and services tailored for AI workloads and designed for fast, repeatable outcomes.
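To make the RAG pattern concrete, here is a minimal sketch of the idea: retrieve the most relevant internal documents for a query, then hand them to a pre-trained model as context. The embed() and generate() functions are hypothetical stand-ins, not any specific Dell or partner tooling; a real deployment would call an actual embedding model and an on-premises LLM endpoint.

```python
# Minimal RAG sketch: rank internal documents against a query, then prompt
# a pre-trained model with the retrieved context. embed() and generate()
# are hypothetical placeholders for real models.
from math import sqrt


def embed(text: str) -> list[float]:
    # Placeholder: a real system would call an embedding model here.
    # This toy version hashes characters into a fixed-length vector.
    vec = [0.0] * 64
    for ch in text.lower():
        vec[ord(ch) % 64] += 1.0
    norm = sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))


def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank the organization's own documents by similarity to the query.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


def generate(prompt: str) -> str:
    # Placeholder: a real system would call a pre-trained LLM hosted
    # on-premises, for example on a workstation or server.
    return f"[model response grounded in a prompt of {len(prompt)} characters]"


def answer(query: str, documents: list[str]) -> str:
    context = "\n".join(retrieve(query, documents))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)


if __name__ == "__main__":
    internal_docs = [
        "Q3 sales in the EMEA region grew 12 percent year over year.",
        "The HR onboarding policy was updated in January.",
        "The data retention policy requires deleting logs after 90 days.",
    ]
    print(answer("How did EMEA sales perform?", internal_docs))
```

The appeal of this entry point is that the pre-trained model itself is untouched: the organization’s data stays in its own retrieval layer, which is also what makes it practical to keep sensitive information behind a firewall.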
Approximately 60 percent of IT decision-makers reported that traditional IT environments are ill-equipped to handle GenAI. The AI factory approach provides scalable, modular building blocks for AI and is designed to grow with an organization’s GenAI needs, Allard said.
“We work with partners like Intel, Hugging Face, Meta, and Microsoft to provide tools that help developers better understand the equipment, environment, and infrastructure to deploy GenAI for a particular application or use case,” he said. “It’s becoming easier to build original GenAI tools. As public programming toolsets become simpler, they enable IT developers to become AI developers.”
With end-to-end processing capabilities from the Dell AI Factory, organizations can deploy GenAI models and applications as close as possible to where data resides, from desktop to data center to cloud, to minimize latency, lower costs, and maintain data security by keeping sensitive information within a controlled environment.
The survey also found that 85 percent of IT decision-makers believe that GenAI requires an open, modular IT architecture. Large language models (LLMs) and GenAI software are evolving rapidly, enabling organizations to build applications that use multiple LLMs for various functionalities and enabling them to switch out LLM components for improved or additional functionality in the future, according to Allard.
For example, to create Andie, Dell Technologies’ digital human, a virtual assistant driven by GenAI, four LLMs were used for speech-to-text, text-to-speech, 3D graphic representation, and audio sync.
“Our thinking is that an open and componentized approach for AI applications will allow customers to evolve their software with best-in-class components and to continue to get the best of what’s being developed by the primary foundation model makers,” Allard said.
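One way to read this componentized approach is that each capability sits behind a small interface, so an individual model can be swapped out without touching the rest of the application. The sketch below illustrates the pattern in simplified form; all class names are illustrative assumptions, not an actual Dell or partner API.

```python
# Sketch of a componentized GenAI pipeline: each capability (speech-to-text,
# language model, text-to-speech) is defined by an interface, so any single
# component can be replaced with an improved model later.
from typing import Protocol


class SpeechToText(Protocol):
    def transcribe(self, audio: bytes) -> str: ...


class LanguageModel(Protocol):
    def complete(self, prompt: str) -> str: ...


class TextToSpeech(Protocol):
    def synthesize(self, text: str) -> bytes: ...


class DummySTT:
    # Placeholder backend; a real deployment would wrap a speech model.
    def transcribe(self, audio: bytes) -> str:
        return audio.decode("utf-8", errors="ignore")


class EchoLLM:
    # Placeholder backend; a real deployment would wrap a foundation model.
    def complete(self, prompt: str) -> str:
        return f"(response to: {prompt})"


class DummyTTS:
    # Placeholder backend; a real deployment would wrap a speech synthesizer.
    def synthesize(self, text: str) -> bytes:
        return text.encode("utf-8")


class Assistant:
    # Composes independent components; any one can be swapped without
    # changing the others.
    def __init__(self, stt: SpeechToText, llm: LanguageModel, tts: TextToSpeech):
        self.stt, self.llm, self.tts = stt, llm, tts

    def respond(self, audio_in: bytes) -> bytes:
        text = self.stt.transcribe(audio_in)
        reply = self.llm.complete(text)
        return self.tts.synthesize(reply)


if __name__ == "__main__":
    assistant = Assistant(DummySTT(), EchoLLM(), DummyTTS())
    print(assistant.respond(b"What can GenAI do for my team?").decode())
```

The design choice mirrors Allard’s point: because the application depends only on the interfaces, a better foundation model can be dropped in later without rewriting the assistant itself.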
From a hardware perspective, as more GenAI applications run simultaneously on a machine, the number of LLMs operating on that machine grows, increasing the workload on the NPU or GPU performing the processing. Modular, scalable GenAI processing and open infrastructure can enable users to start a project in one domain, such as a workstation, and easily scale up and move that project to a server domain when additional processing capability is required.
“We know we’re at a stage where a lot of companies are exploring and innovating, and they may not be ready to go to scale, but when they are, they will want to have a toolset that lends itself to carrying things from a sandbox environment, like a workstation, to a server environment for deployment,” Allard said.
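In practice, carrying a project from a workstation sandbox to a server deployment is easiest when the model endpoint is configuration rather than code. The snippet below is a minimal sketch of that idea; the environment variable name and default URL are illustrative assumptions, not part of any specific product.

```python
# Sketch: the same application code targets a locally hosted model on a
# workstation or a server-hosted endpoint in production, selected purely
# by configuration.
import os


def model_endpoint() -> str:
    # Hypothetical variable name; on a workstation this defaults to a local
    # model server, while a deployment sets it to a data-center endpoint.
    return os.environ.get("GENAI_ENDPOINT", "http://localhost:8000/v1/completions")


if __name__ == "__main__":
    print(f"Sending inference requests to: {model_endpoint()}")
```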
As organizations increasingly expose their employees to GenAI applications, they’re developing comprehensive frameworks that extend beyond basic user policies. These frameworks tackle critical issues like data utilization, programming variables, and inherent biases. Over the next year, 51 percent of IT decision-makers anticipate creating or expanding their GenAI guidelines and governance frameworks.
Allard said organizations need to offer employees opportunities to use GenAI technologies in their work so they can experiment safely and understand the possibilities the technology offers. He advises user policies and training that emphasize that AI is a helper, but its outputs ultimately need the review and approval of a human being.
“Our feeling is that companies should be trying it. They should be trying it in a safe way. For us, that means keeping data behind a firewall for security. But give employees access to it. These are productivity tools that can make a difference in our day-to-day lives,” he said. “I think the possibilities are absolutely endless with AI.”
For more insights, review the listicle: https://www.mypossibilit.com/resource/openness-and-innovation-genai-essentials-for-2025/