New Machines For A New World

How AI is turning data into “wisdom we’ve never had before.”

Not since the beginning of the Industrial Revolution nearly 300 years ago has the world experienced technological advancement as rapid and profound as it is seeing today, at the dawn of the age of artificial intelligence.

Instead of the steam and steel that fired the Industrial Revolution, the raw materials of the AI revolution are data and information, said Dell Technologies Vice Chairman and COO Jeff Clarke during a keynote Tuesday at Dell Technologies World in Las Vegas.

“The new machines are GPUs capable of massive parallel processing performing trillions of floating-point operations per second,” Clarke said. “That, when coupled with high-speed AI fabrics, AI speed storage, the right models and data tools, transforms data and information into insights, knowledge and wisdom we’ve never had before.”

A new computing architecture is required to handle AI workloads; traditional architectures are simply not equipped for the task, Clarke said. Dell calls this new architecture the Dell AI Factory, and like any factory, it takes a raw material, in this case data, and turns it into something useful: insight and knowledge.

The strategy behind the Dell AI Factory is based on several key assumptions: namely, that the vast majority of data sits on-premises rather than in a public cloud, and that half of enterprise data is generated at the edge.

This means that for the sake of efficiency and security, AI must be brought to the data, rather than the other way around, Clarke said. It also means there’s no one-size-fits-all approach to AI, and that AI requires a broad, open ecosystem and an open, modular architecture, he said.

Most enterprises will not train large language models themselves but will instead turn to open-source models to bring GenAI into their businesses, Clarke said. Over time, smaller, optimized open-source models will help enterprises achieve better performance and efficiency. Intelligent data pipelines will ensure AI is primed with correct, comprehensive data, and inferencing will be performed anywhere an AI-guided outcome is desired, he said.
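As a rough illustration of the pattern Clarke described, the sketch below pulls a small open-weight model with the Hugging Face transformers library and runs a single inference locally. It is a minimal sketch only; the model name, prompt and generation settings are illustrative placeholders and are not from the keynote.

from transformers import pipeline

# Minimal sketch: load an open-weight model and run inference where the data lives.
# The model name and prompt below are assumed examples, not a Dell-specified setup.
generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # example open model; any open-weight checkpoint could be used
    device_map="auto",                            # run on a local GPU if one is available
)

result = generator("Summarize the key themes in this quarter's support tickets:", max_new_tokens=64)
print(result[0]["generated_text"])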

Because of this, Clarke said, AI Factories must come in all shapes and sizes, from a mobile workstation or a single server to multiple data centers containing hundreds of thousands of GPUs connected as a single, cognitive computer.

The best way to put this new architecture to work is to separate it from legacy systems and optimize each for its respective workload, Clarke said. AI workloads require accelerated compute, optimized high-speed storage, high-throughput, low-latency network fabrics, data protection, integration into a common data pipeline and AI PCs.

This may seem like a lot. These are complex, highly technical, engineering-intensive systems that require high levels of software and networking expertise. Still, Clarke’s advice was simple: “Start getting ready now.”

In the coming years, compute requirements are expected to rise astronomically. By 2030, only about 10% of compute demand will be earmarked for training; the lion’s share will be used for inferencing. Also by 2030, data center capacity will grow 8x, and by the end of the decade the PC installed base will refresh, putting two billion AI PCs in use, Clarke said.

To dig into the networking side of the AI equation, Clarke welcomed Broadcom President Charlie Kawwas to the stage. Kawwas said networking is essential to the communication required to bring large-scale AI to life, and he discussed the Broadcom components used in new Dell switches and servers.

Clarke was also joined on stage by Arthur Lewis, president of Dell’s Infrastructure Solutions Group, who walked through the company’s AI solutions portfolio in detail.

“The advancements we’ve seen in AI will not only accelerate the value the world’s data will bring to organizations of all sizes. It will forever change the architecture of data centers and data flows,” Lewis said. “Silos of the past will be dismantled. Everything will be connected.”

“We sit at the center of an AI revolution,” Lewis said, “and we have the world’s broadest AI solutions portfolio from desktop to data center to cloud, a growing and great ecosystem of partners and a full suite of professional and consulting services.”

Lewis gave a rundown of Dell’s AI portfolio and previewed future solutions. To talk more about that growing ecosystem of AI partners, Lewis brought Dell SVP and CTO for AI, Compute and Networking Ihab Tarazi to the stage. Tarazi was joined by Sy Choudhury, director of AI partnerships at Meta.

Open-source models like Meta’s Llama 3 are driving “the most impactful innovations” in AI, Tarazi said.

By using open-source models, companies contribute to performance and security improvements, Choudhury said. Meta’s partnership with Dell means the companies can deliver well-integrated and highly optimized solutions that customers can leverage for a wide variety of use cases.

“And with our permissive license,” Choudhury said, “customers like those represented here today can easily leverage the state-of-the-art nature of the Llama 3 models as-is, or fine-tune them with their own data.”

Tarazi and Choudhury walked through a demo of Llama 3 working on a product development project, and Tarazi welcomed Hugging Face Product Head Jeff Boudier to the stage to talk about the company’s efforts to make it easy for customers to build their own secure, private AI systems using Dell and open-source models. The pair introduced and demonstrated the Dell Enterprise Hub.

Sam Burd, president of Dell’s Client Solutions Group, took the stage to highlight the central role AI PCs will play in the broader AI revolution.

Dell’s AI PCs, Burd said, are “a true digital partner enabling software developers, content creators, knowledge workers, sales makers and everyone in between to be more efficient, solve problems faster, and focus on the more meaningful, strategic work.”

To illustrate the point, Burd welcomed Deloitte Chief Commercial Officer Dounia Senawi to the stage to talk about the increased productivity, speed and cost efficiency the company has realized with Dell AI PCs. He also teamed with Microsoft Corporate Vice President for Windows Matt Barlow to demo the capabilities of AI PCs.

Another of Dell’s high-profile customers is McLaren Racing, and its CEO, Zak Brown, joined Clarke on stage to talk about how the team is putting Dell AI solutions to use in development and on the track.

Technology in general, and AI in particular, is the difference between success and failure for McLaren, Brown said. The team uses AI to run simulations that help it make critical decisions quickly. “We see millions of simulations before we make a decision,” Brown said. “And we work in real time. We have a split second to make a decision.”

All of that happens against a backdrop of furious competition. “Take the car that’s on pole for the first race,” Brown said. “If it’s untouched, by the end of the year it would be dead last. That’s the pace of development. We used to be able to spend our way out of problems. Now you have to make sure what starts in the digital world works on the racetrack.”

Dell is bringing this new world to life with a set of familiar strengths, namely its unique operating model, including an end-to-end portfolio, an industry-leading supply chain, the industry’s largest go-to-market engine and world-class services.

“Our strategy, in its simplest form,” Clarke said, “is to accelerate the adoption of AI.”

About the Author: Matt Brown

Matt Brown is Dell Technologies’ senior managing editor. Before joining Dell in 2019, Matt was an award-winning journalist for various daily and business publications. He lives in Connecticut with his wife, two children and a small dog.