The Skunkworks Experiment

How a one-off VR experiment at Cream Productions evolved into an AI-powered creative engine, fueled by Dell and NVIDIA innovation.

tl;dr: Cream Productions transformed creative storytelling by pioneering VR, AI, and innovative workflows. From recreating Hitchcock’s Psycho in VR to AI-generated content for History Channel, their journey highlights the power of Dell Pro Max workstations and NVIDIA RTX PRO GPUs. This infrastructure-first approach drives industry-leading innovation, positioning them at the forefront of creative technology advancements.


Andrew MacDonald’s journey began with a pitch meeting that took an unexpected turn. After 18 years as a cinematographer shooting solo assignments for National Geographic and the History Channel, he transitioned to producing for Cream Productions. He walked into Dave Brady’s office to pitch a show, but Cream’s forward-thinking owner had different questions.

Brady showed interest in the show but quickly fixated on virtual reality. He wanted to gauge Andrew’s experience and knowledge: had he tried it, what did he think of it, and more importantly, could it be used to tell stories?

That conversation launched an unprecedented department dedicated to exploring future technology. Andrew agreed to come back with a budget, joking that they’d “tape a bunch of GoPros together and shoot some VR.” At the time, it felt like a one-off “skunkworks” experiment, but in actuality, it was a step toward redefining how Cream embraced technology.

The moment aligned perfectly with VR’s first major hype cycle. For their debut project, Andrew chose an ambitious homage: recreating Alfred Hitchcock’s famous shower scene from Psycho in VR. Hitchcock had solved the challenge of depicting violence within 1950s censorship standards through inventive camera angles; MacDonald placed the viewer inside the scene. The VR version added an immersive layer that struck a viral chord.

Networks noticed and offered small budgets for VR promotional content tied to existing TV series. The pilot projects led to larger opportunities, culminating in Hulu’s VR channel, which launched in 2019.

The crash and pivot

VR’s limitations surfaced quickly, and the hype faded. “There was really no way to monetize it,” Andrew says. “It was a bit of a novelty. Once people had seen a VR film or two, they were like, ‘Okay, seen it, what’s next?’”

Cream pivoted to volumetric capture and game-engine development, aiming to bring their TV production work with celebrities like Dominic Monaghan and Nikolaj Coster-Waldau into VR storytelling. On paper, it sounded straightforward: they needed to take their knowledge and apply it to something new.

Once the team began, they realized they hadn’t accounted for the complexity of the task: making high-fidelity digital celebrities run on mobile VR processors. The commitment proved technically impossible with conventional volumetric capture methods.

This crisis drove breakthrough innovation. Faced with rendering demands that far exceeded what standardized workflows could handle, the team turned to Dell Pro Max workstations powered by NVIDIA RTX PRO GPUs. Experimenting with new techniques on systems purpose-built for creative pipelines, they developed and patented a solution that projected helmet-mounted camera footage onto low-polygon 3D character models. Instead of requiring high-resolution facial geometry, details were derived from video playing on the face’s surface plane.

Source: Survivorman VR Simulation: From TV Show to Immersive Game

“You don’t need a high-poly head anymore. You just have all that detail coming from a video, playing on the Cartesian plane of the face,” Andrew explains. The resulting patented pipeline became a cornerstone of Cream’s unique new VR production.

Source: Survivorman VR Simulation: From TV Show to Immersive Game
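The core idea can be sketched in a few lines of Python. This is not Cream’s patented pipeline, just a minimal illustration under assumed names and shapes: color is sampled from a video frame at each UV coordinate of a coarse face mesh, so surface detail comes from the footage rather than from dense geometry.

```python
# Minimal sketch (illustrative only, not Cream's patented pipeline):
# texture a low-poly face by sampling a video frame at each vertex's UV
# coordinate instead of storing high-resolution facial geometry.
import numpy as np

def sample_frame_at_uv(frame: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Nearest-neighbour sample of an (H, W, 3) frame at (N, 2) UV coords in [0, 1]."""
    h, w, _ = frame.shape
    px = np.clip((uv[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    py = np.clip((uv[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    return frame[py, px]  # per-vertex colour comes from the video, not geometry

# Stand-in "helmet camera" frame and a coarse face mesh with hypothetical UVs.
frame = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
face_uvs = np.array([[0.45, 0.30],   # left eye region
                     [0.55, 0.30],   # right eye region
                     [0.50, 0.55],   # nose tip
                     [0.50, 0.75]])  # mouth

vertex_colours = sample_frame_at_uv(frame, face_uvs)
print(vertex_colours.shape)  # (4, 3): detail is refreshed from the video every frame
```

Because the heavy lifting is a texture lookup rather than dense geometry, the same character can run on a mobile VR processor while the facial detail stays photographic.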

The AI breakthrough

By late 2024, the infrastructure investment philosophy faced its biggest test. The team had been experimenting with AI video generation for over two years, starting with frame-by-frame Midjourney workflows and progressing to early diffusion models. The technology showed promise, but legal barriers prevented broadcast use.

When enterprise AI models launched with ethical sourcing guarantees and insurance clearance, “the floodgates opened,” Andrew says. AI-generated content began to clear broadcast standards.

This breakthrough coincided perfectly with A&E’s History Channel series “Life After People,” a scientific exploration of Earth after humanity disappears. Traditional television production would budget for two major VFX shots and fill the rest with stock footage and interviews. AI changed the creative equation.

“A photorealistic bear wandering around in a grocery store would be a $20,000 or $30,000 VFX shot, or $20,000 or $30,000 for an on-location shot with a bear wrangler and safety person,” Andrew explains. “Now, something like that has become possible to prompt into existence.”

But prompting isn’t enough. The workflow demands the ability to refine, adjust timing, and respond to director feedback in real time. Cloud-based AI services imposed waiting times that killed momentum. In contrast, Dell Pro Max workstations running local ComfyUI deployments on NVIDIA RTX PRO 6000 Blackwell GPUs transformed the entire dynamic.

“If you’re waiting four hours to get a five-second video back, that’s counterproductive to the creative workflow,” MacDonald shares. Local processing kept pace with the demands of the director.
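A concrete way to see the difference: queuing a generation against a locally hosted ComfyUI instance is a single HTTP call to the workstation itself, so iteration speed is bounded by GPU time rather than a remote render queue. Below is a minimal sketch, assuming ComfyUI is running on its default local port and that “workflow_api.json” is a workflow exported from the ComfyUI editor in API format.

```python
# Minimal sketch: queue a workflow on a locally hosted ComfyUI instance so that
# generation runs on the workstation's own GPU instead of a remote service.
# Assumes ComfyUI is running on its default port and "workflow_api.json" is a
# workflow exported from the ComfyUI editor in API format.
import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188/prompt"  # ComfyUI's default local endpoint

def queue_workflow(path: str) -> dict:
    with open(path, "r", encoding="utf-8") as f:
        workflow = json.load(f)                        # node graph exported from the UI
    payload = json.dumps({"prompt": workflow}).encode("utf-8")
    req = urllib.request.Request(COMFYUI_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:          # returns a prompt_id to track the job
        return json.load(resp)

if __name__ == "__main__":
    print(queue_workflow("workflow_api.json"))
```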

Industry recognition

The infrastructure advantage produced measurable business results. When A&E executives reviewed the AI-generated content, “they were impressed because they hadn’t seen content of this quality from any of the other people they were working with,” Andrew recalls. This led the network to begin internally sharing Cream’s work across departments.

MacDonald’s team leads commercial AI offerings by running sophisticated ComfyUI workflows locally. “Most of the stuff that you do in ComfyUI today is eventually repackaged into commercial platforms, but we’re a step ahead because we’re always running the cutting edge of what’s possible,” MacDonald explains.

The infrastructure philosophy

The decade-long journey demonstrates that an early infrastructure investment in emerging technologies establishes competitive advantages when those technologies mature. In the past, Dell Pro Max workstations enabled experimental VR development. Today, NVIDIA RTX PRO Blackwell GPUs power industry-leading AI production workflows.

“Dell has been great. They’ve been so supportive of us,” Andrew reflects. What began as an experimental VR “skunkworks” has evolved into a sustainable competitive advantage across multiple technology cycles. With Dell and NVIDIA as foundational pillars, Cream’s infrastructure-first philosophy positions the team to lead, rather than follow, the next wave of creative innovation as AI continues to evolve.

Powering the future of creative production

Andrew sees the industry nearing an inflection point where local AI deployment may become the professional standard. “There’s a trend where open-source AI companies prefer users to run models locally rather than via API on their servers, since they’re operating at a loss to gain market share,” he explains. Local processing offers greater control, speed, and compliance. As enterprise AI models mature and hardware capabilities expand, the combination of Dell Pro Max workstations and NVIDIA RTX PRO Blackwell GPUs positions creative teams to seize this opportunity.

Learn More:

Dell Pro Max Accelerated by NVIDIA

Cream Productions

About the Author: Logan Lawler

Logan has worked in various roles at Dell for 16 years, including sales, marketing, merchandising, services, and e-commerce. Before joining Dell, Logan grew up in Missouri and graduated from the University of Missouri (MIZ!). Logan lives in Round Rock with his wife Ally, daughter Calloway, and labradoodle Truman.