In recent years, modern GPUs have reshaped nearly every corner of computing. But while headlines tend to focus on chatbots and AI image generators, JangaFX is using today’s computational power to transform visual effects (VFX) through real-time physics simulation.
Thanks to technological advances, including the latest NVIDIA RTX™ GPUs and the software that harnesses them, complex physical simulations that once took days to render now unfold in real time. That shift is visible everywhere, from the nebulae of EVE Online and the cinematic explosions of League of Legends to Superman’s fiery rescues and the cosmic vistas of How the Universe Works – and it brings both challenges and opportunities as studios reimagine their creative processes around instant feedback. JangaFX is leading this acceleration of VFX work by building the tools that let artists work on high-end graphics interactively.
Nick Seavert, CEO and founder of JangaFX, began tinkering with VFX as a young creator, joking, “I was a bit of a pyromaniac as a kid. VFX was a safe way for me to play with fire.” Seavert’s motivation for simulating reality was simple: he wanted better, more realistic explosions. However, the tools available to him made the process laborious.
The Challenges with Fire
In the early 2010s, simulating fire and smoke required hours for a single frame to render. And though Nick’s setup was equipped with capable hardware for its time – a 75-gigabyte hard drive and a GTX 780 graphics card – animation software did not fully leverage the power of GPUs. It instead called on CPUs to do all of the complex mathematical calculations behind the physical simulations.
“I was a bit of a pyromaniac as a kid. VFX was a safe way for me to play with fire.” – Nick Seavert, CEO & Founder, JangaFX
“I wondered—why can’t the GPU handle this?” Nick recalls. In gaming, GPUs had already shown they could render stunning visuals in real time. So why not simulations? That question became the foundation for EmberGen: a tool designed to give artists the speed and interactivity they needed to stay creative.

Breaking the Render Barrier: The Birth of Real-Time VFX
Before simulation software embraced modern GPUs, fluid simulation tools like FumeFX relied heavily on CPU power. “I’d wait 12 to 24 hours just to render a single frame,” Nick recalls. “By the end of it, my hard drive would be full, and my system would crash.”
Back then, creating visual effects meant enduring a disjointed workflow. The software first calculated how particles moved or smoke curled over time, then rendered these calculations into viewable results. Both steps took hours, and artists often had to redo simulations repeatedly. The constant waiting crushed creative momentum.
EmberGen broke this cycle by merging simulation and rendering into a single, real-time process. Artists could now adjust parameters and immediately see their fully rendered effects, turning an iterative slog into an interactive experience.
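To make the contrast concrete, here is a toy sketch in Python. It is not EmberGen’s actual code: the tiny diffusion “solver” and ASCII “renderer” are made-up stand-ins. It only shows the difference between caching a whole simulation before rendering it and simulating and rendering inside the same loop, where a parameter change shows up on the very next frame.

```python
# Toy illustration only: a tiny 2D "smoke" grid that diffuses each step.
# The physics and rendering are deliberately trivial; the loop structure is the point.
import numpy as np

def simulate_step(density, diffusion=0.2):
    """One explicit diffusion step on a 2D density grid (stand-in for a fluid solver)."""
    neighbors = (np.roll(density, 1, 0) + np.roll(density, -1, 0) +
                 np.roll(density, 1, 1) + np.roll(density, -1, 1))
    return density + diffusion * (neighbors / 4.0 - density)

def render_frame(density):
    """Toy 'renderer': quantize density into ASCII shades."""
    shades = " .:-=+*#%@"
    idx = np.clip(density * (len(shades) - 1), 0, len(shades) - 1).astype(int)
    return "\n".join("".join(shades[i] for i in row) for row in idx)

# Old offline workflow: simulate every frame first, render them all afterwards.
density = np.zeros((16, 32)); density[8, 16] = 1.0
cached = []
for _ in range(30):
    density = simulate_step(density)
    cached.append(density.copy())        # in real tools this meant gigabytes on disk
frames = [render_frame(d) for d in cached]

# Real-time workflow: simulate and render inside the same loop, so a changed
# parameter is visible on the very next frame.
density = np.zeros((16, 32)); density[8, 16] = 1.0
for step in range(30):
    diffusion = 0.2 if step < 15 else 0.5   # the "artist" tweaks a parameter mid-run
    density = simulate_step(density, diffusion)
    if step % 10 == 0:
        print(render_frame(density))
```

In the second loop there is nothing standing between the artist’s tweak and the picture on screen, which is the shift EmberGen made.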
Nick found inspiration in Valve’s Hammer Editor, which allowed creators to tweak particle effects and visualize results instantly. “Valve’s tools ran in real time—you’d click a button, see an explosion, and immediately play around with it,” Nick explains. He wanted to bring that same immediacy to 3D volumetric simulations. Though the idea was initially dismissed as impossible, advances in GPU technology had made real-time fluid simulation achievable by 2016, laying the foundation for EmberGen.
How EmberGen Changes the Workflow
EmberGen’s impact comes to life in games like EVE Online, where CCP Games faced a formidable challenge: creating rich, volumetric clouds that could bring depth to their space battles without grinding their production pipeline to a halt.
These volumetric clouds demand significant computational power. EmberGen helped CCP Games work within those constraints by leveraging the VRAM of NVIDIA RTX™ GPUs – allowing artists to craft and fine-tune intricate cloud formations in real time. The team then exported the optimized simulations as VDB files for their game engine. “We wanted to make sure artists could iterate without limits,” Nick explains. “Creativity shouldn’t be held back by technology.”
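For readers curious what that hand-off looks like, here is a minimal sketch of writing a density volume to a .vdb file with the OpenVDB Python bindings (pyopenvdb). This is not EmberGen’s exporter, and the spherical “puff” is invented for illustration; it only shows the kind of density grid a VDB file carries into a game engine or DCC tool.

```python
# Minimal VDB export sketch using the OpenVDB Python bindings (pyopenvdb),
# assumed installed. Not EmberGen's code; just the same file format it exports.
import numpy as np
import pyopenvdb as vdb

# Toy 64^3 density volume: a soft spherical "puff" of smoke.
res = 64
coords = np.indices((res, res, res), dtype=np.float32)
dist = np.linalg.norm(coords - res / 2, axis=0)
density = np.clip(1.0 - dist / (res / 2), 0.0, 1.0)

grid = vdb.FloatGrid()
grid.name = "density"                       # channel name engines typically expect for smoke
grid.gridClass = vdb.GridClass.FOG_VOLUME   # mark it as a fog volume, not a level set
grid.copyFromArray(density, ijk=(0, 0, 0))  # dense numpy array -> sparse VDB tree

vdb.write("puff.vdb", grids=[grid])         # ready to load in an engine or DCC tool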

The Technology Behind EmberGen’s Real-Time Magic
At the heart of EmberGen’s real-time capability is how it uses the GPU, particularly modern hardware like NVIDIA’s RTX™ 4090 and RTX™ 6000-series cards. Their high memory bandwidth and massively parallel architecture let applications process billions of voxels (3D pixels) at once, accelerating calculations and displaying results instantly.
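As a rough illustration of that parallelism, the sketch below uses CuPy, a NumPy-like GPU array library, purely as a stand-in (EmberGen’s own GPU backend is not public). It runs a diffusion step over a 256³ voxel grid, roughly 17 million voxels, updated by the GPU in bulk rather than one voxel at a time on the CPU.

```python
# Toy stand-in for GPU voxel processing, using CuPy (assumed installed on a CUDA GPU).
# Not EmberGen's engine; it only illustrates updating every voxel of a large grid
# in parallel on the GPU instead of looping over voxels on the CPU.
import cupy as cp

res = 256                                    # 256^3 ≈ 17 million voxels, ~64 MB of VRAM per grid
density = cp.zeros((res, res, res), dtype=cp.float32)
density[res // 2, res // 2, res // 2] = 1.0  # a single hot voxel in the middle

def diffuse(d, rate=0.15):
    """One diffusion step: each voxel blends toward the mean of its six neighbors.
    The whole grid is updated in one GPU pass rather than voxel by voxel."""
    neighbors = (cp.roll(d, 1, 0) + cp.roll(d, -1, 0) +
                 cp.roll(d, 1, 1) + cp.roll(d, -1, 1) +
                 cp.roll(d, 1, 2) + cp.roll(d, -1, 2))
    return d + rate * (neighbors / 6.0 - d)

for _ in range(10):
    density = diffuse(density)
cp.cuda.Stream.null.synchronize()            # wait for the GPU to finish before reading back
print(float(density.max()))
```

Scaling that same pattern up toward billion-voxel grids is largely a question of VRAM, which is why the memory capacity of professional RTX cards matters for this workload.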
Professional creators can access this power through workstations built for these demanding workflows. The Dell Precision 5860 Tower and the Precision 7960 Tower, equipped with these NVIDIA RTX GPUs, provide the foundation needed for real-time simulation and rendering. Paired with JangaFX’s VFX tools, these workstations let creators quickly build their own explosions, nebula clouds, and much more.
Why Real-Time Matters
For Nick and the JangaFX team, real-time simulation is about reducing the hidden steps between the artist and the artwork. When artists can see the impact of their changes immediately, they’re free to experiment and push boundaries. That freedom is what makes projects like EVE Online’s volumetric clouds possible. It’s also what drives the team to keep pushing the limits of what GPUs can do.
“We didn’t just want to make explosions,” Nick says. “We wanted to make the process of making explosions feel explosive, too.”
Engineered for unmatched performance, Dell Pro Max high-performance PCs* harness the full potential of professional-grade graphics. To learn more about the benefits of high-performance PCs and NVIDIA RTX GPUs, click here.
*Dell Pro Max high-performance PCs, previously referred to as Dell Precision workstations


