E11 Volume: Real 360° Capture vs Unreal Virtual Production
E11 Volume in Los Angeles hosted an exclusive hands-on session for cinematographers, focusing on how to effectively integrate real-world 360° capture into LED volume workflows. The event brought together filmmakers, colorists, and virtual production specialists for a deep dive into the practical techniques that define a seamless hybrid pipeline, from pre-production planning to client delivery. Unlike traditional Unreal Engine-based environments, E11’s system empowers filmmakers to project real, photoreal 360° plates into the LED volume, blending the richness of on-location lighting with the control of virtual tools. The session highlighted how this process preserves the natural interplay of light, depth, and reflection, delivering visuals that not only look real but feel real.
As DP Matt Ryan shared during the event:
“Unreal can get you about 60% of the way there… but the rest, the soul of the image, lives in what’s real.”
Workflow Best Practices for Cinematographers
Participants learned how to structure their workflow timelines for maximum efficiency by planning two to four weeks ahead to travel, capture, and process spherical footage before shooting in the volume. The team at E11 emphasized client communication and creative management as essential skills: ensuring directors, producers, and agencies understand both the creative potential and logistical requirements of volume-based production.
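To make that lead time concrete, here is a minimal scheduling sketch in Python. The milestone names, the three-week default, and the one-week stitch-and-grade buffer are illustrative assumptions rather than E11’s actual production calendar; the point is simply that plate work is planned backwards from the volume shoot date.

```python
from datetime import date, timedelta

def plate_schedule(volume_shoot: date, lead_weeks: int = 3) -> dict:
    """Work backwards from the volume shoot date to rough plate milestones.

    lead_weeks is illustrative of the two-to-four-week window for travel,
    capture, and processing; the buffers below are placeholders, not E11's
    actual calendar.
    """
    return {
        "travel_and_capture_starts": volume_shoot - timedelta(weeks=lead_weeks),
        "stitch_and_grade_complete": volume_shoot - timedelta(weeks=1),
        "plates_loaded_into_volume": volume_shoot - timedelta(days=2),
        "volume_shoot_day": volume_shoot,
    }

for milestone, day in plate_schedule(date(2025, 6, 16)).items():
    print(f"{milestone:>26}: {day.isoformat()}")
```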
E11’s node-based color workflow was a focal point of the training. The system allows DPs to manage multiple color grades, highlight treatments, and depth layers side-by-side in real time. This flexibility empowers cinematographers to make creative adjustments—such as rebalancing the sun’s position or fine-tuning color temperature around 6500K—directly on set.
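The sketch below illustrates the general idea behind a node-based grade: small, composable operations evaluated per layer, so adjustments stay non-destructive and can be compared side by side. The node names, the temperature_shift gain, and the highlight_rolloff knee are hypothetical stand-ins; E11’s system is a real-time, volume-integrated tool, not an offline NumPy script.

```python
import numpy as np

class ColorNode:
    """One operation in a simplified node graph; nodes are chained per layer."""
    def __init__(self, name, fn):
        self.name, self.fn, self.enabled = name, fn, True

    def __call__(self, img):
        return self.fn(img) if self.enabled else img

def temperature_shift(warmth=0.05):
    # Crude red/blue gain standing in for a white-balance node; a real grade
    # toward a target such as 6500 K would use proper chromatic adaptation.
    gains = np.array([1.0 + warmth, 1.0, 1.0 - warmth])
    return lambda img: np.clip(img * gains, 0.0, 1.0)

def highlight_rolloff(knee=0.8):
    # Soften values above the knee so bright plate areas keep detail on the wall.
    return lambda img: np.where(img > knee, knee + (img - knee) * 0.5, img)

# Separate node trees evaluated side by side, e.g. sky plate vs foreground set.
layers = {
    "sky_plate": [ColorNode("wb", temperature_shift(0.04)),
                  ColorNode("highlights", highlight_rolloff(0.75))],
    "foreground": [ColorNode("wb", temperature_shift(-0.02))],
}

frame = np.random.rand(4, 4, 3)  # stand-in for one video frame, values in 0..1
graded = {}
for name, nodes in layers.items():
    img = frame.copy()
    for node in nodes:
        img = node(img)
    graded[name] = img
```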
Foreground elements designed by Mandrake Studios were also showcased as an example of how practical objects anchor digital environments, creating a seamless blend between physical and virtual worlds.
Sim Plates: Engineering Photorealism
A key partner in the session was Sim Plates, a company specializing in high-resolution 360° driving and environment plates engineered for virtual production. Their workflow, spanning Blender to Octane Render, optimizes image stitching, resolution, and color fidelity across 16-camera input arrays, ensuring lifelike parallax and motion that CG environments can’t replicate.
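As a rough illustration of how a multi-camera plate gets planned, the Python sketch below lays out a hypothetical 16-camera ring and the seam pairs a stitcher would blend and color-match. The file names, yaw spacing, and seam math are assumptions made for illustration; Sim Plates’ actual Blender-to-Octane pipeline is far more involved.

```python
from dataclasses import dataclass

@dataclass
class CameraInput:
    index: int           # position in the hypothetical 16-camera array
    yaw_deg: float       # nominal orientation used to seed the stitch
    footage_path: str    # placeholder file name, not a real naming convention

def build_rig(n_cameras: int = 16):
    """Describe an evenly spaced ring of cameras covering 360 degrees."""
    spacing = 360.0 / n_cameras
    return [CameraInput(i, i * spacing, f"cam_{i:02d}.mov") for i in range(n_cameras)]

def stitch_plan(rig):
    """Pair each camera with its neighbor and note where the seam sits,
    i.e. the overlap region that gets blended and color-matched."""
    half_step = 180.0 / len(rig)
    return [
        {
            "pair": (cam.index, (cam.index + 1) % len(rig)),
            "seam_center_deg": (cam.yaw_deg + half_step) % 360.0,
        }
        for cam in rig
    ]

for seam in stitch_plan(build_rig()):
    print(seam)
```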
E11’s collaboration with Sim Plates offers filmmakers a ready-made solution for plate acquisition and integration, dramatically streamlining the pre-production process. Each capture retains full metadata and live camera-tracking data, giving DPs precise control over lighting, playback, and sync once the footage hits the volume.
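Below is a small Python sketch of the kind of bookkeeping that metadata and tracking data make possible once a plate reaches the volume: converting timecode to frames and computing a playback offset so the plate lines up with the stage clock. The PlateFrame fields and the 24 fps default are assumptions for illustration, not E11’s or Sim Plates’ actual schema.

```python
from dataclasses import dataclass

@dataclass
class PlateFrame:
    timecode: str        # e.g. "01:02:03:12" (HH:MM:SS:FF)
    exposure_ev: float   # capture exposure carried in the plate metadata
    camera_pose: tuple   # (x, y, z, pan, tilt, roll) from the live tracking feed

def frame_number(timecode: str, fps: int = 24) -> int:
    """Convert HH:MM:SS:FF timecode into an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in timecode.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offset(plate_start: str, volume_clock: str, fps: int = 24) -> int:
    """Frames of offset needed so plate playback lands on the volume's clock."""
    return frame_number(volume_clock, fps) - frame_number(plate_start, fps)

# A plate starting at 01:00:00:00 played against a stage clock of 01:00:02:12
# needs a 60-frame offset at 24 fps.
print(sync_offset("01:00:00:00", "01:00:02:12"))
```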
E11’s Advantage: A Hybrid Future
E11 Volume’s mission is simple: to elevate virtual production through real-world authenticity.
By merging live-action plate capture, node-based compositing, and dynamic lighting systems like Assimilate LiveFX and Optical Flow, E11 provides filmmakers a flexible, photorealistic canvas—one where creative intent drives technology, not the other way around.
This event reinforced E11’s role not only as a production hub but as an educational resource for the cinematography community, a place where the next generation of DPs can master the evolving language of virtual filmmaking.
In an industry chasing pixels and render times, let this serve as a reminder:
the most advanced production technology still begins with real footage.
For your next project, reach out to us; we’d love to help you strategize the best approach for your production.





