January 6, 2019
In early 2018, we built something at Frame Labs that felt a little bit ahead of its time: Dreamdeck 360, a synchronised VR 360-video playback system designed to run reliable, high-quality group experiences across large numbers of standalone headsets.
The idea was simple to explain but tricky to pull off in practice. If you have dozens of people wearing headsets, you want every single headset to start at the same moment, stay in sync, and remain stable for the entire experience, with a smooth operational workflow for staff running sessions all day.
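The post doesn't spell out the protocol, but a common way to get dozens of players starting together is to schedule playback against a shared clock: the control machine sends a start command stamped with a future time on its own clock, and each headset begins once its offset-corrected local clock reaches that time. Here's a minimal sketch of that idea, using an NTP-style offset estimate; all function names are illustrative, not Dreamdeck 360's actual API:

```python
import time

def estimate_clock_offset(t_client_send, t_server, t_client_recv):
    """NTP-style estimate of (server_clock - client_clock).

    Assumes network delay is roughly symmetric, so the server's
    timestamp corresponds to the midpoint of the round trip.
    """
    midpoint = (t_client_send + t_client_recv) / 2.0
    return t_server - midpoint

def playback_position(start_at_server_time, offset, now_client=None):
    """Current media position (seconds) for a start scheduled on the
    server's clock. Negative means the start moment hasn't arrived yet."""
    if now_client is None:
        now_client = time.time()
    server_now = now_client + offset
    return server_now - start_at_server_time
```

In practice each client would repeat the offset exchange a few times and keep the sample with the smallest round trip, since that one bounds the error most tightly.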
Dreamdeck 360 didn’t appear out of nowhere. It was built on a trail of lessons from earlier Frame Labs projects, starting with our first public event: the Whitenoise Underground VR/AR workshop (April 2016). For that event, I wrote a small tool called GearVR Remote, which allowed a laptop to synchronise 360 video playback across multiple GearVR headsets running on Samsung Note 8 phones over a local Wi-Fi network.
From there, we continued to push into networked, real-time VR demos, and eventually into the more demanding operational requirements of Mechatron VR, our VR motion-platform ride. Each project surfaced the same recurring problems: networking quirks, headset behaviour, user-proofing, and the realities of running experiences in the wild.
Dreamdeck 360 was the next step: a purpose-built platform for high-scale, repeatable VR screenings.
Later in 2018, Dreamdeck 360 was licensed to the WA Maritime Museum for The Antarctica Experience, a 21-minute 360 film exploring Antarctica and the climate science conducted there. Perth-based White Spark Pictures produced the film.
This deployment became the biggest and most demanding version of Dreamdeck 360. At its peak, the system synchronised playback across 94 Oculus Go headsets, while also syncing to an external cinema-surround soundtrack. That number still makes me smile, because anyone who has tried to keep a handful of headsets stable in a public environment knows how quickly “simple” becomes “chaos” at scale.
Across the 14-week run, just over 20,000 people visited the exhibition.
Setting up and testing 94 Oculus Go headsets.
Dreamdeck 360 didn’t stop there. It was later used to showcase a full-dome film called Star Dreaming inside VR headsets, and it appeared in a range of festival and exhibition contexts, including multiple XRWA festivals and The Waiting Room project in Melbourne.
The 360 Cinema at the inaugural XRWA Festival 2019. Powered by Dreamdeck 360.
My responsibilities on Dreamdeck 360 covered both the software and the on-the-ground delivery:
System design and software development for synchronised playback at scale
A custom Unity application built for Oculus Go (and later ported to Pico G1/G2 and Skyworth headsets)
A kiosk-style lockdown solution to prevent users from exiting the experience
A Windows control and management application for operators running sessions
Networking hardware procurement and installation, tuned for real-world reliability
Team delivery management during setup and deployment
360 video encoding workflows for consistent playback performance across devices
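Keeping a fleet like this in step all day also means handling drift after the start: each headset periodically compares its playback position against the master's and corrects only when the gap matters, since frequent hard seeks are more jarring than tiny drift. A minimal sketch of that decision logic, with threshold values that are purely illustrative and not Dreamdeck 360's actual tuning:

```python
def sync_action(local_pos, master_pos,
                hard_seek_threshold=0.250,
                rate_adjust_threshold=0.050):
    """Decide how a headset should correct playback drift.

    Returns one of:
      ("seek", master_pos)  -- jump; drift is too large to smooth over
      ("rate", factor)      -- nudge playback speed to converge gently
      ("none", None)        -- within tolerance, leave playback alone
    Thresholds (seconds) are illustrative assumptions.
    """
    drift = local_pos - master_pos  # positive means we're ahead
    if abs(drift) > hard_seek_threshold:
        return ("seek", master_pos)
    if abs(drift) > rate_adjust_threshold:
        # Slow down slightly if ahead, speed up slightly if behind.
        return ("rate", 0.98 if drift > 0 else 1.02)
    return ("none", None)
```

The tiered response is the key design choice: small corrections happen invisibly through playback rate, and the audience only experiences a visible seek when something has gone genuinely wrong.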
Dreamdeck 360 was one of those projects that sits right on the edge between engineering and experience design. The tech had to be invisible. The system had to be reliable enough that staff could run it all day. And the audience experience had to feel effortless, even though under the hood it was coordinating a small fleet of headsets, Wi-Fi infrastructure, and AV playback constraints.
It was a challenging build, a great example of “shipping something real,” and one of the projects I’m proudest of from that period at Frame Labs.