December 20, 2023
In late 2022, Frame Labs was contracted by (the VERY cool) Soul Assembly to provide technical support on their live VR title Drop Dead: The Cabin. At the time, Meta had recently introduced Application SpaceWarp, which lets developers render at a lower native framerate and synthesise intermediate frames to maintain smoother motion on headset hardware.
The Cabin was struggling to reliably hold the minimum performance target required for comfortable play and store compliance on Quest 2 hardware. My role for the first engagement was a focused, one-week performance investigation: identify bottlenecks, recommend fixes, and provide practical implementation notes and sample code to help the team integrate Application SpaceWarp in a production-safe way. The engagement delivered:
A targeted performance bottleneck breakdown (rendering, scene, and content hot spots)
A technical review of ASW suitability and risks for the project
Sample code and integration guidance tailored to the game’s structure and update cadence
After the performance work, Soul Assembly brought us back in to help develop a mixed reality mode inside the main Cabin experience. That MR mode became Home Invasion, which turns your physical room into the play space, with enemies breaking in through your real doors and windows. It was positioned as a flagship MR experience around the launch window of Meta Quest 3, and the mode received strong coverage from VR outlets and the wider Quest community.
From January to November 2023, I worked remotely with the Soul Assembly team through a high-pressure production march to get MR gameplay running reliably on evolving platform tech.
MR enablement without breaking the live game
Adapting existing Cabin systems, assets, and gameplay rules so MR could coexist with the shipped VR game and remain maintainable across updates.
Working with early MR platform tech
Integrating and validating against beta mixed reality SDKs and changing requirements, including periods where I did not have access to the final hardware.
Systems R&D to support the designed MR gameplay
Prototyping and hardening the technical building blocks needed to make “zombies in your room” feel convincing.
Incremental delivery under event deadlines
Producing regular internal test builds, plus milestone builds for major industry events like the Game Developers Conference and Gamescom.
In-game tools for faster iteration
Building practical debug and test tooling so designers, artists, and QA could validate MR behaviour quickly, tune scenarios, and reproduce edge cases without developer hand-holding.
The result was a mixed reality mode that landed well with players and reviewers, and it remains a standout example of how MR can feel intense, physical, and surprisingly “real” when it fully commits to your space as the battlefield.
To date, this has been the greatest mixed reality project I've had the privilege to work on. Great team. Great game!
Home Invasion impacted nearly every aspect of the project, from how the game interprets a real room to how it renders, navigates, and debugs behaviour on-device. The sections below highlight the main systems and tools I built or led, and how they came together to make “zombies in your room” feel believable in production.
Home Invasion started with a very practical question. How do you turn someone’s real room into a believable play space without breaking the live VR game?
At the time, Meta’s mixed reality stack was still in its early stages and undergoing rapid changes. Before touching the main Cabin project, I spun up a small, separate Unity project and used it to learn the SDK, validate assumptions, and build a repeatable workflow. That sandbox let me iterate fast, make mistakes safely, and arrive at a stable approach before integrating anything into production.
The first step was loading the mixed reality scene anchors that players had created in the headset. Anchors were tagged by type, such as floor, ceiling, wall, window, door, and furniture, and they provided position, orientation, and size data. I built a simple visualisation layer that generated proxy geometry for each anchor so we could immediately see what the headset thought the room looked like and spot bad scans or strange edge cases early.
Once that worked, I moved from “drawing anchors” to “building a room”. Floors and ceilings were straightforward planes, but walls needed more structure. I wrote a room-building system that treated each wall as a set of modular wall blocks and automatically segmented them around openings. Blank wall sections became blocks, then additional blocks were generated above and below window anchors and above door anchors. The result was a clean, enclosed shell with proper holes where windows and doors belonged.
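The segmentation idea can be sketched in plain C#. This is an illustrative, simplified 2D version with hypothetical names (the shipped system worked on Unity anchor transforms and sizes), treating one wall as a strip and emitting block rectangles around its openings:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified stand-ins for the anchor data the headset provides.
public enum OpeningKind { Door, Window }

public readonly struct Opening
{
    public readonly OpeningKind Kind;
    public readonly float X, Bottom, Width, Height; // metres along / up the wall
    public Opening(OpeningKind kind, float x, float bottom, float width, float height)
    { Kind = kind; X = x; Bottom = bottom; Width = width; Height = height; }
}

public readonly struct Block
{
    public readonly float X, Y, Width, Height;
    public Block(float x, float y, float w, float h) { X = x; Y = y; Width = w; Height = h; }
}

public static class WallSegmenter
{
    // Split one wall into modular blocks: full-height blocks for blank spans,
    // blocks above doors, and blocks above and below windows.
    public static List<Block> Segment(float wallWidth, float wallHeight, IEnumerable<Opening> openings)
    {
        var blocks = new List<Block>();
        float cursor = 0f;
        foreach (var o in openings.OrderBy(o => o.X))
        {
            if (o.X > cursor)                              // blank span before the opening
                blocks.Add(new Block(cursor, 0f, o.X - cursor, wallHeight));
            float top = o.Bottom + o.Height;
            if (top < wallHeight)                          // block above the opening
                blocks.Add(new Block(o.X, top, o.Width, wallHeight - top));
            if (o.Kind == OpeningKind.Window && o.Bottom > 0f)
                blocks.Add(new Block(o.X, 0f, o.Width, o.Bottom)); // block below a window
            cursor = o.X + o.Width;
        }
        if (cursor < wallWidth)                            // trailing blank span
            blocks.Add(new Block(cursor, 0f, wallWidth - cursor, wallHeight));
        return blocks;
    }
}
```

For a 4 m wall with a single window, this yields four blocks: a blank span either side of the window, plus one block above it and one below it, leaving a clean hole where the window prefab is instanced.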
From there, placing game content became reliable. I instanced window and door prefabs into those openings using the anchor position and rotation. For gameplay consistency, we used fixed window and door sizes rather than matching the player’s exact dimensions. That choice was deliberate. Enemies needed to enter through openings that were known to work with the existing VR design, so we preserved the player’s layout and orientation while keeping entry points predictable and testable. In practice, this also helped the illusion, because the game openings tended to be slightly larger than real doors and windows, which reads well in passthrough.
To sell the mixed reality effect, I wrote a shader that reveals the passthrough video feed on any surface its material is applied to. Once applied to the generated floor, ceiling, and wall geometry, the room immediately felt “present” and gave the rest of the gameplay systems something coherent to sit inside.
Finally, I built a fake anchor workflow so development did not depend on repeatedly setting up rooms in the headset. This allowed me to author and replay multiple room configurations instantly, reproduce issues reliably, and iterate on the room builder without the friction of rescanning spaces every time.
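In spirit, the fake anchor workflow boiled down to canned room presets fed through the same room-building path as real scans. A minimal sketch, with hypothetical names and fields (the real tool mirrored whatever the SDK's anchor data exposed and supported multiple saved configurations):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical fake-anchor shape standing in for headset scene-anchor data.
public sealed record FakeAnchor(string Label,
                                float PosX, float PosY, float PosZ,
                                float Width, float Height);

public static class FakeRooms
{
    // A canned preset: four walls, floor, ceiling, one door, one window.
    // In the editor this replaced a real headset scan, so the room builder
    // could be exercised without rescanning a physical space.
    public static List<FakeAnchor> SmallTestRoom() => new()
    {
        new("FLOOR",    0f,   0f,    0f,   3f,   3f),
        new("CEILING",  0f,   2.5f,  0f,   3f,   3f),
        new("WALL",    -1.5f, 1.25f, 0f,   3f,   2.5f),
        new("WALL",     1.5f, 1.25f, 0f,   3f,   2.5f),
        new("WALL",     0f,   1.25f, -1.5f, 3f,  2.5f),
        new("WALL",     0f,   1.25f,  1.5f, 3f,  2.5f),
        new("DOOR",    -1.5f, 1.0f,   0.5f, 0.9f, 2.0f),
        new("WINDOW",   1.5f, 1.5f,  -0.5f, 1.0f, 1.2f),
    };
}
```

Because the room builder only ever consumed this anchor-shaped data, it never needed to know whether a room came from the headset or from a preset, which is what made replaying problem rooms instant.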
That early anchor and room-building work became the foundation for everything else in Home Invasion. Once the room could be built consistently, we could start making zombies feel like they were really in your space.
Early test of loading the MR anchors and enabling passthrough video.
Video showing the 'wall blocks' (green sections) and window and door sections.
The MR wall building process.
Early test of MR windows, doors, and hole/portal into outside virtual world.
Fake MR anchors inside the Unity editor helped speed up development.
Once the room-building system was working reliably, the next step was getting it to live inside a shipped VR title without destabilising anything.
The mixed reality work was integrated as a separate mode, reachable from the main menu. The core VR game needed to remain intact because it was already live, actively updated, and the original team had moved on to other work. That reality shaped the overall strategy. Avoid risky edits to existing content and prefer runtime adaptation wherever possible. Build what MR needs around the game, not through it.
The first problem appeared immediately. In mixed reality, most of the game’s materials looked semi-transparent, as if the passthrough feed was bleeding through the entire scene. We were on a tight schedule, so the initial fix was intentionally blunt. I wrote a runtime script that replaced problematic materials with a default white material so we could keep moving while I investigated the root cause.
Weird ghostly passthrough mixing issue.
Quick material switcher fix while we made shaders MR safe.
Using the Frame Debugger, it became clear what was happening. Many of the game’s shaders were writing alpha into the frame buffer. In the VR game, that had not been an issue, but in mixed reality, the alpha channel has a specific meaning. Any alpha written into the frame buffer becomes a signal to the MR compositor, and passthrough video gets blended in wherever that alpha exists. The tricky part was that nobody could point to a clear reason those shaders were writing alpha in the first place, and the project constraint still held. We could not edit core game assets.
So I duplicated the shaders and created MR-safe variants that do not write to the alpha channel. Then I expanded the runtime material switching tool so it could selectively swap materials and shaders when entering MR mode. That system became one of the key pieces that allowed MR to coexist cleanly with the shipped VR experience.
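The swap decision itself is simple; the sketch below shows the lookup side only, with hypothetical shader names (the shipped tool operated on live Unity Renderer and Material objects at runtime):

```csharp
using System;
using System.Collections.Generic;

public static class MrMaterialSwapper
{
    // Shaders known to write alpha into the frame buffer, mapped to duplicated
    // "MR-safe" variants with alpha writes masked off. Names are illustrative.
    static readonly Dictionary<string, string> MrSafeVariants = new()
    {
        ["Game/Standard"]    = "Game/Standard_MRSafe",
        ["Game/Environment"] = "Game/Environment_MRSafe",
    };

    // Resolve which shader a material should use in the current mode.
    // VR mode always keeps the original, so the shipped game is untouched.
    public static string ResolveShader(string shaderName, bool mixedRealityMode)
    {
        if (mixedRealityMode && MrSafeVariants.TryGetValue(shaderName, out var safe))
            return safe;
        return shaderName; // unaffected shader, or not in MR mode
    }
}
```

Keeping this as a data-driven table meant new problem shaders found during testing could be handled by adding one mapping entry rather than touching game content.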
With rendering stable, the next major hurdle was navigation. The MR room geometry is created dynamically from anchors, so any pathfinding that depends on the environment needs to be generated dynamically as well. At the same time, the game already had an established navigation setup, and we needed MR to plug into it rather than replace it. I used Unity’s NavMeshSurface workflow to generate a navigation mesh after the room was built, then layered that into the existing system so enemies could convincingly traverse from the virtual world into the player's physical space.
The runtime debugger showing the NavMesh that was built at runtime.
Most of that worked quickly. Then we hit the problem that mattered most for the fantasy of Home Invasion. Zombies could break down the window planks and doors, but they would not actually enter through the windows. They would reach the opening and just stall. Doors and windows both relied on NavMeshLinks, but window entry had additional complexity because it required specific animation timing and logic that was separate from normal pathing.
This turned into a deep dive. I spent long nights tracing the navigation and traversal code, stepping through the logic that decides when an enemy should commit to a window entry, and comparing how the door pipeline worked versus the window pipeline. Eventually, the missing pieces revealed themselves. There were conditions and link states that needed to be set in a very particular sequence for the jump traversal to be recognised as valid.
Once those pieces were restored, the behaviour snapped into place. Enemies broke the planks, committed to the link, and properly vaulted into the room. It was a small moment in the codebase, but a huge moment for the mode. When zombies finally started coming through the windows the way the design intended, the entire experience clicked into that unsettling, physical feeling that made Home Invasion special.
A celebratory shotgun blast to the face after fixing the zombies-jumping-through-windows bug.
Mixed reality development forces you to debug in the least convenient place possible. Most problems only appear when you are physically in the space with the headset on, which makes the usual Unity Editor workflow far less effective. While I still used the editor wherever possible, a lot of validation and troubleshooting had to happen on-device, inside a live MR session.
To make that workable, I built a dedicated runtime debugging tool designed for fast iteration in a headset. I treated it as something that would grow with the project rather than a one-off hack, so it was structured to be extensible and usable by the wider team, including design, art, and QA. The result was an in-game panel that could be opened with a controller button press while in mixed reality, giving immediate access to the controls we needed during testing.
The panel was organised into clear sections, including logging, mixed reality, zombie behaviour, wave management, weapons, performance, player settings, environment settings, and triggerable events. Each section contained practical, hands-on tools. For example, the zombie section could spawn specific enemy types on demand, while the weapons section could spawn items and ammo instantly. This made it far easier to reproduce edge cases, validate tuning changes, and test scenarios without needing a developer to rebuild or hand-hold every experiment.
To support some of these features while still respecting the constraint of not modifying large parts of the shipped game, the debugger occasionally used C# reflection to reach into existing systems and call methods or adjust internal values. Reflection is not something I would lean on in production runtime systems, but for a development-only tool, it was the most effective way to interact with systems we could not safely refactor mid-flight.
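The reflection pattern looked roughly like this. The `WaveManager` here is a hypothetical stand-in for shipped code we could not refactor; the debugger helper is the part that matters:

```csharp
using System;
using System.Reflection;

// Hypothetical stand-in for a shipped game system with no public debug hooks.
public class WaveManager
{
    private int _currentWave = 1;
    private void SkipToWave(int wave) => _currentWave = wave;
    public int CurrentWave => _currentWave;
}

public static class DebugReflection
{
    // Invoke a private instance method by name, as the runtime debugger did
    // for dev-only controls. Deliberately not a pattern for production paths:
    // it breaks silently on renames and bypasses the type system.
    public static void InvokePrivate(object target, string method, params object[] args)
    {
        var mi = target.GetType().GetMethod(method,
            BindingFlags.Instance | BindingFlags.NonPublic);
        if (mi == null)
            throw new MissingMethodException(target.GetType().Name, method);
        mi.Invoke(target, args);
    }
}
```

A debug panel button could then do `DebugReflection.InvokePrivate(waveManager, "SkipToWave", 5)` without the shipped class gaining any debug-only surface area.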
That runtime debugger became one of the key enablers for the project. It reduced turnaround time, made MR testing practical for the whole team, and helped keep delivery on track under heavy deadlines.
The RuntimeDebugger tool showing the 'events' section.
The RuntimeDebugger tool allowed us to spawn lots (AND LOTS) of game items.
Mixed reality makes level design harder because you do not know the player’s space ahead of time. Home Invasion needed to feel authored and dramatic, but still adapt to whatever room the player scanned. The core loop was a defence scenario where the player protects a radio tower outside the room while zombies also break in through the player’s real doors and windows. If the tower goes down, the player can reactivate it by completing a mini-game on a virtual console. Between waves, a weapon rack rises from the floor to provide guns, ammo, and health before the next escalation.
To support that gameplay, I built systems to dynamically place key interactive props in valid locations within the player’s scanned environment. Early on we defined minimum space requirements to keep the experience reliable, including at least one window, at least one door, at least one desk, and a room size of roughly 3 by 3 metres. The desk became the natural anchor for the console interaction. I repurposed an existing table asset from the VR game and created a dynamically scalable version that could fit over desks of different sizes. On top of that, I implemented a lightweight placement grid that subdivided the tabletop into cells. One cell was reserved for the console, positioned and oriented to face into the room, while other cells were used to spawn starting weapons and ammo in sensible, readable positions.
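The tabletop grid can be sketched as a simple subdivision in the table's local space, with one cell reserved for the console and the rest left free for weapon and ammo spawns. This is an illustrative version with hypothetical parameters, not the shipped placement code:

```csharp
using System;
using System.Collections.Generic;

public static class TabletopGrid
{
    // Subdivide a tabletop (width x depth, metres) into roughly cellSize cells.
    // One cell is reserved for the console; all others are returned as free
    // spawn points, expressed as cell centres in the table's local space.
    public static (int cols, int rows, List<(float x, float z)> freeCells, (float x, float z) consoleCell)
        Build(float width, float depth, float cellSize, int consoleCol, int consoleRow)
    {
        int cols = Math.Max(1, (int)(width / cellSize));
        int rows = Math.Max(1, (int)(depth / cellSize));
        var free = new List<(float, float)>();
        (float, float) console = default;
        for (int r = 0; r < rows; r++)
        for (int c = 0; c < cols; c++)
        {
            float x = (c + 0.5f) * (width / cols) - width / 2f;  // cell centre
            float z = (r + 0.5f) * (depth / rows) - depth / 2f;
            if (c == consoleCol && r == consoleRow) console = (x, z);
            else free.Add((x, z));
        }
        return (cols, rows, free, console);
    }
}
```

Because the grid adapts to the measured desk, the same spawn logic produced sensible, readable layouts on both small and large tables.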
Placing the weapon rack was trickier because it needed a clear wall-adjacent location away from windows, doors, and the desk, and it also had to support a specific reveal animation. Around this time we discovered a limitation in my initial room geometry, which used simple planes for the floor and ceiling. In irregular rooms, players could sometimes see the plane edges outside their real boundaries, which broke the illusion. I replaced the floor and ceiling generation with a room-fitting approach that derived a 2D footprint from the wall anchors, voxelised that shape, and then generated tighter geometry that matched the scanned space more accurately. That same voxel representation gave me a reliable way to search for a clear segment of wall and select a robust weapon rack placement point.
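The core of that footprint voxelisation is a point-in-polygon test per cell centre. A self-contained sketch (the real system derived the polygon from wall anchors and also reserved cells for the rack opening):

```csharp
using System;

public static class FootprintVoxeliser
{
    // Standard even-odd ray-cast point-in-polygon test.
    static bool Inside((float x, float y)[] poly, float px, float py)
    {
        bool inside = false;
        for (int i = 0, j = poly.Length - 1; i < poly.Length; j = i++)
        {
            bool cross = (poly[i].y > py) != (poly[j].y > py) &&
                         px < (poly[j].x - poly[i].x) * (py - poly[i].y) /
                              (poly[j].y - poly[i].y) + poly[i].x;
            if (cross) inside = !inside;
        }
        return inside;
    }

    // Voxelise the 2D room footprint: a cell is "room" when its centre lies
    // inside the wall polygon, so irregular rooms get a tight-fitting grid.
    public static bool[,] Voxelise((float x, float y)[] footprint, float minX, float minY,
                                   int cols, int rows, float cellSize)
    {
        var cells = new bool[cols, rows];
        for (int c = 0; c < cols; c++)
        for (int r = 0; r < rows; r++)
            cells[c, r] = Inside(footprint,
                minX + (c + 0.5f) * cellSize,
                minY + (r + 0.5f) * cellSize);
        return cells;
    }
}
```

From the resulting grid, generating the floor mesh and searching for a clear run of wall-adjacent cells for the rack both become straightforward scans over the boolean array.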
The rack itself presented an additional challenge because it emerged from below the floor through sliding doors, which meant the floor needed a clean opening. I solved this by reserving space for the rack during voxelisation so a hole was baked into the generated floor mesh at the correct location. Because the voxel-cut hole did not perfectly match the rack’s door shape, I built additional “collar” geometry at runtime that bridged between the hole edge and the rack doors, and I shaded it using the same passthrough material as the floor. The result looked seamless in headset, even with the rack animation revealing a convincing void beneath the player’s real floor.
That technique became a reusable pattern. Later, I applied the same approach to create a collapsible ceiling section for the finale. If the player survives all waves, a rescue helicopter arrives and lowers a rope ladder through a hole in the ceiling. The player grabs the ladder to end the mode and escape, and because the ceiling opening was generated using the same geometry workflow, the moment felt grounded and believable inside a wide range of real rooms.
The floor (and ceiling) voxelisation process, including the hole for the weapon rack.
Early test of the weapon rack rising out of the floor.
One of my favourite pieces of work on Home Invasion was building a destructible wall sequence. After the player survives several escalating waves, a large boss enemy called Forest Pete smashes through one of the player’s walls and leaves a gaping hole that smaller enemies can pour through. The animation for Forest Pete was already strong, so the goal on my side was to sell the impact and make the wall failure feel physical inside the player’s real room.
To do that, I built a system that turns a section of the player’s scanned wall into simulated debris. The wall fragments had to be real-time physics objects so they could fall naturally into the space and settle in believable places, like across a desk or piled against a skirting board. I also wanted them to feel “present”, so I left them pushable and reactive rather than turning them into pure VFX. Because persistent debris can become visually noisy and expensive over time, I added a cleanup strategy that hides or removes fragments only when the player is not looking, which avoids obvious pop and keeps performance stable.
A key part of the illusion was handling passthrough correctly. If the fragment faces continued to display passthrough video while the chunks were airborne, it immediately looked wrong. I wrote a shader that transitions the debris from passthrough to a broken concrete material as the pieces separate and fall, then supported the moment with small bursts of smoke and dust particles to smooth the handoff. The result felt cohesive in the headset and avoided the “floating camera feed” problem that can break mixed reality scenes.
On the content side, I generated the fractured wall geometry in SideFX Houdini using the minimum room dimensions we required for the mode. Houdini made it easy to control chunk count, size distribution, and fracture aesthetics, with larger chunks around the perimeter and smaller ones near the impact centre. It also let me procedurally UV the pieces and separate front faces from interior geometry, so I could apply passthrough to the outer faces and broken concrete to the internal surfaces and backs. That separation mattered a lot once the wall was open and players could see into the thickness of the break.
I also built a pre-break “cracking” phase to build tension before the impact. In Houdini, I baked a flattened normal direction into the mesh as vertex colour data, then in Unity I used that data in a shader to subtly push vertices and open crack lines. A simple radius-based tool let me grow cracks around an impact point, so the wall could visibly stress and start to fail just before Forest Pete bursts through. It was a small touch, but it added anticipation and made the smash feel earned.
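The per-vertex part of that crack growth reduces to a scalar push amount driven by an expanding impact radius; the shader then displaces each vertex along its baked crack direction by that amount. A hypothetical sketch of just the scalar:

```csharp
using System;

public static class CrackGrowth
{
    // How far a vertex should be pushed along its baked crack direction.
    // vertexDistToImpact: baked distance from the vertex to the impact point.
    // crackRadius: the growing radius driven by the pre-break timeline.
    // falloff shapes the curve; maxPush caps the displacement in metres.
    public static float Displacement(float vertexDistToImpact, float crackRadius,
                                     float falloff, float maxPush)
    {
        if (vertexDistToImpact >= crackRadius) return 0f;  // crack hasn't reached it yet
        float t = 1f - vertexDistToImpact / Math.Max(crackRadius, 1e-5f);
        return maxPush * MathF.Pow(t, falloff);            // stronger push nearer the impact
    }
}
```

Growing `crackRadius` over a second or two before the smash gives the wall a visible stress-and-fail beat without any extra geometry.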
A close second favourite was a persistent damage system that I added because it felt strange for bullets to leave decals on virtual props, but not affect mixed reality walls at all. I implemented a technique that “paints” bullet impacts into render textures at runtime. Those textures are then used as masks inside a modified passthrough shader, revealing damage on the player’s walls in a way that stays consistent over time. It is a cheap trick, but it looks great, feels satisfying, and makes the physical room feel more like an active part of the combat space.
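A CPU-side illustration of the masking idea, assuming a float grid in place of the render texture (the shipped version painted splats on the GPU, which a modified passthrough shader then sampled as a damage mask):

```csharp
using System;

public static class DamageMask
{
    // "Paint" one bullet impact into a damage mask at UV (u, v).
    // Soft-edged splat: full strength at the centre, fading to zero at the rim,
    // accumulating across shots and clamped at full damage.
    public static void PaintImpact(float[,] mask, float u, float v, float radius, float strength)
    {
        int w = mask.GetLength(0), h = mask.GetLength(1);
        int cx = (int)(u * w), cy = (int)(v * h);
        int r = Math.Max(1, (int)(radius * w));
        for (int x = Math.Max(0, cx - r); x <= Math.Min(w - 1, cx + r); x++)
        for (int y = Math.Max(0, cy - r); y <= Math.Min(h - 1, cy + r); y++)
        {
            float d = MathF.Sqrt((x - cx) * (x - cx) + (y - cy) * (y - cy)) / r;
            if (d > 1f) continue;
            mask[x, y] = Math.Min(1f, mask[x, y] + strength * (1f - d));
        }
    }
}
```

Because the mask persists for the whole session, damage stays where the player put it, which is most of why shooting your own walls feels consistent rather than cosmetic.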
Wall cracking shader trick.
Wall cracking + selective physics breaking.
Test wall smash in the Unity Editor.
Test wall smash in mixed reality.
Persistent damage masking test. The red, green, blue, and white show the level of damage.
The persistent damage mixed with passthrough video allows players to shoot their walls.
Working remotely with the Soul Assembly team on Home Invasion was genuinely enjoyable. It gave me the chance to contribute to a large Unity codebase alongside experienced artists and developers, and to integrate my work into an established production pipeline rather than building everything from scratch. Regular stand-ups and check-ins helped keep momentum high, and they made it easy to surface blockers early, align on priorities, and keep the team moving in the same direction.
It was also valuable exposure to a mature QA process. Having a dedicated QA team meant issues were captured consistently, reproduced clearly, and tracked through to resolution, which raised the quality bar across the whole mode. It reinforced how important good tooling, clear documentation, and repeatable test steps are when you are shipping under pressure, especially when the work spans multiple disciplines and needs to remain stable inside a live product.