PORTFOLIO PROJECT
gesture-controlled
immersive gaussian splats
2025

This immersive portal system proof of concept was developed as part of a larger real-time interactive installation created for a luxury real estate event at Outernet London. The wider project explored how web-based applications, interactive Gaussian splats, and gesture-controlled animation could come together across Outernet’s screen infrastructure to create a spatial experience that felt both technically innovative and visually striking.
For privacy reasons I’m only showing the prototype rather than the final product, but it reflects a key part of the innovation behind the wider installation. Built as a browser-based 3D proof of concept using React, Vite, TypeScript, and PlayCanvas, the system was designed to test interaction, scene behaviour, and input pipelines across a set of immersive portal environments.
A central part of the concept was using gesture input to drive animation and visual change inside Gaussian splat scenes within a web-based application. That combination formed the core of the technical exploration and, in the context of Outernet, marked a first-of-its-kind use of web-based interactive Gaussian splats controlled through gesture-driven animation.
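To illustrate the idea of gesture input driving animation inside a splat scene, here is a minimal sketch of one plausible piece: smoothing a raw gesture value before it is written to a shader parameter each frame. The class name, smoothing constant, and uniform name are illustrative assumptions, not the project's actual code.

```typescript
// Hypothetical sketch: smooth a normalised gesture value (0..1) toward a
// target each frame before it drives a shader uniform in the splat scene.
class GestureDriver {
  private current = 0;

  constructor(private smoothing = 8) {}

  /** Advance the smoothed value by dt seconds toward the raw gesture input. */
  update(rawGestureValue: number, dt: number): number {
    const target = Math.min(1, Math.max(0, rawGestureValue));
    // Exponential smoothing: a frame-rate-independent lerp toward the target,
    // so jittery tracking input becomes a stable animation signal.
    const t = 1 - Math.exp(-this.smoothing * dt);
    this.current += (target - this.current) * t;
    // The result would then feed the scene, e.g.
    // material.setParameter("uGesture", this.current);
    return this.current;
  }
}
```

Filtering of this kind matters in practice because camera-derived gesture signals are noisy, while shader animation needs continuous values.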
The experience was structured as a multi-scene single-page application with several portal variants. Each portal loaded a Gaussian splat environment on a specific screen, with scene-specific shaders, textures, audio, camera settings, and interactive behaviours, all orchestrated through a shared runtime system.
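A per-portal setup like this could be organised as a typed configuration registry that the shared runtime resolves before loading a scene. The field names, asset paths, and gesture bindings below are assumptions for illustration, not the production config.

```typescript
// Illustrative sketch of per-portal configuration; every value here is a
// placeholder, not the actual installation data.
interface PortalConfig {
  splatUrl: string;                         // Gaussian splat asset for this portal
  shader: string;                           // scene-specific shader variant
  audio: string;                            // ambient audio track
  cameraFov: number;                        // per-scene camera settings
  gestureBindings: Record<string, string>;  // gesture name -> animated parameter
}

const portals: Record<string, PortalConfig> = {
  atrium: {
    splatUrl: "assets/atrium.splat",
    shader: "atrium-dissolve",
    audio: "assets/atrium.mp3",
    cameraFov: 60,
    gestureBindings: { pinch: "uDissolve", wave: "uSwirl" },
  },
  // ...further portal variants follow the same shape
};

// The shared runtime resolves a portal by id before loading its scene.
function getPortal(id: string): PortalConfig {
  const cfg = portals[id];
  if (!cfg) throw new Error(`Unknown portal: ${id}`);
  return cfg;
}
```

Keeping scene behaviour in data rather than code makes it straightforward to add portal variants without touching the runtime.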

A key part of the wider project was connecting the experience to the camera infrastructure at Outernet. I integrated the system using OSC and WebSockets, allowing live input from the cameras in the space to control the experience in real time. During on-site testing, we were able to drive the interactive scenes using the venue cameras and validate the control pipeline directly within the space.
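The routing layer of such a control pipeline can be sketched as follows, assuming OSC messages arrive over the WebSocket already decoded into an address plus arguments (the decoding and transport would be handled by a separate bridge). The address patterns and handler wiring are hypothetical.

```typescript
// Minimal sketch of an OSC dispatch layer: decoded messages are routed by
// address to handlers that update the live scene. Addresses are illustrative.
type OscMessage = { address: string; args: number[] };
type Handler = (args: number[]) => void;

class OscRouter {
  private handlers = new Map<string, Handler>();

  on(address: string, handler: Handler): void {
    this.handlers.set(address, handler);
  }

  /** Dispatch one decoded message; returns true if a handler matched. */
  dispatch(msg: OscMessage): boolean {
    const handler = this.handlers.get(msg.address);
    if (!handler) return false;
    handler(msg.args);
    return true;
  }
}

// Usage sketch: wire venue-camera input into a scene parameter.
// const router = new OscRouter();
// router.on("/camera/1/presence", ([v]) => scene.setGestureValue(v));
// socket.onmessage = (e) => router.dispatch(decodeOsc(e.data));
```

Separating transport from routing also makes it easy to swap the input source, which is exactly what allowed venue cameras and laptop tracking to feed the same scenes.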
The prototype also supported MediaPipe-based hand and body tracking on a laptop, which made it useful as a portable testing and demo environment during development. Alongside the live Outernet integration path, this helped create a flexible system that could be tested in different contexts while still feeding into the final installation pipeline.
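As one example of turning hand tracking into a control signal: MediaPipe Hands returns 21 normalised landmarks per hand, with the thumb tip at index 4 and the index fingertip at index 8, so a simple "pinch" value can be derived from the distance between them. The scaling constant below is a guess, and this is a sketch rather than the project's actual gesture logic.

```typescript
// Sketch: derive a 0..1 pinch signal from MediaPipe hand landmarks.
// Landmark indices 4 (thumb tip) and 8 (index fingertip) are per the
// MediaPipe Hands model; maxDist is an assumed tuning constant.
interface Landmark { x: number; y: number; z: number }

/** Map thumb-to-index distance to a 0..1 pinch value (1 = fully pinched). */
function pinchAmount(landmarks: Landmark[], maxDist = 0.2): number {
  const thumb = landmarks[4];
  const index = landmarks[8];
  const d = Math.hypot(thumb.x - index.x, thumb.y - index.y, thumb.z - index.z);
  return Math.min(1, Math.max(0, 1 - d / maxDist));
}
```

A scalar like this can then drive the same scene parameters as the venue-camera input, keeping the two input paths interchangeable.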
IMPACT
A proof of concept developed within a large-scale interactive installation for Outernet London, exploring a new technical format for the venue: web-based interactive Gaussian splat experiences driven by gesture-controlled animation.
TECHNICAL DETAILS
Web-based immersive portal system built with React, TypeScript, Vite, and PlayCanvas.
Interactive Gaussian splat environments with gesture-controlled shader animation, custom textures, audio, and camera behaviour.
Integrated with Outernet camera systems through OSC and WebSockets for real-time control.
Used both as a browser-based prototype and as part of the wider installation workflow, including executable launches on Outernet machines.
RESPONSIBILITIES
Creative Technology Development: Built and developed the proof of concept as part of a larger interactive installation pipeline, focusing on real-time behaviour, interaction, and technical feasibility.
Real-Time 3D Experience Development: Created and integrated immersive portal scenes in PlayCanvas, including scene logic, transitions, camera behaviour, shaders, and runtime control systems.
Interaction and Visual Innovation: Developed a gesture-controlled interaction system that drove animation and visual change within Gaussian splat environments inside a web-based application, forming a key part of the project’s technical novelty.
Camera and Venue Integration: Connected the experience to Outernet’s camera systems via OSC and WebSockets, enabling real-time control of the scenes using live input from the space.
On-Site Technical Testing: Tested the system on the ground at Outernet, validating that the camera-driven interactions worked reliably within the venue environment.
Remote Launch and Installation Support: Remotely accessed Outernet machines to launch executable builds that drove the screen-based experiences, which were layered with other visual elements across the installation.
Prototype and Demo Workflow: Used MediaPipe-based hand and body tracking on a laptop as part of the prototype workflow, allowing the experience to be tested and demonstrated outside the final installation environment.
