Physicalization of AR/XR Experiences
Overview
AR experience design in 2021 was rich in theory and thin in practice. The tools existed, but building with them was the only way to understand them, and there was no shared vocabulary for what each mechanic does, what it costs, or what design decisions it forces.
The thesis became a system of six physicalized prototyping tools, one per AR mechanic. Each tool makes a specific technique legible: what it can do, where it fails, and what it demands of the physical environment around it. The goal was a working vocabulary for designing with AR constraints firsthand.
How It Started
You can't design with a tool you don't understand
The project started as an attempt to design finished AR experiences. Working with unfamiliar tools quickly exposed a gap between imagining an AR experience and knowing how to build one, and conceptual work alone wasn't closing it.
The first prototype made this concrete. We built modular cubes using 3D masking and image mapping, each face a window into a small AR room. It worked, but it took far longer than expected, and most of what we learned was about the tools themselves.
The goal shifted: build one physicalized prototyping tool per AR mechanic. Each tool would teach how a specific technique behaves and what design decisions it forces.
Methods
- AR landscape research and tool exploration
- 3D modeling and virtual environment development (Unity, Cinema 4D)
- Physical computing and sensor integration (Arduino, ESP32, Raspberry Pi)
- Fabrication via laser cutting and 3D printing
- Iterative prototyping and material testing
- Capstone presentation and exhibition
The Prototyping Tools
Single Image Mapping
Surface finish determines whether AR tracking holds. We tested a range of materials as image tracking anchors, evaluating contrast and reliability. The physical material becomes a design decision with direct consequences for digital behavior.
3D Masking
The tool renders a physical object invisible to the camera and replaces it with a virtual interior. Scale and material of the physical form shape the illusion and determine its failure points.
Multiple Image Mapping
Simultaneous tracking across multiple image targets. We tested how overlays relate spatially when they share a frame, and how physical design choices in target size and contrast affect coherence across the layout.
Processing Effects: Glass
Real-time shader effects applied directly to the live camera feed. A glass refraction effect treats the physical surface as though it has optical properties, changing how the camera reads it without altering the underlying geometry.
Reactive Lighting: Light Sensor
A light sensor wired to an Arduino streams readings over Bluetooth to control a virtual lamp in real time. Covering the sensor toggles the lamp, making natural sunlight an active design input.
Physical Control: Potentiometer
A dial wired through the same Arduino-to-Bluetooth stack controls the speed of a virtual fan. An analog gesture maps directly to a digital animation parameter, with no abstraction between the two.
Tool Details
What We Learned
Material is part of the design
In every tool, surface finish, form factor, and contrast directly determined how the digital layer behaved. Tracking reliability, occlusion quality, and sensor performance all followed from physical decisions. The material is an active design input.
Build to understand
Each tool revealed constraints and failure modes that were invisible before construction. Documentation describes AR mechanics; building with them teaches what they actually do. The building was the research.
Each mechanic has a physical signature
Image mapping, 3D masking, shader effects, and sensor inputs each impose different requirements on the physical environment. Understanding those requirements is what makes AR design decisions actually designable.
The First Attempt
Pre-pivot concepts & first prototype
The original direction
The starting point was a finished AR experience: a narrative space where physical objects connected to virtual stories.
The concepts were developed in detail before building started. The first prototype, a 3D masking cube with a virtual office room inside, took far longer than expected and taught us mostly about the tool. That gap between intention and execution was the real finding, and it redirected the project.
Process — Form
Modeling the cube
Process — Material
Surface and finish testing
Process — Assembly
Building the six tools
Process — In Use
AR captures, indoors and out
Outcomes
6
Physicalized prototyping tools covering image mapping, 3D masking, shader effects, and physical sensor inputs. A reference set for designing with AR constraints firsthand.
4
Disciplines bridged across the project: visual design, physical fabrication, electronics prototyping, and real-time software development. Each tool required all four.
1
Live capstone exhibition where visitors engaged directly with each mechanic rather than reading about it. The tools worked as both research output and demonstration artifacts.
Reflection
The consistent finding across all six tools was that AR's design problems are relational: how virtual content sits within a physical form, responds to physical light, moves when the object moves. These are composition problems, and you can't reason your way to them. You find them by building.
The tools exist because there was no other way to have that conversation. Figma can't simulate what happens when a mapped texture breaks at a physical edge. A prototype can. That's the argument the thesis is making in material form: that designing for AR requires a different kind of design literacy, one that has to be earned through making, not just studying.