Carnegie Mellon University · Bachelor of Design 2021 – 2022

Physicalization
of AR/XR
Experiences

Institution
Carnegie Mellon University
Program
Bachelor of Design · Hybrid Environments
Collaborator
Franklin Guttman
Tools
Unity · Cinema 4D · Arduino · ESP32 · Raspberry Pi · Laser cutting · 3D printing

AR experience design in 2021 was rich in theory and thin in practice. The tools existed, but building with them was the only way to understand them, and there was no shared vocabulary for what each mechanic does, what it costs, or what design decisions it forces.

The thesis became a system of six physicalized prototyping tools, one per AR mechanic. Each tool makes a specific technique legible: what it can do, where it fails, and what it demands of the physical environment around it. The goal was a working vocabulary for designing with AR constraints firsthand.

How It Started

You can't design with a tool you don't understand

The project started as an attempt to design finished AR experiences. Working with unfamiliar tools quickly exposed a gap: there was a significant distance between imagining an AR experience and knowing how to build one, and conceptual work alone wasn't closing it.

The first prototype made this concrete. We built modular cubes using 3D masking and image mapping, each face a window into a small AR room. It worked, but it took far longer than expected, and most of what we learned was about the tools themselves.

The goal shifted: build one physicalized prototyping tool per AR mechanic. Each tool would teach how a specific technique behaves and what design decisions it forces.

Methods

  • AR landscape research and tool exploration
  • 3D modeling and virtual environment development (Unity, Cinema 4D)
  • Physical computing and sensor integration (Arduino, ESP32, Raspberry Pi)
  • Fabrication via laser cutting and 3D printing
  • Iterative prototyping and material testing
  • Capstone presentation and exhibition
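For the sensor-integration work above, the usual first step on a microcontroller is taming noisy analog readings before they drive anything virtual. A hedged sketch in plain C++ of one common pattern, exponential smoothing plus hysteresis; the constants and struct name are illustrative, not values from the project:

```cpp
#include <cassert>

// Exponential moving average plus hysteresis: a common pattern for turning
// a noisy analog sensor (e.g. an ESP32 ADC pin) into a stable on/off
// trigger for a virtual event. All constants here are illustrative.
struct SensorTrigger {
    float alpha = 0.2f;     // EMA smoothing factor (0..1; higher = less smoothing)
    float onLevel = 600.f;  // smoothed value rises above this -> triggered
    float offLevel = 400.f; // smoothed value falls below this -> released
    float ema = 0.f;        // current smoothed reading
    bool active = false;

    // Feed one raw reading; returns the debounced on/off state.
    bool update(float raw) {
        ema += alpha * (raw - ema);
        if (!active && ema > onLevel) active = true;
        else if (active && ema < offLevel) active = false;
        return active;
    }
};
```

On-device, `update()` would be fed from the board's analog read in the main loop; the gap between the on and off levels keeps the AR layer from flickering when readings hover near a single threshold.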

Material is part of the design

In every tool, surface finish, form factor, and contrast directly determined how the digital layer behaved. Tracking reliability, occlusion quality, and sensor performance all followed from physical decisions. The material is an active design input.

Build to understand

Each tool revealed constraints and failure modes that were invisible before construction. Documentation describes AR mechanics; building with them teaches what they actually do. The building was the research.

Each mechanic has a physical signature

Image mapping, 3D masking, shader effects, and sensor inputs each impose different requirements on the physical environment. Understanding those requirements is what makes AR experiences actually designable.

The First Attempt

Pre-pivot concepts & first prototype

The original direction

The starting point was a finished AR experience: a narrative space where physical objects connected to virtual stories.

The concepts were developed in detail before building started. The first prototype, a 3D masking cube with a virtual office room inside, took far longer than expected and taught us mostly about the tool. That gap between intention and execution was the real finding, and it redirected the project.

Concept sketches: various AR container shapes, frames, and architectural form explorations
Sketch: single augmented story overlaying a related physical artifact — Korean ceramics (Buncheong)
Sketch: architectural interior perspective with layered augmented text overlaid on walls and floor
Sketch: AR-enabled building model concept — each room tells its own animated story
Sketch: modular cube assembly diagram showing how units combine to form a virtual interior structure
Sketch: image map cube variations — door and window options with image tracking anchor placements
3D render: isometric view of the chaotic office room concept in grayscale — early layout exploration
3D render: warm-lit office room with toppled furniture — refined material and lighting pass
3D render: close-up inside the office room — warm directional light with scattered papers
Unity editor: cel-shaded office scene in development — the virtual room that would go inside the first cube
Unity editor: 3D masking prototype scenes positioned outdoors — testing virtual content at real-world scale
The first physical iterations: cardboard box prototypes — rapid tests of the cube form before committing to materials

Outcomes

6

Physicalized prototyping tools covering image mapping, 3D masking, shader effects, and physical sensor inputs. A reference set for designing with AR constraints firsthand.

4

Disciplines bridged across the project: visual design, physical fabrication, electronics prototyping, and real-time software development. Each tool required all four.

1

Live capstone exhibition where visitors tested each mechanic firsthand, engaging with the prototypes directly rather than reading about them. The tools worked as both research output and demonstration artifacts.

The consistent finding across all six tools was that AR's design problems are relational: how virtual content sits within a physical form, responds to physical light, moves when the object moves. These are composition problems, and you can't reason your way to them. You find them by building.

The tools exist because there was no other way to have that conversation. Figma can't simulate what happens when a mapped texture breaks at a physical edge. A prototype can. That's the argument the thesis is making in material form: that designing for AR requires a different kind of design literacy, one that has to be earned through making, not just studying.
