AR/VR · Research · IEEE 2024

Virtual Interviewer

A first-author research project exploring how augmented reality overlays can assist real-time additive manufacturing workflows — reducing error and bridging the gap between digital model and physical output.

Year 2024
Role Lead Researcher & Developer
Published IEEE Conference 2024
Category
AR/VR Research C# Unity
Stack
Meta XR SDK MRTK Unity

Project cover image

What was built

This project investigates how augmented reality can serve as a real-time guidance layer for additive manufacturing (3D printing) processes. By overlaying digital model data directly onto the physical build environment, operators can catch spatial misalignments, layer errors, and calibration drift before they compound.

The system was developed using Unity and the Meta XR SDK, with spatial anchoring via MRTK to lock overlays to physical printer coordinates. The work was accepted as a first-author paper at an IEEE conference in 2024.

"The gap between a perfect digital model and an imperfect physical print is where AR lives — and where this system intervenes."

1st

Author position

IEEE

Publication venue

2024

Published

The challenge

Additive manufacturing involves precise layer-by-layer deposition, where small spatial errors early in a print compound into significant failures. Operators typically rely on 2D screen-based monitoring tools that lack spatial context and require constant cross-referencing with the physical object.

The core question: can an AR headset provide a more intuitive, spatially grounded monitoring experience that reduces cognitive load and catches errors earlier?

Problem diagram

Traditional monitoring workflow

AR overlay concept

Proposed AR intervention

How it was made

Development began with spatial anchoring — establishing a reliable coordinate transform between the AR headset's world space and the physical printer. This involved custom calibration routines using printed fiducial markers and MRTK's spatial mapping pipeline.
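The calibration step described above amounts to solving for a world-to-printer coordinate transform from a detected fiducial pose. A minimal sketch of that math in Unity C#, assuming a marker detector that reports each fiducial's pose in headset world space (the class and field names here are illustrative, not the project's actual API):

```csharp
using UnityEngine;

public class FiducialCalibrator : MonoBehaviour
{
    // Known pose of the printed fiducial in the printer's coordinate frame,
    // measured when the marker was fixed to the build plate (assumed values).
    [SerializeField] private Vector3 markerPositionInPrinter;
    [SerializeField] private Quaternion markerRotationInPrinter = Quaternion.identity;

    // Computes the transform mapping printer coordinates into Unity world
    // space, given the marker's detected world-space pose.
    public Matrix4x4 SolvePrinterToWorld(Pose markerPoseInWorld)
    {
        // world_T_marker: where the headset saw the marker.
        Matrix4x4 worldFromMarker = Matrix4x4.TRS(
            markerPoseInWorld.position, markerPoseInWorld.rotation, Vector3.one);

        // printer_T_marker: where the marker physically sits on the printer.
        Matrix4x4 printerFromMarker = Matrix4x4.TRS(
            markerPositionInPrinter, markerRotationInPrinter, Vector3.one);

        // world_T_printer = world_T_marker * marker_T_printer
        return worldFromMarker * printerFromMarker.inverse;
    }
}
```

In practice a routine like this would be re-run periodically, since headset spatial drift (discussed below under lessons learned) slowly invalidates any one-shot solve.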

The overlay system then renders the expected digital model geometry in real time, aligned to the current print stage. A layer progress tracker highlights areas where the physical output deviates from the digital target, color-coded by severity.
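The severity color-coding could be as simple as banding the measured deviation into tolerance ranges. A hedged sketch in C# (the thresholds are assumptions for illustration; the system's actual values are not given here):

```csharp
using UnityEngine;

public static class DeviationColor
{
    // Assumed severity bands for layer deviation, in millimetres.
    private const float MinorMm = 0.2f;
    private const float MajorMm = 0.8f;

    // Maps an absolute deviation between physical output and the digital
    // target to an overlay color: green (in tolerance), yellow (minor
    // deviation), red (major deviation).
    public static Color ForDeviation(float deviationMm)
    {
        float d = Mathf.Abs(deviationMm);
        if (d < MinorMm) return Color.green;
        if (d < MajorMm) return Color.yellow;
        return Color.red;
    }
}
```

A discrete banding like this keeps the overlay legible at a glance, which matters on hardware with a narrow field of view.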

User testing was conducted with five participants from engineering backgrounds, using a think-aloud protocol and structured interview sessions.

System architecture diagram

Spatial anchoring and overlay pipeline

Calibration UI

Fiducial calibration interface

Deviation heatmap

Real-time deviation overlay

What it achieved

Participants consistently found the spatial overlay more intuitive than screen-based monitoring for detecting geometric deviations, particularly for complex multi-axis prints. The paper was accepted at IEEE 2024, contributing a novel application of AR to the manufacturing domain.

Key findings included improved error detection rates in early print stages, reduced head-down time at monitoring screens, and a strong preference for the AR modality among users with prior 3D printing experience.

What I learned

This project deepened my understanding of the constraints of consumer AR hardware — latency, field of view, and spatial drift are real limitations that design must work around, not assume away. The spatial anchoring problem alone consumed a significant portion of the development time.

It also reinforced how HCI methods translate directly into better engineering decisions: user testing revealed calibration UX issues that would have been invisible from a purely technical perspective.

If revisiting, I would invest earlier in a more robust anchoring solution and explore multi-user co-presence for collaborative manufacturing environments.
