EMPATHAR

Real-time social battery detection — MediaPipe AR overlay in the browser

ROLE Designer & Developer
TIMELINE 2026
STACK MediaPipe · WebGL · WebAssembly · Three.js · Vanilla JS
RUNS Fully on-device · No server · No data leaves the browser

OVERVIEW

EmpathAR uses your camera to detect posture, expression, and movement in real time, translating them into a live Social Battery overlay rendered directly over the people in your frame. Most of the cues that make up social intelligence are invisible. EmpathAR makes those signals visible, without telling you what to feel.

WHAT WE BUILT

Real-time AR overlay — Social Battery score and state rendered over live camera feed using WebGL

Three-signal model — posture (33 body landmarks), expression (52 facial blendshapes), and movement velocity blended into a single 0–100% score

Asymmetric smoothing — charging is slow and must be sustained; draining is fast, mirroring how social energy actually works

Five named states — Energized, Engaged, Present, Fading, Needs Space — each with a contextual action prompt

Multi-person tracking — persistent P1/P2/P3 labels that survive people leaving and re-entering the frame using shirt-color signatures

Fully on-device — Google MediaPipe runs locally via WebAssembly; no video, images, or biometric data ever leaves the browser

Toast notification system — contextual tips fire per-person as battery state changes
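The battery math above can be sketched in a few lines of vanilla JS. The signal weights, easing rates, and state thresholds below are illustrative assumptions, not the app's shipped values:

```javascript
// Blend three normalized signals (each 0–1) into a raw battery score (0–100).
// Weights are illustrative assumptions, not the production values.
const WEIGHTS = { posture: 0.4, expression: 0.4, movement: 0.2 };

function rawScore({ posture, expression, movement }) {
  return 100 * (
    WEIGHTS.posture * posture +
    WEIGHTS.expression * expression +
    WEIGHTS.movement * movement
  );
}

// Asymmetric smoothing: ease slowly toward higher scores (charging must be
// sustained) and quickly toward lower ones (draining is fast).
const CHARGE_RATE = 0.03; // per-frame easing factor when rising (assumed)
const DRAIN_RATE = 0.25;  // per-frame easing factor when falling (assumed)

function smooth(current, raw) {
  const rate = raw > current ? CHARGE_RATE : DRAIN_RATE;
  return current + (raw - current) * rate;
}

// Map a smoothed score onto the five named states (thresholds assumed).
function stateFor(score) {
  if (score >= 80) return 'Energized';
  if (score >= 60) return 'Engaged';
  if (score >= 40) return 'Present';
  if (score >= 20) return 'Fading';
  return 'Needs Space';
}
```

With these rates, a battery sitting at 50 rises by only 1.5 points in a frame where the raw score is 100 (`smooth(50, 100)` → 51.5), but drops by 12.5 in a frame where it is 0 (`smooth(50, 0)` → 37.5), so a single good frame cannot spike the score while disengagement registers quickly.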

HERO — AR overlay in use
Social Battery states UI
Multi-person tracking overlay

APPROACH

Two MediaPipe models run simultaneously per frame: a pose landmarker and a face landmarker. Their outputs are weighted and fused into a single battery score using an asymmetric blend algorithm — raw scores ease gradually into the current value rather than snapping instantly, preventing flicker from a single outlier frame. Person identity is maintained between frames by sampling the average color of each person's upper torso and comparing it to stored signatures, so labels persist even when someone steps out of frame.
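The shirt-color re-identification step can be sketched as follows. The distance metric, match threshold, and signature-blending rate here are illustrative assumptions rather than the app's actual implementation:

```javascript
// Re-identify people between frames by comparing the average color sampled
// from each detection's upper torso to stored per-person signatures.
// MATCH_THRESHOLD and the 0.9/0.1 blend rate are illustrative assumptions.
const MATCH_THRESHOLD = 60; // max Euclidean RGB distance to count as a match

function colorDistance(a, b) {
  return Math.hypot(a.r - b.r, a.g - b.g, a.b - b.b);
}

class PersonRegistry {
  constructor() {
    this.signatures = []; // { label, color } in order of first appearance
  }

  // Return a stable label (P1, P2, …) for a sampled torso color,
  // registering a new person when no stored signature is close enough.
  labelFor(color) {
    let best = null;
    let bestDist = Infinity;
    for (const sig of this.signatures) {
      const d = colorDistance(color, sig.color);
      if (d < bestDist) { bestDist = d; best = sig; }
    }
    if (best && bestDist <= MATCH_THRESHOLD) {
      // Blend the stored signature toward the new sample so it tracks
      // gradual lighting changes without drifting on a single frame.
      best.color = {
        r: best.color.r * 0.9 + color.r * 0.1,
        g: best.color.g * 0.9 + color.g * 0.1,
        b: best.color.b * 0.9 + color.b * 0.1,
      };
      return best.label;
    }
    const label = `P${this.signatures.length + 1}`;
    this.signatures.push({ label, color: { ...color } });
    return label;
  }
}
```

Because the signature outlives the detection rather than the track, someone who steps out of frame and returns in the same shirt maps back to their original label instead of being assigned a new one.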

OPEN APP → VIEW CODE →