
Role: Research, Interface Design, UI Engineering

OptaBlate treats spinal tumors with radiofrequency ablation: probes inserted into the tumor, controlled heat, real-time monitoring of temperature, power, and resistance across every probe simultaneously. The console is the thing keeping that process legible while a surgeon operates.

I led research, design, and interface engineering for that console. The brief, if you stripped it down: the interface cannot be the thing a surgeon has to think about.

What Actually Matters in an OR

I learned the procedure thoroughly and worked closely with surgeons and clinicians. The research was about mental models: how a surgeon thinks through a procedure, where attention goes, what gets tuned out. A few things became non-negotiable fast:

  • Two seconds. That's the glance budget. If status isn't readable in two seconds, the design failed.
  • One free hand. The other is occupied. Every interaction has to work with one hand, and common actions can't require a reach.
  • No black boxes. Surgeons don't trust systems they can't read. Transparency isn't a nice-to-have. It's how trust works under pressure.
  • Alerts have to be unmistakable. Not alarming. Unmistakable. There's a difference.

The Decisions That Shaped It

The key structural call was organizing by probe, not by data type. Surgeons don't think in columns of temperatures. They think about probe 1, then probe 2. Horizontal rows matched that mental model and made status scannable without any hunting.
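The probe-first organization comes down to a data shape. As a minimal sketch — all names here (`ProbeStatus`, `Reading`) are hypothetical, not the console's actual code — each probe owns its full row of readings, rather than readings being grouped by metric:

```typescript
// Hypothetical model: status grouped per probe, not per metric.
// Type and field names are illustrative, not the console's API.
type Reading = { value: number; unit: string };

type ProbeStatus = {
  id: number;            // probe 1, probe 2, ...
  temperature: Reading;  // °C at the probe tip
  power: Reading;        // W delivered
  resistance: Reading;   // Ω across the probe circuit
};

// One row per probe: the UI renders these horizontally, so a glance
// walks probe 1, probe 2, ... in order — no hunting across columns.
const probes: ProbeStatus[] = [
  {
    id: 1,
    temperature: { value: 72, unit: "°C" },
    power: { value: 8, unit: "W" },
    resistance: { value: 310, unit: "Ω" },
  },
  {
    id: 2,
    temperature: { value: 68, unit: "°C" },
    power: { value: 7, unit: "W" },
    resistance: { value: 295, unit: "Ω" },
  },
];
```

The alternative — three metric-keyed arrays (all temperatures, all powers, all resistances) — forces the reader to cross-index, which is exactly the hunting the row layout was meant to eliminate.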

Color was purely semantic. Green means normal, yellow means watch it, red means act now. That applied to every element, including the stop control. No decorative color anywhere. If it's red, it means something.
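A purely semantic palette reduces to a small lookup. Sketched below with hypothetical state names, hex values, and thresholds — the thresholds especially are placeholders, not clinical values — the point is that every color in the UI resolves through one table, so nothing decorative can collide with it:

```typescript
// Hypothetical semantic color table; values are illustrative placeholders.
type AlertState = "normal" | "watch" | "act";

const SEMANTIC_COLOR: Record<AlertState, string> = {
  normal: "#2e7d32", // green: within expected range
  watch:  "#f9a825", // yellow: trending toward a limit
  act:    "#c62828", // red: intervene now (same red as the stop control)
};

// Deriving a state from a reading, e.g. temperature vs. thresholds.
// These cutoffs are invented for the sketch, not clinical guidance.
function stateFor(tempC: number): AlertState {
  if (tempC >= 95) return "act";
  if (tempC >= 85) return "watch";
  return "normal";
}
```

Because the only path to red is through the `act` state, "if it's red, it means something" holds by construction rather than by convention.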

The project went through five rounds of usability testing with twelve surgeons in simulated procedures. The interface is FDA-cleared and in clinical use.