
Body-Based Gestures

Just as touch can provide satisfaction in user interfaces, 3D cameras have opened up interaction design to include the whole body. I secured a collaboration agreement between Primesense and USC to work with Evan Suma on open-source software that acted as a translator between skeletal gestures captured by a Microsoft Kinect or Primesense device and mouse and keyboard commands. We initially focused on health and wellness applications such as stroke rehabilitation and breathing exercises. We then introduced the software to the gaming community through some fun and well-received videos, supported the thousands of users who downloaded it to control games, and built a prototype exploring the value of such interfaces in places like operating rooms.
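To give a sense of the kind of translation involved, here is a minimal sketch of binding a single body gesture to a held-down key. The joint layout, the coordinate convention, the lean threshold, and the press_key/release_key helpers are all assumptions for illustration; they are not the interface of the actual open-source tool described above.

```python
# Illustrative sketch: translating one skeletal gesture into a key press.
# Assumes a skeleton tracker (e.g. Kinect/OpenNI) that reports joint positions
# in meters, with x increasing to the user's left; the key-injection helpers
# are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class Joint:
    x: float  # meters, tracker coordinates
    y: float
    z: float


def lean_left(torso: Joint, head: Joint, threshold_m: float = 0.15) -> bool:
    """True when the head is displaced left of the torso by more than the threshold."""
    return (head.x - torso.x) > threshold_m


def update(skeleton: dict, press_key, release_key) -> None:
    """Map one pose sample to a keyboard binding, e.g. 'lean left -> strafe left'."""
    if lean_left(skeleton["torso"], skeleton["head"]):
        press_key("a")
    else:
        release_key("a")
```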


I find the design space of body-based interaction irresistible because it enables the mapping of the emotions of a gesture to the affordances of an interface. Working with Primesense on a consumer product reference design, I found that we needed to warp the coordinate space around a user to map it to meaning: for example, a far reach feels different from a short reach, and pulling toward the body feels different from pushing away. I have also researched ways to detect subtle non-verbal behaviors, such as breathing rate and fidgeting, with 3D cameras while working on a user-sensing project for DARPA with Skip Rizzo.
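As a rough illustration of that warping, the sketch below normalizes reach by arm length and applies a nonlinear curve so that short motions stay subtle while a full extension registers strongly, and so that pulling toward the body is distinguished from pushing away. The normalization, exponent, and sign convention are assumptions chosen for the example, not parameters from the reference design.

```python
# Illustrative sketch: warping reach distance into an interface value.
# Assumes a camera-facing user and depth (z) increasing away from the camera,
# so a hand pushed away from the body has a smaller z than the shoulder.

import math


def reach_amount(hand_z: float, shoulder_z: float, arm_length: float,
                 exponent: float = 2.0) -> float:
    """Normalized, nonlinearly warped reach.

    Positive values mean pushing away from the body; negative means pulling in.
    """
    raw = (shoulder_z - hand_z) / arm_length        # fraction of full arm extension
    raw = max(-1.0, min(1.0, raw))                  # clamp to the reachable range
    return math.copysign(abs(raw) ** exponent, raw)  # warp so small motions stay small
```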

Selected Publications

Motor Rehabilitation

Breath Rate Detection

Consumer Interface

April Fool’s

Meaningful Gestures

Hands-Free Control
