User Authored Live Controls
Satisfying Interaction: Touch-Based Metaphors
We think with our bodies as well as our minds. My designs leverage spatial cognition and proprioceptive memory to enable fluid, instinctive ways to discover meaning in data. I was awarded a Microsoft Research gift to prototype radically different interface metaphors for producing music and video content. Valuing the element of play in discovery, we began by shutting off the table-sized touch surface and pouring a bag of rice onto the screen. I videotaped students as they interacted with the rice and identified gestures with naturally associated meanings, such as gathering, cleaning up, or digging down. We then interviewed artists to map these gestures onto the vocabulary of digital editing tools and applied the results to the design approach.
We developed three new approaches to touch interfaces: one that lets users search for connections by ‘wiggling’ icons, much like tracing cables on a patch bay; another that provides a multi-touch zoomable user interface; and a third that lets users build interfaces themselves while working on the content, modifying their tools to fit the task at hand. This work led to a DARPA grant to explore ways of using the body to better analyze large datasets as part of the XDATA and, later, PLAN-X programs, integrating these touch techniques with our tablet-based hybrid immersive viewer – the inVRrse.
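The wiggle-to-connect idea can be sketched in code. This is a minimal, hypothetical illustration (not the actual implementation): when the user wiggles one icon, icons whose recent motion traces correlate strongly with the wiggle are treated as connected, the way jiggling a patch cable reveals which jack it feeds. All names, thresholds, and data here are illustrative assumptions.

```python
# Hypothetical sketch: match a wiggled icon to connected icons by
# correlating their motion traces. Names and thresholds are assumptions.

def normalized_correlation(a, b):
    """Pearson correlation of two equal-length motion traces."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    da = [x - mean_a for x in a]
    db = [y - mean_b for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return num / den if den else 0.0

def find_connections(wiggle_trace, icon_traces, threshold=0.8):
    """Return names of icons whose traces echo the user's wiggle."""
    return [name for name, trace in icon_traces.items()
            if normalized_correlation(wiggle_trace, trace) >= threshold]

# Toy example: 'reverb' moves in sync with the wiggle, 'delay' does not.
wiggle = [0, 3, -2, 4, -3, 2, 0, -1]
traces = {
    "reverb": [0, 3, -2, 4, -3, 2, 0, -1],   # synchronized motion
    "delay":  [1, 0, 1, 0, 1, 0, 1, 0],      # unrelated jitter
}
print(find_connections(wiggle, traces))  # → ['reverb']
```

In a real interface the traces would be per-frame icon displacements over a short window, but the design point is the same: the connection is discovered through motion rather than menus.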
Uncovering Natural Gestures
Immersive Touch, Physiological Testing