Investigating Sensory Extensions as Input for Interactive Simulations

Chris Hill*, Casey Hunt*, Sammie Crowder, Brett L. Fielder, Emily B. Moore, Ann Eisenberg

(The first two authors contributed equally to this publication)


Sensory extensions enhance our awareness by transforming variations in stimuli normally undetectable by human senses into perceivable outputs. Similarly, interactive simulations for learning promote an understanding of abstract phenomena. Multimodal experiences combining sensory extension devices with interactive simulations give users the novel opportunity to connect their sensory experiences in the physical world to computer-simulated concepts. We explore this opportunity by designing a suite of wearable sensory extension devices that interface with a uniquely inclusive PhET Simulation, Ratio and Proportion. In this simulation, two on-screen hands can be moved to various heights, with each pair of heights representing a different mathematical ratio. Users explore changing hand heights to find and maintain target ratios through visual and auditory feedback. Our sensory extension devices translate force, distance, sound frequency, and magnetic field strength into quantitative values that control the individual hands in the computer simulation.


For this exploration of sensory extensions as input to interactive simulations, we designed devices that allow users to perceive the ratio between weights, sound frequencies, distances, and/or magnetic fields as input for the Ratio and Proportion simulation. These devices enhance users’ senses by giving them a quantified output of the ratio between phenomena that are not ordinarily perceived as numerical values or are not naturally comparable. For example, users do not typically have a natural sense of the ratio between a sound frequency and a distance. With our design, the ratio between these phenomena is displayed in the Ratio and Proportion simulation. Each sensory extension device consists of a sensor, a device control unit, and an armband. Each sensor is placed in a corresponding 3D-printed wearable housing and connected to the control unit. The central control unit (Fig. 3) houses a rechargeable battery and a Bluetooth microcontroller that sends data to the Ratio and Proportion simulation. Each housing uses color, icons, and labels to indicate its sensor type and which simulation hand (left or right) the unit controls.
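To make the sensor-to-simulation mapping concrete, the sketch below shows one plausible way raw sensor readings could be normalized into the hand values the simulation consumes. This is an illustrative sketch, not the authors' firmware: the calibration ranges, function names, and the assumption that each hand takes a value in [0.0, 1.0] are all ours.

```python
def normalize(raw, lo, hi):
    """Clamp a raw sensor reading to [lo, hi] and map it linearly to [0.0, 1.0]."""
    raw = max(lo, min(hi, raw))
    return (raw - lo) / (hi - lo)

# Hypothetical calibration ranges for two of the four sensor types.
FORCE_RANGE = (0.0, 50.0)      # newtons; assumed span of the force sensor
DISTANCE_RANGE = (2.0, 200.0)  # centimeters; assumed span of the distance sensor

def hands_from_sensors(force_n, distance_cm):
    """Map a force reading and a distance reading to (left, right) hand values."""
    left = normalize(force_n, *FORCE_RANGE)
    right = normalize(distance_cm, *DISTANCE_RANGE)
    return left, right

# Example: mid-range force, minimum distance.
left, right = hands_from_sensors(25.0, 2.0)  # → (0.5, 0.0)
```

Because each sensor is normalized to the same unit interval before reaching the simulation, otherwise incommensurable phenomena (e.g., a sound frequency and a distance) become directly comparable as a ratio of hand heights.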