PUBLICATIONS

Rakesh Patibanda, Xiang Li, Yuzheng Chen, Aryan Saini, Chris Hill, Elise van den Hoven, and Florian ’Floyd’ Mueller. 2021. Actuating Myself: Designing Hand-Games Incorporating Electrical Muscle Stimulation. In CHI PLAY ’21: ACM Annual Symposium on Computer-Human Interaction in Play.

Playing hand-games such as Rock-paper-scissors or Thumb-war involves motor movements. These games are believed to benefit both physical and mental health and are considered cultural assets. Electrical Muscle Stimulation (EMS) is a technology that can actuate muscles to trigger motor movements, and hence offers an opportunity for novel play experiences based on these traditional hand-games. However, there is only limited understanding of how to design EMS games. In this paper, we present two games inspired by traditional hand-games, “Slap-me-if-you-can” and “3-4-5”, which incorporate EMS and, in contrast to traditional games, can be played alone. We discuss the design of these games and three themes: a) Gameplay experiences and the influence of EMS hardware, b) Interaction with EMS and the calibration process, and c) Shared control and its effect on playing EMS games. We hope that an enhanced understanding of the potential of EMS to support hand-games can aid the advancement of movement-based games as a whole.

Chris Hill, Michael Schneider, Ann Eisenberg, and Mark Gross. 2021. The ThreadBoard: Designing an E-Textile Rapid Prototyping Board. In Proceedings of The Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’21).

E-textiles, which embed circuitry into textile fabrics, blend art and creative expression with engineering, making them a popular choice for STEAM classrooms [6, 12]. Currently, e-textile development relies on tools intended for traditional embedded systems, which use printed circuit boards and insulated wires. These tools do not translate well to e-textiles, which use fabric and uninsulated conductive thread. This mismatch of tools and materials can make the development process overly complicated for novices. In particular, rapid prototyping tools for traditional embedded systems are poorly matched to e-textile prototyping. This paper presents the ThreadBoard, a tool that supports rapid prototyping of e-textile circuits. With rapid prototyping, students can test circuit designs and identify circuitry errors before sewing their project. We present the design process used to iteratively create the ThreadBoard’s layout, with the goal of improving its usability for e-textile creators.

Chris Hill, Michael Schneider, Mark Gross, Ann Eisenberg, and Arielle Blum. 2020. A Wearable Meter That Actively Monitors the Continuity of E-Textile Circuits as They Are Sewn. In Proceedings of FabLearn 2020.

The e-textile landscape has enabled creators to combine textile materiality with electronic capability. However, the tools that e-textile creators use have been adapted from traditional textile or hardware tools. This puts creators at a disadvantage, as e-textile projects present new and unique challenges that currently can only be addressed with a non-specialized toolset. This paper introduces the first iteration of a wearable e-textile debugging tool that assists novice engineers in troubleshooting e-textile circuitry errors. These errors are often detected only after the project is fully built and can be resolved only by disassembling the circuit. Our tool actively monitors the continuity of the conductive thread as the user stitches, enabling the user to identify and correct circuitry errors as they create their project.

Michael Schneider, Chris Hill, Mark Gross, Ann Eisenberg, and Arielle Blum. 2020. A Software Debugger for E-textiles and Arduino Microcontrollers. In Proceedings of FabLearn 2020.

When learning to code, a student must learn both how to create a program and how to debug it. Novices often start with print statements to help trace code execution and isolate logical errors. Eventually, they adopt advanced debugging practices such as breakpoints, “stepping” through code execution, and “watching” variables as their values are updated. Unfortunately for students working with Arduino devices, there are no debugging tools built into the Arduino IDE. Instead, a student would have to move to a professional IDE such as Atmel Studio and/or acquire a hardware debugger. However, these options have a steep learning curve and are not intended for a student who has just started to learn how to write code. I am developing an Arduino software library, called Pin Status, to assist novice programmers in debugging common logic errors and to provide features specific to the e-textile microcontroller, the Adafruit Circuit Playground Classic. These features include a breakpoint method that pauses an Arduino program’s execution and offers, via Serial communication, a menu for viewing and/or updating the current values of digital pins and “watched” variables. On the Adafruit Circuit Playground Classic, the library also uses the on-board LEDs to show the current value of each digital pin (High/Low). This work has been funded by NSF STEM+C, award #1742081.

Annie Kelly, Christine Chang, Chris Hill, Mary West, Mary Yoder, Joseph Polman, Shaun Kane, Michael Eisenberg, and R. Benjamin Shapiro. 2020. “Our Dog Probably Thinks Christmas Is Really Boring”: Re-mediating Science Education for Feminist-inspired Inquiry. ICLS 2020:  Proceedings of the 15th International Conference of the Learning Sciences.

Feminist science approaches recognize the value of integrating empathy, closeness, subjectivity, and caring into scientific sensemaking. These approaches reject the notion that scientists must be objective and dispassionate, and they expand the possibilities of what is considered valuable scientific knowledge. One avenue for engaging people in empathetically driven scientific inquiry is through learning activities about how our pets experience the world. In this study, we developed an augmented reality device we called DoggyVision that lets people see the world approximately as dogs do. We designed a scavenger hunt in which families explored indoor and outdoor environments with DoggyVision, collected data firsthand, and drew conclusions about the differences between how humans and dogs see the world. In this paper, we illustrate how our DoggyVision workshop re-mediated scientific inquiry and supported the integration of feminist practices into scientific sensemaking.

Carlos Pinedo, Jordan Dixon, Christine Chang, Donna Auguste, Mckenna Brewer, Cassidy Jensen, Chris Hill, Devin Desilva, Amanda N. Jones, Allison P. Anderson, and James S. Voss. 2019. Development of an Augmented Reality System for Human Space Operations. In Proceedings of The 49th International Conference on Environmental Systems. 

In this work, we develop an augmented reality heads-up display for astronaut use during human space operations. The work takes advantage of recent advances in commercial heads-up display technology to simulate information delivery to the astronaut. The primary design objectives were to increase situation awareness, provide timely information to the user and supporting personnel, and facilitate communication among all system elements (user, ground control, and intravehicular astronauts). The design includes a visual interface that provides on-demand information in both egocentric (fixed to the user) and exocentric (fixed to the environment) perspectives. The information includes spacesuit informatics, checklist procedures, communication information, and basic navigation. The design also includes an audio interface that receives verbal commands from the user and provides auditory feedback and information. A novel method of interacting with the augmented reality system was explored: electromyography, which captures the electrical signals produced by muscle groups on the user’s body and maps them to specific inputs for the augmented reality system. In this way, the user’s hands and voice are free to complete other tasks as necessary while still maintaining a mode of communication with, and control of, the device. To aid communication among all elements, remote display control via telestration (the ability of a remote user, such as ground control or another astronaut, to draw over a still or video image) was included. This provides a means of visual communication to facilitate task completion, aid in emergency situations, and highlight anomalies, thereby increasing user situation awareness and decreasing workload. Additional capability was provided for object-tool recognition and basic navigation assistance. Preliminary testing highlighted the potential benefits of the following critical design elements: a minimalistic visual display, redundant interaction modalities, and continuity between internal and external display elements.