Rakesh Patibanda, Xiang Li, Yuzheng Chen, Aryan Saini, Chris Hill, Elise van den Hoven, and Florian ’Floyd’ Mueller. 2021. Actuating Myself: Designing Hand-Games Incorporating Electrical Muscle Stimulation. In CHI PLAY ’21: ACM Annual Symposium on Computer-Human Interaction in Play.
Chris Hill, Michael Schneider, Ann Eisenberg, and Mark Gross. 2021. The ThreadBoard: Designing an E-Textile Rapid Prototyping Board. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’21).
Chris Hill, Michael Schneider, Mark Gross, Ann Eisenberg, and Arielle Blum. 2020. A Wearable Meter That Actively Monitors the Continuity of E-Textile Circuits as They Are Sewn. In Proceedings of FabLearn 2020.
The e-textile landscape has enabled creators to combine textile materiality with electronic capability. However, the tools that e-textile creators use have been adapted from traditional textile or hardware toolsets. This puts creators at a disadvantage, as e-textile projects present new and unique challenges that currently can be addressed only with non-specialized tools. This paper introduces the first iteration of a wearable e-textile debugging tool that assists novice engineers in resolving e-textile circuitry errors. Such errors are often detected only after the project is fully built and can be resolved only by disassembling the circuit. Our tool actively monitors the continuity of the conductive thread as the user stitches, enabling them to identify and correct circuitry errors as they create their project.
Michael Schneider, Chris Hill, Mark Gross, Ann Eisenberg, and Arielle Blum. 2020. A Software Debugger for E-textiles and Arduino Microcontrollers. In Proceedings of FabLearn 2020.
When learning to code, a student must learn both how to create a program and how to debug it. Novices often start with print statements to trace code execution and isolate logical errors. Eventually, they adopt advanced debugging practices such as setting breakpoints, “stepping” through code execution, and “watching” variables as their values update. Unfortunately for students working with Arduino devices, there are no debugger tools built into the Arduino IDE. Instead, a student would have to move to a professional IDE like Atmel Studio and/or acquire a hardware debugger. However, these options have a steep learning curve and are not intended for a student who has just started to learn how to write code. I am developing an Arduino software library, called Pin Status, to assist novice programmers with debugging common logic errors and to provide features specific to the e-textile microcontroller, the Adafruit Circuit Playground Classic. These features include a breakpoint method that pauses an Arduino program’s execution and offers, via Serial communication, a menu for viewing and/or updating the current values of digital pins and “watched” variables. On the Adafruit Circuit Playground Classic, the library also uses the on-board LEDs to show the current value of each digital pin (HIGH/LOW). This work has been funded by NSF STEM+C, award #1742081.
Annie Kelly, Christine Chang, Chris Hill, Mary West, Mary Yoder, Joseph Polman, Shaun Kane, Michael Eisenberg, and R. Benjamin Shapiro. 2020. “Our Dog Probably Thinks Christmas Is Really Boring”: Re-mediating Science Education for Feminist-inspired Inquiry. ICLS 2020: Proceedings of the 15th International Conference of the Learning Sciences.
Carlos Pinedo, Jordan Dixon, Christine Chang, Donna Auguste, Mckenna Brewer, Cassidy Jensen, Chris Hill, Devin Desilva, Amanda N. Jones, Allison P. Anderson, and James S. Voss. 2019. Development of an Augmented Reality System for Human Space Operations. In Proceedings of the 49th International Conference on Environmental Systems.
In this work we develop an augmented reality heads-up display for astronaut use during human space operations. This work takes advantage of recent advances in commercial heads-up display technology to simulate information delivery to the astronaut. The primary design objectives were to increase situation awareness (SA), provide timely information to the user and supporting personnel, and facilitate communication among all system elements (user, ground control, and intravehicular astronauts). The design includes a visual interface that provides on-demand information in both egocentric (fixed to the user) and exocentric (fixed to the environment) perspectives. The information includes spacesuit informatics, checklist procedures, communication information, and basic navigation. The design also includes an audio interface that receives verbal commands from the user and provides auditory feedback and information. A novel method of interacting with the augmented reality system was explored: electromyography. Electromyography receives electrical signals from muscle groups on the user’s body and maps them to specific inputs for the augmented reality system. In this way, the user’s hands and voice are free to complete other tasks as necessary, while still maintaining a mode of communication with and control of the device. To aid in communication among all elements, remote display control via telestration (the ability of a remote user, such as ground control or another astronaut, to draw over a still or video image) was included. This provided a means of visual communication to facilitate task completion, aid in emergency situations, and highlight anomalies, thereby increasing user situation awareness and decreasing workload. Additional capability was provided for object-tool recognition and basic navigation assistance.
Preliminary testing highlighted the potential benefits of the following critical design elements: a minimalistic visual display, redundancy of interaction across modalities, and continuity between internal and external display elements.