Projects I’ve worked on

Inclusively-Designed Sensory Extensions for STEM Inquiry Learning

In collaboration with:

Emily B. Moore (P.I.), Director of Research and Accessibility, PhET Interactive Simulations

Ann Eisenberg (Co-P.I.), Director of the Craft Technology Lab

PhET Interactive Simulations


NSF award number: 2119303

This project investigates innovative sensory extension device technologies to create learning materials that are accessible and enable diverse learners to use multiple modalities in science and mathematics learning. The new technologies will be designed, crafted, customized, and personalized by STEM learners with diverse needs.

Humans think and communicate using multiple sensory modalities, including sight, sound, gesture, movement, and touch. Most science and mathematics learning materials convey information visually, with displays such as diagrams and simulations, resulting in learning experiences with limited sensory engagement. For many learners with visual or print-related disabilities, visual learning materials are inaccessible; even for others, these materials constrain learning opportunities.

Sensory extension incorporates materials and devices to enable new or enhanced perception of real or virtual environments. Familiar sensory extension devices include eyeglasses (refractive lenses that improve vision) and Geiger counters (auditory perception of radioactive decay).

In an inclusive co-design process, the project team will partner with diverse members of the learning community to co-design flexible, adaptive, and personalizable technologies that enable new sensory experiences (e.g., sound, gesture, movement, and touch) to augment popular and widely used interactive simulation learning tools. The project team brings together experts in educational technologies at the University of Colorado Boulder (PhET Interactive Simulations and the Craft Tech Lab) and partner organizations serving youth with learning disabilities and visual impairments.


In collaboration with:
Mary Etta West, University of Colorado Boulder – Computer Science
Netta Ofer, University of Colorado Boulder – ATLAS Institute
Casey Hunt, University of Colorado Boulder – ATLAS Institute
Sandra Bae, University of Colorado Boulder – ATLAS Institute

Cyborg Crafts is a student-organized collaborative group of five graduate students at the University of Colorado Boulder’s ATLAS Institute. Our group explores both what’s on our minds and what’s in our bodies, an approach that challenges us to explore and integrate what are often seen as opposing themes: synthetic/organic, research/maker, and craft/engineering. Cyborg Crafts researchers are committed to developing open-source, accessible projects that promote human augmentation. We take a craft approach to design, focusing on easily sourced materials and replicable documentation. The result is hackable projects that can be enjoyed by both novice and experienced engineers.


Collaborative Research: Debugging by Design: Developing a Tool Set for Debugging with Electronic Textiles to Promote Computational and Engineering Thinking in High School

CU Boulder PIs: Ann Eisenberg / Mark Gross
CU Boulder research team: Michael J. Schneider, Christian Hill, Arielle Blum, Ethan Frier, Rona Sadan

In collaboration with:
Yasmin Kafai, University of Pennsylvania
Deborah Fields, Utah State University

NSF award number: 1742081

We are developing hardware and software tools to assist in the process of “debugging” e-textile circuits. E-textile debugging presents a unique set of constraints due to the flexible, fabric-based nature of project materials. In collaboration with teams headed by Yasmin Kafai (University of Pennsylvania) and Deborah Fields (Utah State University), we study how students use these tools to debug their textile projects, their cognitive models of e-textile troubleshooting, and the educational implications of these activities.

Pet Project

NSF award number: 1736051

Pet Project will design, implement, and study a series of inquiry-rich activities in which students’ curiosities, and their affection for their pets, lead them to design and conduct investigations into pet behavior and biology. Students will make heavy use of physical computing, including a new genre of educational technology: wearable devices for trans-species sensation, such as modified headphones combined with augmented reality headsets that enable students to hear sounds and see colors like a dog.

Goals for the project:
Engage students in data science, physical computing, writing code, and other STEM fields.
Enable students to better empathize with, and care for, their pets.
Invite more students into STEM fields by engaging them in a fun and relevant science project that frequently touches on the topic of empathy, countering stereotypical ideas of science as emotionally remote and joyless.
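To give a flavor of the “see colors like a dog” idea: dogs are dichromats and cannot distinguish red from green, so a display can approximate their color vision by collapsing the red–green axis of each pixel toward yellow. The sketch below is purely illustrative; the mixing weights and the `dog_vision` helper are assumptions for demonstration, not the project’s actual implementation or a calibrated vision model.

```python
# Hypothetical sketch: approximate dichromatic (dog-like) color vision
# by merging the red and green channels that dogs cannot tell apart.
# The 0.6/0.4 mixing weights are illustrative, not a calibrated model.

def dog_vision(rgb):
    """Map an (r, g, b) pixel to a rough dog-vision equivalent."""
    r, g, b = rgb
    yellow = 0.6 * r + 0.4 * g  # red and green collapse toward yellow
    return (round(yellow), round(yellow), b)

print(dog_vision((255, 0, 0)))  # → (153, 153, 0): pure red reads as dull yellow
print(dog_vision((0, 0, 255)))  # → (0, 0, 255): blue is preserved
```

Applied per pixel to a camera feed in an AR headset, a transform like this would let a student experience a red toy and a green lawn as the same yellowish hue, the way their dog plausibly does.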


NASA SUITS (Spacesuit User Interface Technologies for Students) challenges students to design and create spacesuit information displays within augmented reality (AR) environments. As NASA pursues the Artemis program, which aims to land American astronauts on the Moon by 2024, the agency will accelerate investment in surface architecture and technology development. For exploration, it is essential that crewmembers on spacewalks are equipped with the human-autonomy-enabling technologies necessary for the elevated demands of lunar surface exploration and extreme terrestrial access. The SUITS 2020 challenges target key aspects of the Artemis mission.