Although artificial intelligence (AI) is playing an increasingly large role in mediating human activities, most education about what AI is and how it works is restricted to computer science courses. This research is a collaboration between the TILES lab, the Expressive Machinery Lab (Dr. Brian Magerko, Georgia Tech), and the Creative Interfaces Research + Design Studio (Dr. Duri Long, Northwestern University) to create a set of museum exhibits aimed at teaching fundamental AI concepts to the public. In particular, we aim to reach middle school girls and students from groups who are underrepresented in computer science.
This 4-year project is funded by the NSF Advancing Informal STEM Learning (AISL) program (NSF DRL #2214463). We are collaborating with the Museum of Science and Industry in Chicago to conduct focus groups, needs assessments, and pilot testing of exhibit designs based on our prior work.
This research will explore how embodiment and co-creativity can help learners make sense of and engage with AI concepts.
Our project, Technology for Acquiring Language through Engagement (TALE), focuses on designing a gamified platform to help self-learners, specifically non-native English speakers, acquire foreign language skills.
Research question: How might we help intermediate to advanced English learners get more practice speaking their target language with others?
Motivations: Self-learners encounter a number of challenges when attempting to learn a language, including:
Lack of depth in existing free learning materials
Disconnect from native speakers and culture
Struggles with loss of interest or feelings of burnout over time
Difficulties practicing natural conversations
Why gamification? Research has shown that playful environments and gamified approaches can be powerful tools for engaging learners.
Accessible Oceans: Exploring Ocean Data Through Sound is a Pilots and Feasibility Studies project in the NSF’s Advancing Informal STEM Learning program (NSF AISL #2115751). The interdisciplinary project team is exploring how oceanographic data from the Ocean Observatories Initiative can be sonified and made interactive in a museum exhibit tailored to visually impaired learners.
The project is a collaboration between the Woods Hole Oceanographic Institution (WHOI), Your Ocean Consulting, the University of Oregon, and Georgia Tech.
Roberts, J., Lowy, R., Li, H., Bellona, J., Smith, L., and Bower, J. (2023). Breaking Down the Visual Barrier: Designing Data Interactions for the Visually Impaired in Informal Learning Settings. Paper presented at the International Conference on Computer-Supported Collaborative Learning (CSCL 2023), Montreal, Quebec, Canada. Read the paper here. View our lit review dataset here.
Bower, A., J. Bellona, J. Roberts, L. Smith, 2023. Accessible Oceans: Exploring Ocean Data Through Sound, presented at the 2023 Sonification World Chat Meeting, 23 May. Online.
Bower, A., J. Bellona, J. Roberts, L. Smith, 2022. Accessible Oceans: Exploring Ocean Data Through Sound (ED12A-03), presented at 2022 AGU Fall Meeting, 12-16 December.
Braun, R., L. Karlstrom, A.S. Bower, M.O. Archer, 2022. Listening to Our World: Sonification Applications in Research, Education, and Outreach Town Hall (TH13H), presented at 2022 AGU Fall Meeting, 12-16 December.
Dr. Amy Bower, WHOI [PI]
Dr. Jon Bellona, U Oregon [co-PI]
Dr. Jessica Roberts, Georgia Tech [co-PI]
Dr. Leslie Smith, Your Ocean Consulting, LLC [key personnel]
Together with my colleague and friend Francesco Cafaro (IUPUI), I am very excited to announce the release of our new book, Data Through Movement: Designing Embodied Human-Data Interaction for Informal Learning. https://doi.org/10.2200/S01104ED1V01Y202105VIS013.
This project has been a long time in the making, beginning with our initial collaboration on the CoCensus project over 10 years ago! This volume, part of the Morgan & Claypool Synthesis Lectures on Visualization, provides an overview of theoretical foundations of embodiment and discusses empirical findings from multiple human-data interaction (HDI) projects in museums.
You can find more information and details about where to get it here.
Art is an important factor in child development. Research has highlighted art education’s role in children’s acquisition of the economic, cultural, and civic capital required to sustain a community’s cultural resources. For K-2 learners, art education also contributes to the development of fine motor skills, cognition, and interpersonal relationships. The incorporation of art museum visits into school curriculum is one of the ways students can have repeated, sustained engagement with art. Recognizing this, many art museums provide digital resources to support their integration into classrooms, but little research investigates how classrooms use these resources. Additionally, little research investigates technology designs that support the interactions and needs of K-2 learners, teachers, art museum educators, and docents in fully remote art education settings.
This project uncovers key implications and design requirements for developing effective, remote art education environments for K-2 learners and educators. From these requirements we made novel, instrumented tangible tools that can create beneficial learning opportunities where K-2 learners can practice fine motor skills and age-appropriate art principles. This project also studies how the integration of these tools into virtual environments can support K-2 learners in remote settings.
Understanding K-2 Remote Art Education Needs
Repeated, meaningful art education experiences for young children are often coordinated across school settings, which support daily art encounter opportunities, and museums, which provide more isolated encounters with artworks. To understand the needs of classroom and museum educators in remote K-2 settings, we conducted a needs-assessment survey and interviews. We developed three sets of design requirements covering their needs. We also developed a novel typology of existing art education platforms, identifying where educators’ needs are and are not met. This project is described in our 2022 Interaction Design and Children (IDC) paper: Ready, Set, Art: Technology Needs and Tools for Remote K-2 Art Education
To satisfy young learners’ need for appropriate feedback as they practice fine motor skills in remote environments, we created the Chameleon Clippers. This low-cost instrumentation of classic school scissors uses line sensors and a custom-built Processing application to alert users when they deviate from the line they are attempting to cut.
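The deviation-alert idea can be sketched roughly as follows. This is a minimal illustration, not the actual Processing implementation: the reflectance values, calibration numbers, and function names are all assumptions for the sake of the example.

```python
# Hypothetical sketch of a line-deviation check (illustrative only; the real
# Chameleon Clippers app is built in Processing with physical line sensors).

def on_line(reading, line_value=200, tolerance=50):
    """Return True if a sensor reading falls within tolerance of the
    calibrated reflectance value of the printed line."""
    return abs(reading - line_value) <= tolerance

def check_cut(readings, line_value=200, tolerance=50):
    """Return the indices of readings where the cutter left the line,
    i.e., the moments when the app would alert the user."""
    return [i for i, r in enumerate(readings)
            if not on_line(r, line_value, tolerance)]
```

In a live system, readings would stream in from the sensor and the alert would fire immediately; collecting deviation indices here just makes the logic easy to inspect.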
This project is described in our 2022 Computer-Supported Collaborative Learning (CSCL) paper:
Mansi, G., Boone, A., Kim, S. & Roberts, J. (2022). Chameleon Clippers: A Tool for Developing Fine Motor Skills in Remote Education Settings. In Proceedings 2022 International Conference on Computer Supported Collaborative Learning (CSCL). Hiroshima, Japan and online. Best design paper nominee
Increasing shifts to online and remote education in recent years — greatly augmented by the Covid-19 pandemic — have created a new challenge for museum-based art educators: How can young children have impactful art engagement experiences on remote museum tours?
In this project, I explore K-2 art educators’ pedagogical needs in facilitating remote art tours through co-design, and offer a technological solution, Play and Learn in Virtual Museum (PLVM). PLVM (pronounced “Plum”) is a web-based digital platform that supports K-2 art educators and students with the following:
Simple technology setup for students (3-point setup)
Ability to look deeply at art using different interaction models such as discussion, drag-and-drop, pointing, multiple choice, and storytelling
Integration with existing educational tools and different media, such as 360° views, YouTube, Google Slides, and images from the museum collection database
Sharable hands-on activities
Texture sounds for 3D elements
The platform offers both moderated and unmoderated versions to make it more accessible for art educators with limited resources.
Transgender and nonbinary (trans/nonbinary) college students face unique challenges as members of Greek Life organizations, as their identities contradict the heteronormative culture of most Greek communities. Despite this tension, trans/nonbinary students exist within Greek communities, and many Greek organizations have been making efforts to include these members. Institutional-level changes, however, do little to prepare individual Greek students to be inclusive of their trans/nonbinary peers. Because GT Greek Life does not mandate LGBTQ+ education, cisgender students struggle to bridge this gap in knowledge, while transgender students are expected to act as educators. Crossroads is an educational mobile application designed for cisgender Greek students. It teaches users about LGBTQ+ concepts, specifically focusing on trans/nonbinary issues within Greek communities.
The Crossroads application has four features: learning modules, community messaging, a glossary, and external resources. In our proof-of-concept prototype, we designed and tested two modules, one focused on basic LGBTQ+ terminology and concepts and a second that homed in on trans/nonbinary issues. These modules are followed by ‘daily challenges’ that use a spaced repetition model to support long-term learning. We hypothesize that this design will give cisgender Greek students an accessible LGBTQ+ learning environment that integrates easily into a college student’s schedule and acts as an alternative to relying on trans/nonbinary students for education.
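One common way to implement the kind of spaced-repetition scheduling behind daily challenges is a Leitner-style box system: items move up a box (longer review interval) when answered correctly and reset on a miss. The sketch below is illustrative only; the exact model, intervals, and names used by Crossroads are not specified here and are assumptions.

```python
# Minimal Leitner-style spaced-repetition scheduler (illustrative sketch;
# not the Crossroads implementation). Each box maps to a review interval
# in days, so well-learned items resurface less often.

INTERVALS = [1, 2, 4, 7, 14]  # hypothetical review intervals per box

def review(box, correct):
    """Return the item's new box after a review: advance on a correct
    answer (capped at the last box), reset to box 0 on a miss."""
    if correct:
        return min(box + 1, len(INTERVALS) - 1)
    return 0

def next_review_in(box):
    """Days until an item in this box should be shown again."""
    return INTERVALS[box]
```

Keeping the scheduler this small makes it easy to tune the intervals against observed retention in pilot testing.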
Stephanie Baione, Yiming Lyu, Audrey Reinert & Jessica Roberts (2022) Crossroads: a transgender education platform for Greek life students, Journal of LGBT Youth, DOI: 10.1080/19361653.2022.2070813 Please contact me if you would like to read the article but do not have access and I will be happy to send you an access link.
When sulfur oxides (SOx) are emitted from power plant facilities, they do not fall directly to the ground; they are carried by air currents, sometimes over great distances. Modeling the atmospheric transport and dispersion of these particles can estimate fine particulate matter (PM2.5) source impacts attributable to SOx emissions from each of the more than 1,200 coal-fired electricity generating units in operation in the United States between 1999 and 2018.
The Coal Pollution Impacts Explorer (C-PIE) is a web-based interface designed to visualize and scaffold atmospheric data and modeling for a public audience. Users can investigate the sources of pollution in their home county’s air, examine where pollution from a nearby facility disperses, and explore trends over time as facilities install pollution-mitigating scrubbers in response to legislative actions.
Research on the C-PIE platform investigates how data interactions can be scaffolded to support inquiry and engagement for public audiences.
Congratulations to Gennie Mansi for her virtual poster at today’s CSCL poster session at ISLS2021!
Embodiment and Social Interactions in a Class Virtual Reality Poster Session
Gennie Mansi, Blair MacIntyre, & Jessica Roberts
Abstract: There is a growing enthusiasm to use VR to improve remote student learning experiences. However, incongruities between students’ virtual embodiment – as avatars in the virtual environment – and physical embodiment – from their biological bodies – can significantly impact them. We observed a university class poster session held entirely in an immersive Mozilla Hubs environment. We found that incongruities in embodiment created both challenges and significant opportunities for students to collaborate and learn in a shared space.
Students are expected to learn how to make inferences in the third grade, but few high-quality resources are available to help students master this skill. Frequently it is practiced by giving students pictures and asking them to infer something about the picture (for example, from a photo of children standing in front of bicycles at a beach, a student could infer that it’s summertime, that they are siblings, that they rode their bikes to the beach, etc.). Teachers evaluate the inferences based on whether or not they are plausible, but often students are left to make up a set of disconnected inferences. FossilVR is a novel virtual environment that grounds the skill of making inferences in an authentic context: a paleontological fossil dig.
Students travel through the virtual environment with Dr. Hannah, the lead paleontologist at the site, and dig up fossils, about which they are then asked to make observations and inferences in their field notebook. The notebook contains scaffolds to guide noticing to help students create an argument about the characteristics of the specimen. We hypothesize that this system will increase the quality of inferences made, support argumentation skills, and create a more enjoyable learning experience compared with traditional methods.
Roberts, J., & Leinart, K. (2022) How Big was a Triceratops, Really? Using Augmented Reality to Support Collaborative Reasoning about Scale. In Tissenbaum et al. Learning at the intersection of physical spaces and technology. Symposium to be conducted at the 2022 International Conference on Computer Supported Collaborative Learning (CSCL). Hiroshima, Japan and online.