AI Literacy through Embodiment and Co-Creativity

Description

Although artificial intelligence (AI) is playing an increasingly large role in mediating human activities, most education about what AI is and how it works is restricted to computer science courses. This research is a collaboration between the TILES lab, the Expressive Machinery Lab (Dr. Brian Magerko, Georgia Tech), and the Creative Interfaces Research + Design Studio (Dr. Duri Long, Northwestern University) to create a set of museum exhibits aimed at teaching fundamental AI concepts to the public. In particular, we aim to reach middle school girls and students from groups underrepresented in computer science.

Creature Features asks learners to build their own training dataset to teach an AI what a bird is. After choosing examples and non-examples, learners receive feedback on how well their AI is able to identify new birds.
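To make the underlying concept concrete, here is a minimal sketch (not the exhibit's actual implementation) of training a classifier on learner-chosen examples and non-examples and then checking it against new items. The feature names, data values, and nearest-centroid model are all illustrative assumptions.

```python
# Toy illustration of the "examples and non-examples" idea behind Creature Features.
# The features, data, and nearest-centroid model are invented for illustration.

# Each item is a feature vector (has_feathers, has_beak, can_fly), with 1 = yes, 0 = no.
training_set = [
    ((1, 1, 1), True),   # robin    -> example of a bird
    ((1, 1, 0), True),   # penguin  -> example of a bird
    ((0, 0, 1), False),  # bat      -> non-example
    ((0, 1, 0), False),  # platypus -> non-example
]

def centroid(vectors):
    """Average each feature across a list of vectors."""
    return tuple(sum(v[i] for v in vectors) / len(vectors) for i in range(len(vectors[0])))

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# "Training": summarize the chosen examples and non-examples as two centroids.
bird_centroid = centroid([v for v, is_bird in training_set if is_bird])
other_centroid = centroid([v for v, is_bird in training_set if not is_bird])

def classify(vector):
    """Label a new item as a bird if it is closer to the bird centroid."""
    return distance(vector, bird_centroid) < distance(vector, other_centroid)

# "Feedback": see how well the trained model labels items the learner never chose.
new_items = [((1, 1, 1), True), ((1, 1, 0), True), ((0, 0, 0), False)]
correct = sum(classify(v) == label for v, label in new_items)
print(f"Correctly identified {correct} of {len(new_items)} new items")
```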

This 4-year project is funded by the NSF Advancing Informal STEM Learning (AISL) program (NSF DRL #2214463). We are collaborating with the Museum of Science and Industry in Chicago to conduct focus groups, needs assessments, and pilot testing of exhibit designs based on our prior work.

Knowledge Net asks learners to use tokens to create a network that powers a chatbot. After uploading an image of their network to a custom website, learners can ask the chatbot questions.
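As a rough illustration of how a token network can drive question answering, the sketch below encodes a small network as (subject, relation, object) edges and looks up answers from it. The tokens, relations, and question format are invented, and parsing the learner's uploaded photo into this structure is not shown.

```python
# Toy illustration of the idea behind Knowledge Net: a chatbot that answers
# questions by searching a small network of tokens. The tokens, relations, and
# question format are invented; parsing the learner's uploaded photo into this
# structure is not shown.

# The learner's network, encoded as (subject, relation, object) edges.
network = [
    ("dog", "is a", "pet"),
    ("dog", "likes", "bones"),
    ("cat", "is a", "pet"),
    ("pet", "lives in", "house"),
]

def answer(subject, relation):
    """Answer 'What does <subject> <relation>?' by searching the network's edges."""
    matches = [obj for s, r, obj in network if s == subject and r == relation]
    if matches:
        return f"{subject} {relation} {' and '.join(matches)}."
    return f"I don't know what {subject} {relation}."

print(answer("dog", "likes"))  # -> dog likes bones.
print(answer("cat", "is a"))   # -> cat is a pet.
print(answer("cat", "likes"))  # -> I don't know what cat likes.
```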

This research will explore how embodiment and co-creativity can help learners make sense of and engage with AI concepts.

FossilVR: Supporting literacy skills through science

A Virtual Reality space for teaching 3rd grade literacy skills

Description

Students are expected to learn how to make inferences in the third grade, but few high-quality resources are available to help them master this skill. It is frequently practiced by giving students a picture and asking them to infer something about it (for example, from a photo of children standing in front of bicycles at a beach, a student could infer that it's summertime, that the children are siblings, that they rode their bikes to the beach, etc.). Teachers evaluate these inferences based on whether they are plausible, but students are often left producing a set of disconnected inferences. FossilVR is a novel virtual environment that grounds the skill of making inferences in an authentic context: a paleontological fossil dig.

Students travel through the virtual environment with Dr. Hannah, the lead paleontologist at the site, dig up fossils, and record observations and inferences about them in a field notebook. The notebook contains scaffolds that guide noticing and help students construct an argument about the characteristics of the specimen. We hypothesize that this system will increase the quality of inferences made, support argumentation skills, and create a more enjoyable learning experience compared with traditional methods.

Publications

  • Roberts, J., & Leinart, K. (2022). How Big was a Triceratops, Really? Using Augmented Reality to Support Collaborative Reasoning about Scale. In Tissenbaum et al., Learning at the intersection of physical spaces and technology. Symposium conducted at the 2022 International Conference on Computer Supported Collaborative Learning (CSCL), Hiroshima, Japan and online.
Video introduction presented at CSCL Physical Spaces Symposium

Demo Video

Research Team

Morgan Chin, MS-HCI 2021

Kyle Leinart, MS-HCI

Anirudh Mukherjee, MS-CS

Jessica Roberts, advisor

Blair MacIntyre, collaborator

Molly Porter (NHMLAC), collaborator