Although artificial intelligence (AI) is playing an increasingly large role in mediating human activities, most education about what AI is and how it works is restricted to computer science courses. This research is a collaboration between the TILES lab, the Expressive Machinery Lab (Dr. Brian Magerko, Georgia Tech), and the Creative Interfaces Research + Design Studio (Dr. Duri Long, Northwestern University) to create a set of museum exhibits aimed at teaching fundamental AI concepts to the public. In particular, we aim to reach middle school girls and students from groups who are underrepresented in computer science.
This 4-year project is funded by the NSF Advancing Informal STEM Learning (AISL) program (NSF DRL #2214463). We are collaborating with the Museum of Science and Industry in Chicago to conduct focus groups, needs assessments, and pilot testing of exhibit designs based on our prior work.
This research will explore how embodiment and co-creativity can help learners make sense of and engage with AI concepts.
Students are expected to learn how to make inferences in the third grade, but few high-quality resources are available to help them master this skill. The skill is frequently practiced by showing students a picture and asking them to infer something about it (for example, from a photo of children standing in front of bicycles at a beach, a student could infer that it is summertime, that the children are siblings, or that they rode their bikes to the beach). Teachers evaluate the inferences based on whether or not they are plausible, but students are often left to produce a set of disconnected inferences. FossilVR is a novel virtual environment that grounds the skill of making inferences in an authentic context: a paleontological fossil dig.
Students travel through the virtual environment with Dr. Hannah, the lead paleontologist at the site, and dig up fossils, about which they are then asked to record observations and inferences in a field notebook. The notebook contains scaffolds that guide noticing and help students construct an argument about the characteristics of the specimen. We hypothesize that this system will increase the quality of inferences made, support argumentation skills, and create a more enjoyable learning experience compared with traditional methods.
Roberts, J., & Leinart, K. (2022). How Big Was a Triceratops, Really? Using Augmented Reality to Support Collaborative Reasoning about Scale. In Tissenbaum et al., Learning at the intersection of physical spaces and technology. Symposium conducted at the 2022 International Conference on Computer Supported Collaborative Learning (CSCL), Hiroshima, Japan and online.