Realization Lab

3 sample projects for learning technologies

Research and development of technologies for learning

Sample work produced by students enrolled in LSRC 594, a graduate course offered jointly by the University of Illinois at Chicago’s (UIC’s) Learning Sciences Research Institute (LSRI) and UAM-Azcapotzalco’s Graduate Program in Design.

Projects designed for the course webpage

The Realization Lab in the Learning Sciences Research Institute aims to promote the research and development of learning technologies for formal and informal spaces such as schools and museums.
  • About the lab

    Asking questions is one of the best ways to practice a curious mindset: questions that challenge assumptions, inspire others, open up a broader context, and prompt reflection.

    What can we do?

    • Identify technological needs by thinking and questioning
    • Build prototypes to learn and teach
    • Build opportunities to engage learners

    • Research XR technologies: AR/VR, visualizations/large-scale displays, tangible technologies, embedded technologies, eye tracking, and EEG

  • Who can be involved

    Any UIC faculty, staff or student interested in learning about and working with these technologies can take advantage of the resources of the Realization Lab.

    Additionally, we’re actively seeking collaborations with individuals from other institutions of higher learning, museums, corporate settings, and nonprofits. Contact Brenda Lopez Silva for more information.

  • Current technologies

    We have a range of technologies ready for developing and researching the design of learning environments in formal and informal settings.

    We currently work with schools to develop AR, VR, and XR curricular materials that engage middle school students in problem-based projects scaffolding STEM concepts.

    Currently our lab is equipped with:

    SAGE2 visualizations / large-scale displays

    Tangible technologies such as Vive trackers, Arduinos, and Estimote beacons

    Embedded technologies

    Eye tracking

    VR/AR/XR

Learning Technologies

BUTTERFLY
Research on XR
This project explores how HCI expands the use of space by combining the physical and virtual interactions needed to accommodate learning in virtual experiences. Learners immerse themselves in a butterfly sanctuary to explore the migration patterns and lifecycles of monarch butterflies. Project demonstration at ACM SUI 2020: https://dl.acm.org/doi/10.1145/3385959.3421720
SAGE2
Large adaptable graphics environments
Current web-based collaboration systems, such as Google Hangouts, WebEx, and Skype, primarily enable single users to work with remote collaborators through video conferencing and desktop mirroring. SAGE2 (Scalable Amplified Group Environment), by contrast, lets groups of co-located and remote users share and arrange content together on large-scale tiled display walls.
Temi
This is a project led by faculty member Joseph Michaelis. His research sits at the nexus of the learning sciences and human-computer interaction (HCI).
Embedded Phenomena
'Embedded phenomena' is a learning technology framework in which simulated scientific phenomena are mapped onto the physical space of classrooms. Students monitor and control the local state of the simulation through distributed media positioned around the room, gathering and aggregating evidence to solve problems or answer questions related to those phenomena. Embedded phenomena are persistent, running continuously over weeks and months, creating information channels that are temporally and physically interleaved with, but asynchronous with respect to, the regular flow of instruction. The associated paper describes the motivations for the framework, reports classroom experiences with three embedded phenomena in the domains of seismology, insect ecology, and astronomy, and situates embedded phenomena within the context of human-computer interaction research in co-located group interfaces and learning technologies.
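The core idea above can be sketched in a few lines of code. This is a hypothetical toy illustration, not the lab's actual software: a persistent simulation (here, a seismology-style phenomenon) advances on its own, and "stations" placed at fixed positions around the room each read only their local signal, so students must aggregate readings from several stations to reason about the whole phenomenon.

```python
import math
import random

class SeismicPhenomenon:
    """Toy persistent simulation: quakes occur at random epicenters over
    time, and each classroom station reads only its local signal."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.events = []  # list of (x, y, magnitude) tuples

    def step(self):
        # Advance the simulation one tick; occasionally a quake occurs.
        if self.rng.random() < 0.3:
            self.events.append((self.rng.uniform(0, 10),
                                self.rng.uniform(0, 10),
                                self.rng.uniform(1, 5)))

    def read_station(self, x, y):
        # A station sees each event attenuated by distance, so no single
        # station's reading reveals where the quakes happened.
        total = 0.0
        for ex, ey, mag in self.events:
            total += mag / (1.0 + math.hypot(x - ex, y - ey))
        return total

# The phenomenon runs continuously, independent of instruction time.
phenomenon = SeismicPhenomenon(seed=42)
for _ in range(100):
    phenomenon.step()

# Stations at the four corners of the room sample local state.
stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
readings = [phenomenon.read_station(x, y) for x, y in stations]
```

Because each station's reading is distance-attenuated, comparing the four readings lets students triangulate epicenters, mirroring how the framework asks learners to gather and aggregate distributed evidence.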

Here’s a demonstration of using VR to enhance student understanding of the challenges faced by monarch butterflies as they make their way to Mexico in migration season.