How Emerging Technologies are Crafting New Worlds and Revolutionizing Learning

Laurie Robinson - March 6, 2024

Research with KidsTeam and local neighborhoods examines how youth use emerging technologies like AI and VR

People learning with virtual reality headsets.

Photo by Lucrezia Carnelos on Unsplash

The University of Maryland (UMD) College of Information Studies (INFO) Associate Professor Tamara Clegg, who is an affiliate faculty member at the College of Education (EDUC), INFO Assistant Research Professor Beth Bonsignore, and other collaborators are working on a Meta-funded project to better understand how young people are using emerging technologies known as “XR,” or extended reality, which encompasses a range of systems and experiences that include virtual reality (VR), augmented reality (AR), and mixed reality (MR). Many of these XR technologies may integrate AI as well. For example, in an immersive XR environment, youth might interact with AR representations of AI-based agents like Alexa or Siri.

A core goal for the Meta-supported project is to develop foundational ethical principles for engaging in XR research with youth and families. For instance, how do we ensure that all youth have access to these immersive emerging technologies, and how do we protect the privacy of their experiences while also mitigating possible risks (e.g., does “VR motion sickness” affect children differently than adults)?

The researchers have started working with KidsTeam, an interdisciplinary group that is a part of the UMD Human-Computer Interaction Lab (HCIL), to explore how children as young as 7-8 years old interact with VR headsets typically used by adults. While continuing with KidsTeam, the researchers are also planning summer co-design sessions with youth in local communities. The HCIL has been a joint partnership between the University of Maryland Institute for Advanced Computer Studies (UMIACS) and INFO for four decades, with KidsTeam being an integral co-design subgroup within HCIL for 25 years.

The Meta XR project stands out for its engagement with a diverse cohort of youth. This project, known as the “XR for Youth Ethics Consortium,” comprises seven research universities across the United States. The lead institution is the University of Iowa, and UMD is one of six partner institutions, along with Boise State University, Northeastern University, the University of Baltimore, the University of Minnesota, and the University of Washington. While consortium members like the University of Iowa, Boise State, and the University of Minnesota may explore XR with rural communities, UMD and the University of Baltimore are uniquely positioned to partner with minoritized and under-resourced communities in the urban and inner-ring suburban neighborhoods around DC and Baltimore and in Prince George’s County. Northeastern will focus on neurodiverse youth and their families and communities. With such a broad consortium of groups contributing, the project reflects a commitment to diversity. Meta’s funding of the project underscores the company’s interest in inclusive research that can inform not only the effective design and development of its tools, but also the ethical principles by which it designs and researches these technologies overall.

Visualizing a New Landscape 

VR headsets are reshaping our digital existence. They serve as windows to immersive panoramas, bridging the void between the tangible and the imagined. The headset is an extension of our senses—an architect of visual and auditory illusions, synthesizing landscapes ranging from the hyper-realistic to the surreally fantastical. Wearers look around, and the environment responds in real time; they reach out, and their digital avatars mirror their movements with delicate precision.

The educational potential of VR is vast. Historical recreations within VR can transport students back in time to witness events first-hand, while scientific simulations can unravel the complexities of human anatomy or take learners on a tour of the solar system. These experiences have the potential to revolutionize teaching methodologies, bringing abstract concepts to life and making the learning process a vivid and interactive journey.

“People can use them for all types of things–learning environments, simulations, games–and so we’re trying to figure out some of the ways they might be used with young people. What are some of the ethical issues involved with that?” says Clegg. “We’re helping kids understand what these technologies can do. We’ll give youth and communities an opportunity to play around with the tools and explore them. We will have them co-design ways they want to use them. Eventually, it could get to what kind of new technologies they want to design. One of our focus areas is on understanding what new uses and experiences they want to have with some of these tools.”

“Relatedly, another goal for our project is to investigate what ethical concerns, challenges, or opportunities youth imagine might come along with these new tools and immersive experiences, such as limited access to some children over others, or inadvertently integrating human bias or hidden controls into the systems,” says Bonsignore.

The Limitations of Emerging Technologies

Social scientists and technologists debate the potential of VR to cause isolation. In a society where virtual spaces can sometimes eclipse reality, striking a balance between the allure of virtual worlds and the grounding force of physical interactions becomes critical.

Debates around generative AI are similar. When considering these tools for educational purposes, it is crucial to recognize both their potential to foster engagement and the possible constraints they may introduce. For instance, children using generative AI for art creation learn to instruct the system on crafting a desired image or photo. This requires them to creatively engage with the technology. However, the research team is considering the implications of such technology on traditional art creation: what creative experiences might children miss out on, and what imaginative opportunities could be lost when AI does the visualizing for them?

Emerging technologies are changing the social and cognitive landscape for children, prompting a reevaluation of educational needs. “From a learning perspective, it’s causing us to rethink what kids need to learn. What do you need to learn to be literate in society now? I think those things are shifting and changing,” says Clegg. 

Youth need new skills for proficiency with tools like generative AI and VR. As these technologies integrate into learning, it’s important to consider what may become obsolete. How might traditional modalities such as pen-and-paper tasks remain relevant? The challenge is determining which skills should be retained and how to blend them with modern technological capabilities.

Preparing Learners to be Digital Citizens

Another challenge emerging technologies pose is their infringement on the privacy and security of children’s data. In daily life, the division between online and offline realms has become indistinguishable for youth. This seamless existence raises critical questions about their online privacy and data generation that Clegg and other researchers, including INFO Associate Professor Jessica Vitak and PhD student Elana Blinder, have been looking into through an ongoing National Science Foundation-funded project. 

Technology has become a dominant force in education, drawing children into virtual environments that, while innovative, create vast pools of data. These data points, captured by schools and the companies behind these digital tools, hold valuable insights into learning patterns and behaviors. However, they also attract the interest of tech corporations, which have a history of leveraging personal data for various purposes–some unwelcome.

In this age of ubiquitous data, it’s vital to ask: what type of data should be collected from learners? The importance of involving caregivers, educators, and learners themselves in this equation cannot be overstated. The responsibility of caregivers in the realm of privacy and security extends beyond mere protection—it’s about equipping youth with the knowledge and autonomy to influence their digital footprint.

The researchers’ findings suggest that while children place substantial trust in their parents and teachers regarding social dilemmas encountered on digital platforms, they are wary of sharing personal information with companies and gaming sites. To bring the concepts of privacy and security to the forefront of a child’s consciousness, the researchers suggest introducing them to real-world scenarios that accentuate the importance of these issues.

Traditionally, the decision-making power regarding a child’s data has been entrusted to adults. While this offers a safeguard, it simultaneously strips children of their agency, preventing them from developing informed self-governance over their data. The goal is not to eliminate parental oversight but to foster an environment where children can learn to make judicious decisions about their online presence.

The research teams from both projects advocate for a balanced approach, one that preserves the trust children have in their caregivers while encouraging them to cultivate an understanding of their digital rights. Adults must prepare children to navigate the complexities of technology with both confidence and caution. In doing so, they empower them to become savvy digital citizens who understand the permanence of their online actions and the significance of their digital identities.