
The future of teaching and learning has arrived

  • Science Highlights

Purdue University’s Envision Center, part of the Rosen Center for Advanced Computing, has developed a virtual reality platform that utilizes the Meta Quest 3 headsets to bring the future of teaching and learning to our doorstep.

The Envision Center (EC) has worked in the field of virtual reality for quite some time. In fact, they recently celebrated their 20th anniversary. With 20 years of learning and innovation under their belt, it’s no surprise they are at the forefront of advancing the field of interactive education. Introducing “Collab XR,” the newly developed shared learning platform designed to leverage all of the Meta Quest 3 headset’s newest features.

The Collab XR (XR = extended reality) platform is a shared environment that allows anyone in a headset to view and interact with the same virtual content together. Sessions can run fully virtual, where the entire field of vision is virtual, the real environment is obscured, and other participants appear as virtual avatars, or in passthrough augmented reality, where the headset’s cameras feed the real room and people into the view and virtual content is overlaid so that it appears to exist in the same space.

Developed in conjunction with Professor Danny Milisavljevic, the Collab XR platform is the first of its kind, merging virtual reality (VR), augmented reality (AR), and mixed reality (MR) capabilities with training, research, and education. One of the best features of Collab XR is its support for co-located mixed-reality learning experiences. Instead of looking at images in a textbook, or even playing with a 3D model of an ancient artifact, imagine taking a classroom full of students and virtually placing them at the archaeological site. While immersed in the environment, they could interact with archaeological artifacts, making them bigger or smaller and viewing them from any angle they wish, all while having the instructor and other class members right beside them. With this level of data visualization, the ability to reach the students and help them truly understand the data grows by leaps and bounds. In fact, Collab XR has been so successful that it is already being put to use in the classroom.

Dr. Danny Milisavljevic is an Associate Professor of Physics and Astronomy at Purdue University. He uses the Collab XR platform to help explore and share complex 3D data with his students—namely, the remnants of stellar explosions. Milisavljevic likens himself to a “CSI Bomb Tech” but for astrophysics. By looking at what is left behind after a star explodes, he learns about the properties of that star from before its death. Traditionally, the biggest hurdle in conveying this type of data (to other researchers and students alike) stems from the difficulty of mentally visualizing a 3D object based on 2D information. To combat this in the classroom, Milisavljevic first turned to 3D animations but found that this method was also lacking.

“I could create animations that provided some perspective,” says Milisavljevic, “but ultimately, it was ME that decided which angle I wanted to rotate about, and therefore it was me that got the most information out of the animation. When I show it to a student, they aren’t going to have the same spatial comprehension, and that’s largely because they lack the agency to be able to rotate it themselves. I’ve found that if you have the ability to change the angle with your hand or by moving your head around, it allows you to unpack the 3-dimensional properties much more efficiently.”

Milisavljevic took this problem to the Envision Center, and Collab XR was born. Now, Milisavljevic can use the Collab XR platform to bring a room full of students into a co-located AR environment, where they can all explore supernova remnants together in the same virtual classroom. This gives the students the ability to examine the 3D reconstructions on their own terms, while still being able to interact with their professor or classmates. Milisavljevic has used Collab XR to conduct immersive classroom lectures for some of his courses, and the students all love it.

“The response from students has been incredible,” says Milisavljevic. “The excitement and reward of the experience is heightened by exploring content collaboratively, and they report far better comprehension of the multidimensional data as compared to traditional screen or blackboard presentations. This is a game-changing instructional platform.”

Another Purdue professor who is using Collab XR in the classroom is Dr. Robin Tanamachi from the Department of Earth, Atmospheric, and Planetary Sciences. Tanamachi has conducted small-scale lectures for her atmospheric science students using the Collab XR platform. During these lectures, Tanamachi and her students interact with radar observations of storm clouds. Doing so in a co-located XR environment gives her the ability to provide more in-depth insight into meteorological phenomena while allowing the students to better understand their spatial geometry.

“Much of what goes on inside severe thunderstorms is dynamic, three-dimensional, and invisible to the naked eye. Weather radar gives us the ability to probe the interior structure of the storm using a microwave beam. But, looking at those data on a flat screen and trying to reconstruct the full three-dimensional structure of the storm is cognitively challenging. With Collab XR, I’ve been able to let the students interact with the data in a way that isn’t possible on a screen. For example, they can rotate the data, walk through it, look at it from different angles, and enlarge features of interest. Most importantly, they can switch between different radar variables and see the spatial relationships between features. These features and their behaviors may precede hazardous weather near the surface, like large hail and tornadoes.”

Collab XR accomplishes all of this by taking advantage of the Meta Quest 3 headset’s latest features: mixed reality, spatial anchors, shared environments, and multi-user connectivity. Even more impressive, Collab XR can be used for both local and remote instruction. As long as the user has a Meta Quest 3 headset, they can participate in the class from anywhere in the world, receiving the same level of instruction and experience that they would get if they were in the room with the instructor. The best part of Collab XR? There’s no need for individually built applications for each use case.
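Purdue has not published Collab XR’s internals, but the shared-environment model described above — every headset viewing and manipulating the same virtual objects, whether co-located or remote — can be sketched as a central scene state that broadcasts each change to all connected participants. The sketch below is a simplified illustration only; all names (`SharedScene`, `Client`, `Transform`) are hypothetical and are not the actual Collab XR or Meta Quest API.

```python
from dataclasses import dataclass

@dataclass
class Transform:
    # Position and uniform scale of one virtual object in the shared space.
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    scale: float = 1.0

class SharedScene:
    """Authoritative scene state; pushes every change to all clients."""
    def __init__(self):
        self.objects: dict[str, Transform] = {}
        self.clients: list["Client"] = []

    def join(self, client: "Client") -> None:
        self.clients.append(client)
        # Late joiners receive the current state so everyone stays in sync.
        client.local_view = dict(self.objects)

    def update(self, name: str, transform: Transform) -> None:
        self.objects[name] = transform
        for client in self.clients:
            client.local_view[name] = transform

class Client:
    """One headset's local view of the shared scene."""
    def __init__(self, user: str):
        self.user = user
        self.local_view: dict[str, Transform] = {}

    def manipulate(self, scene: SharedScene, name: str, **changes) -> None:
        # e.g. a student pinches to enlarge a model: the change goes to the
        # shared state, not just this client's local view.
        current = scene.objects.get(name, Transform())
        scene.update(name, Transform(**{**current.__dict__, **changes}))

# A professor and a student join the same session.
scene = SharedScene()
prof, student = Client("instructor"), Client("student1")
scene.join(prof)
scene.join(student)

# The instructor enlarges a 3D model; both participants now see it at 2x.
prof.manipulate(scene, "supernova_remnant", scale=2.0)
assert student.local_view["supernova_remnant"].scale == 2.0
```

In a real networked system the broadcast step would travel over the network and contend with latency and conflicting edits, but the core idea — one authoritative shared state mirrored into every participant’s view — is what lets local and remote headsets see the same content.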

“Our platform is more of an ecosystem, not a hyper-specialized one-off,” says George Takahashi, Lead Visualization Scientist at the Envision Center. “As long as we have the content, any class can see it.”

This means that any field or discipline can utilize the Collab XR platform to bring advanced visualization capabilities to the classroom. As long as the instructor has an idea of what they want and the data needed to create the visuals, the Envision Center can get them up and running on the platform in no time.

“We are very excited to have developed our collaborative visualization platform and to have it grow into a functional co-located mixed reality tool,” says Takahashi. “Having the ability to intuitively communicate complex three-dimensional information was previously limited to physical models. Virtual Reality opened the door to bring people together virtually to observe physics-defying large models. By moving to mixed reality and generalizing the models to accommodate data from any domain, this platform brings people together physically and is more scalable and accessible than ever before.”

To get a preview of what the Collab XR platform can do, please visit: https://www.rcac.purdue.edu/envision/xr-lab

Written by: Jonathan Poole, poole43@purdue.edu

Originally posted: