As AR/VR technology continues to improve and as universities experiment with the “metaversity” concept to provide experiential digital lessons within extended reality (XR) environments, higher education institutions will need to carefully consider the data privacy and security implications that could come with mass adoption of VR technologies.
These concerns were the focus of a recent webinar at this month's virtual Educause Annual Conference, led by Richard LaFosse, compliance and policy lead for academic innovation at the University of Michigan, and Didier Contis, executive director of academic technology, innovation and research computing at the Georgia Institute of Technology. The two offered an overview of the security and ethical challenges of XR adoption both on and off campus.
According to LaFosse, AR/VR technology’s scanning capabilities could raise privacy concerns similar to those associated with virtual room scans for remote test proctoring. As an example, he cited a recent ruling from the U.S. District Court for the Northern District of Ohio, which found that “room scans” have the potential to violate students’ constitutional right to privacy when required by a public university.
“XR technologies such as VR headsets far exceed webcams in terms of data capture concerning one’s surroundings, so it’s easy to see how similar violations might be found where privacy considerations are left out of XR initiatives, particularly if students are using devices in their home, where the expectation of privacy is so high,” he said. “XR devices collect a lot of [personally identifiable information] that could trigger [Family Educational Rights and Privacy Act (FERPA)] protections.”
When it comes to student privacy and data practices, Contis said it’s important to take note of just how much data can actually be collected through the use of XR devices. He said XR headsets can scan and analyze the space around students wearing them in detail, similar to room scans used for proctoring, and can also track students’ unique movements, interactions with objects, facial features and biometric data, among other data points. Institutions, he said, must know whether such data is stored locally on the device or in the cloud, and whether the collection itself could violate current federal or state student privacy regulations.
“With each new end user computing device, more data and personal information is collected,” Contis said. “To understand how much data is collected by XR devices, we need to consider the sensors this new class of devices has. In fact, XR headsets have a lot more sensors than a phone.”
In addition to carefully reviewing data collection practices and terms of service before adopting new devices, Contis said universities must consider the implications of requiring students to use their student accounts when using or logging into XR devices, as well as how to manage those devices and keep them up to date.
“Analyzing device manufacturers’ data collection and use practices and processes to address concerns is not enough,” he said. “The same needs to apply to XR application developers or other service providers themselves, as they may be collecting potentially sensitive and personally identifiable information as part of delivering a specific service or app functionalities.”
Other considerations include the danger of physical and psychological harm from the use of these devices, such as users bumping into objects in the room while using headsets or getting “cyber sickness,” a form of disorientation or nausea that can come from seeing motion without physically being in motion. LaFosse added that these devices, used by so many students, could also raise concerns about hygiene and sanitation.
Noting potential liability concerns in this area, he said universities should clearly communicate lab safety procedures and possible dangers to students using XR technologies.
“As much as XR is a great technology, the risk of harm either to yourself or others is unfortunately real and should not be underestimated,” he said.