There have been many new developments in extended reality (XR) over the last year. XR refers to all real-and-virtual combined environments and human-machine interactions through wearables and computer technology. Here, we take a look at some of the most significant recent developments in virtual reality (VR), augmented reality (AR) and mixed-reality.
What Is the Difference Between Virtual Reality and Augmented Reality?
First, however, let’s clarify some key terms. What is the difference between virtual reality and augmented reality?
Virtual Reality (VR) is a computer-generated simulation or recreation of a real-life environment or situation, which immerses the user in that reality firsthand by stimulating their vision and hearing. VR is used for two main reasons: (i) gaming, entertainment and play; and (ii) training for real-life environments, such as flight simulators for pilots. A headset such as Facebook’s Oculus or Google Cardboard is typically used. An early enabling technology was VRML (Virtual Reality Modeling Language), a scene-description language that defines a series of 3D objects and details what kinds of interactions can occur with them.
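To make the idea of a scene-description language concrete, here is a minimal VRML97 fragment (an illustrative sketch, not taken from any particular application): it declares a red sphere and attaches a sensor so the scene can react when a user touches it.

```vrml
#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {
        material Material { diffuseColor 1 0 0 }  # red surface
      }
      geometry Sphere { radius 1.0 }
    }
    TouchSensor { }  # emits events when the user clicks or touches the sphere
  ]
}
```

A VRML browser renders the geometry and routes the sensor’s events to behaviors, which is how the format pairs images with interactions.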
Augmented Reality (AR) is a technology that layers computer-generated images over an existing reality that users can interact with. AR is used on mobile devices to blend the digital and real worlds together in a way that enhances both. AR is beginning to enter the mainstream. In sports, for instance, it is used to display score overlays on telecast games or, on mobile devices, to pop out 3D emails, texts or photos. Holograms and motion-activated commands are increasingly being deployed via AR.
VR and AR can sometimes be blended together to make an even more immersive experience. For example, haptic feedback (added vibration and sensation) is used in both formats to create a more lifelike feeling.
Many believe that entertainment will become increasingly immersive and interactive, in part because of the growing potential of AR, VR and mixed-reality technology. The adoption of AR tech for entertainment is increasingly widespread. In 2016, Lucasfilm announced a partnership with Magic Leap devoted solely to experimenting with AR, VR and mixed-reality tech.
Not only is entertainment expected to change with the further adoption of VR, AR and mixed reality; how learning works is also likely to change. One VR experience already being used for training is Karim Ben Khelifa’s “The Enemy,” in which participants take on a soldier avatar, encounter enemy soldiers and ask them questions. It is intended to build empathy between supposed enemies, and has been used in war zones to powerful effect.
Some believe that film and TV could become more immersive through the use of AR to offer additional information on top of what is being seen visually. Similarly, the ways in which people listen to music and audio could change. Doppler Labs was the first company to make AR earbuds “with Real-World Sound Control”, which filter out real-world sounds to allow other sounds to come through, in effect creating an augmented layer on top of real audio to enhance and customize the listening experience. Doppler has since folded, but the product it developed offers a glimpse of what could come and how diversified AR tech could become.
Newspapers and magazines have also been experimenting with AR, for instance by enabling ticket buying through AR features on a reader’s smartphone or bringing new features to newspaper ads.
There are many scenarios in which AR is already being used and brought into the mainstream. Here are two different use cases.
New AR Experience for Professional Sports: Clippers CourtVision
The LA Clippers have just debuted a new augmented reality (AR) experience, Clippers CourtVision, rolled out on opening night of the 2018-2019 NBA season at Staples Center in Downtown Los Angeles. The experience combines computer vision, AI and AR to analyze the court action and translate it into annotations and animations that “bring fans deeper into every game”. Features include selecting from multiple camera angles around the arena, choosing your audio channel (including unfiltered sound direct from the arena), watching live games and recaps on mobile or desktop, and real-time shooting stats.
LA Clippers owner Steve Ballmer, the former CEO of Microsoft, and Second Spectrum, the company behind the technology, say it could be the first step toward a different kind of viewing experience for professional sports. Ballmer is an investor in Second Spectrum, which was founded five years ago by AI experts and became the “official optical tracking provider” of the NBA in 2016, allowing computers to watch live sports and track player and ball movement at a granular level. Machine learning (ML) and AI are then applied to overlay a live NBA stream with data and graphics.
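The core overlay step in a pipeline like this is projecting tracked court positions into the video frame so graphics land on the right players. The sketch below is purely illustrative (the function names, homography values and stat format are our assumptions, not Second Spectrum’s actual system); it maps a court-plane position to pixel coordinates with a planar homography and attaches a stat label.

```python
# Hypothetical sketch of one frame of an optical-tracking overlay.
# All names and numbers here are illustrative, not Second Spectrum's.

def project_to_screen(court_xy, homography):
    """Map a court-plane position (metres) to pixel coordinates
    using a 3x3 planar homography."""
    x, y = court_xy
    h = homography
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    u = (h[0][0] * x + h[0][1] * y + h[0][2]) / w
    v = (h[1][0] * x + h[1][1] * y + h[1][2]) / w
    return u, v

def annotate_frame(players, homography):
    """Build a list of ((pixel_u, pixel_v), label) annotations for one frame."""
    annotations = []
    for p in players:
        u, v = project_to_screen(p["pos"], homography)
        label = f'{p["name"]}: {p["shot_pct"]:.0%} FG'
        annotations.append(((round(u), round(v)), label))
    return annotations

# Toy camera: 100 pixels per metre, no perspective distortion.
H = [[100, 0, 0], [0, 100, 0], [0, 0, 1]]
frame = annotate_frame(
    [{"name": "Player 23", "pos": (4.5, 2.0), "shot_pct": 0.512}], H
)
```

In a real broadcast system the homography would be estimated per camera from the court markings, and the labels would be rendered onto the video rather than returned as tuples.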
“We think everyone will watch sports this way,” said Second Spectrum CEO Rajiv Maheswaran in a preview of the experience for reporters. “There will be a day when you look back and say, I can’t imagine we all used to watch the same thing at the same time. That seems silly.” The new viewing experience is available to FOX Sports Prime Ticket subscribers in the LA area through the FOX Sports app. The AR experience currently runs on a two-minute delay to allow for the computer processing; a traditional TV broadcast usually involves a delay of about 30 seconds. Maheswaran said Second Spectrum plans to narrow that gap to single digits.
AR for Designers: Adobe’s Project Aero
Project Aero is an Adobe AR venture aimed at giving “creative professionals the tools they need to design and publish immersive experiences that will transform the world”. Adobe envisions AR lifting storytelling and design experiences off the page and out of our digital devices to involve the senses more deeply. Adobe is working in partnership with various tech leaders to standardize its interaction models and file formats, and with open standard efforts including usdz and glTF “to deliver a comprehensive AR offering”.
Project Aero lets designers use familiar tools like Adobe Photoshop and Dimension CC to create AR content “in physical spaces, making AR creation more fluid and intuitive”, and to deliver AR experiences to mobile users more easily. Adobe’s support of the project signals the likely move for AR to become a mainstream design experience in a similar way to the move from paper materials to digital.
There are many fascinating use cases for VR underway in areas as diverse as physical therapy and motorsports testing.
VR to Help Stroke Patients
A partnership at a UK hospital is exploring how VR can help stroke patients rely less on human carers, whether family, friends or hospital staff. The University of Chester and the stroke department at the Countess of Chester Hospital (with funding from the UK government) are experimenting with ways in which a VR headset can help patients relearn and practice everyday activities, such as toasting bread.
Nigel John, from the University of Chester, said: “Patients will be able to measure how well their cognitive abilities are improving, building confidence in their ability to perform everyday tasks and reducing the psychological trauma often associated with the condition.”
VR for Motorsports Testing
At Ford Performance’s North Carolina tech center, VR is being used to help test and develop the passenger cars that will end up at local dealerships. The tech center has two simulator rooms, including a racing simulator, which Ford can fit out with the truncated cabin of whichever vehicle it is currently testing. The simulator room is supported by a control center in which engineers monitor the tests. They track what gear a driver is in, how much throttle they are using, and even how their particular driving style might affect fuel mileage or wear and tear in a real-life racing situation. Traction can be programmed in, as can various weather conditions and sound levels and types. Inside the pod, the driver has exactly the same controls and dashboard as they would have in a real car.
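The kind of monitoring described above boils down to streaming telemetry samples and summarizing them. The sketch below is a minimal illustration under our own assumptions (the field names and the 80% throttle threshold are hypothetical, not Ford’s actual schema): it records gear, throttle and speed per sample and computes a crude proxy for how hard a driving style is on fuel and components.

```python
# Illustrative simulator telemetry, with hypothetical field names.
from dataclasses import dataclass

@dataclass
class TelemetrySample:
    t: float             # seconds since session start
    gear: int
    throttle_pct: float  # 0-100
    speed_kph: float

def fuel_heavy_fraction(samples, throttle_threshold=80.0):
    """Fraction of samples spent above a throttle threshold: a crude
    proxy for a driving style's impact on fuel use and wear."""
    if not samples:
        return 0.0
    heavy = sum(1 for s in samples if s.throttle_pct > throttle_threshold)
    return heavy / len(samples)

session = [
    TelemetrySample(0.0, 3, 92.0, 180.0),
    TelemetrySample(0.1, 3, 88.0, 183.0),
    TelemetrySample(0.2, 4, 40.0, 186.0),
    TelemetrySample(0.3, 4, 15.0, 184.0),
]
aggression = fuel_heavy_fraction(session)  # half the samples are near full throttle
```

A real control room would of course correlate many more channels (brake pressure, tire temperatures, fuel flow) against track position, but the shape of the data is the same.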
Improving the VR Experience
Until recently, VR gloves have been heavyweight items with bulky exoskeletons, pumps and thick cables. They may have tracked the movement of the user’s hands, but offered no real sense of touch beyond a kind of rumbling or vibration. Two recent innovations, however, have begun to change that:
Ultra-light Dextres VR Gloves
Scientists from EPFL and ETH Zurich have recently unveiled newly developed thin and light VR gloves, which allow for haptic feedback and an “unparalleled” freedom of movement when touching objects inside a VR experience. The nylon gloves weigh under 8 grams per finger and are 2mm thick, with elastic metal strips over the fingers, separated by a thin insulator, that react when contact is made with a virtual object. They are powered by a thin electrical cable; because the system runs on low voltage, the scientists say it could eventually be powered by a battery instead.
“The human sensory system is highly developed and highly complex. We have many different kinds of receptors at a very high density in the joints of our fingers and embedded in the skin. As a result, rendering realistic feedback when interacting with virtual objects is a very demanding problem and is currently unsolved. Our work goes one step in this direction, focusing particularly on kinesthetic feedback,” Otmar Hilliges, head of the Advanced Interactive Technologies Lab at ETH Zurich, said in a statement.
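At its simplest, the control loop behind a contact-braking glove like this decides, every frame, whether the tracked fingertip has reached a virtual surface and should be clamped. The sketch below is our own illustration, not EPFL/ETH firmware: it models the virtual object as a signed-distance function and engages the brake inside a small contact threshold.

```python
# Minimal sketch of a contact-braking decision for a haptic glove.
# Names, threshold and geometry are illustrative assumptions.

def brake_command(finger_tip, surface_sdf, threshold=0.002):
    """Return True when the tracked fingertip is within `threshold`
    metres of a virtual surface, i.e. when the strips should clamp
    to block further finger motion."""
    distance = surface_sdf(finger_tip)  # signed distance in metres
    return distance <= threshold

def sphere_sdf(point, radius=0.05):
    """Signed distance to a virtual sphere of `radius` m at the origin."""
    x, y, z = point
    return (x * x + y * y + z * z) ** 0.5 - radius

touching = brake_command((0.0, 0.0, 0.051), sphere_sdf)  # just at the surface
free = brake_command((0.0, 0.0, 0.2), sphere_sdf)        # open air
```

Kinesthetic feedback of the kind Hilliges describes comes from running a decision like this per joint at high frequency, so the resistance appears exactly when the virtual contact happens.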
A different type of VR glove was launched earlier this year with the similar goal of letting users feel their interactions with virtual objects. The Sense Glove is a large skeletal hand that users slide their own hands into, fastening attachments onto their fingers. “So basically, Sense Glove enables touch in virtual reality,” CEO Gijs den Butter said. “It does it with force feedback, so you actually are restricted when you’re trying to grasp an object, and with haptic feedback, so you get a little tactile sensation when for example you’re touching hard or slippery objects.” It is primarily used for B2B (business-to-business) training purposes, for example on an assembly line, to teach workers through virtual objects rather than real ones, reducing costs.
Human Eye Resolution
Helsinki-based Varjo Technologies has a powerful mission: to create the first XR/VR product with human-eye resolution. The company has received $31M in funding from Atomico to develop hardware and software products focused on industrial use in order to “deliver the world’s first human-eye resolution VR product for complex and design-driven industries”. The investment is being used to grow its team from 80 to 200+. Varjo’s first product will be an industrial-grade VR/XR headset built for integration with 3D engines and software tools. At 50 megapixels per eye, the prototype’s resolution is roughly twenty times that of consumer devices.
“The resolution of VR devices on the market today is a fraction of what the average human eye can see,” said Atomico Founding Partner and CEO Niklas Zennström. “Until we met Varjo’s visionary founders and experienced their superior product firsthand, we thought that VR was still at least 10 years away from being truly useful for professionals. It’s because of Varjo’s world-class team that industries such as automotive, engineering, aerospace, architecture, construction, industrial design and real-world training simulations won’t have to wait that long to be able to utilize the technology for their business-critical use cases.”
Technical Innovations Powering XR
XR1: The First Chip Dedicated to AR and VR
Earlier this year, Qualcomm launched the XR1, its first chip made specifically for VR and AR devices. The chip is intended to make it cheaper for companies to build entry-level versions of AR and VR gadgets. Qualcomm described XR1 gadgets as being designed for “lean back and 360 viewing” of videos rather than “room scale tracking”, equipped with “simple controllers” instead of hand-tracking abilities. Nonetheless, the chip is capable of supporting 4K displays at 60 fps, voice activation, and controllers that can detect movement in six degrees of freedom. If the XR1 is a success, it will be worth watching what Qualcomm does next, and whether there is commercial potential and profit in a higher-end version of the XR line. Currently, Qualcomm is also re-purposing phone processors for VR and AR, including its flagship phone processor, the Snapdragon 845.
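“Six degrees of freedom” simply means a controller reports both where it is and how it is oriented, while a “lean back” 3DoF controller reports orientation only. A minimal sketch (our own illustration, not Qualcomm’s API):

```python
# 6DoF vs 3DoF controller poses, as plain data. Illustrative only.
from dataclasses import dataclass, astuple

@dataclass
class Pose6DoF:
    x: float      # translation in metres
    y: float
    z: float
    yaw: float    # orientation in radians
    pitch: float
    roll: float

@dataclass
class Pose3DoF:
    yaw: float    # orientation only: the controller can aim,
    pitch: float  # but the system cannot tell where it is in the room
    roll: float

def degrees_of_freedom(pose):
    return len(astuple(pose))

full = degrees_of_freedom(Pose6DoF(0.1, 1.2, -0.3, 0.0, 0.5, 0.0))
lean_back = degrees_of_freedom(Pose3DoF(0.0, 0.5, 0.0))
```

The practical difference is that only a 6DoF device lets you physically reach toward and walk around virtual objects, which is why room-scale headsets require it.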
QCA64x8 and QCA64x1 Chips: Super-Fast WiFi
This month, Qualcomm launched a family of 60GHz WiFi chips that add extremely high-speed WiFi (up to 10 gigabits per second) to phones, laptops, routers and more. Whether or not it catches on in the mainstream remains to be seen. It won’t be used to speed up regular web browsing (WiFi 6, due out soon, is meant to do that). Rather, the new set of Qualcomm chips is intended for a different kind of WiFi standard that is being updated. This kind of WiFi builds on a wireless technology called WiGig, which relies on a connection standard dubbed 802.11ad and can average speeds of up to 5 gigabits per second over close to 10 meters. Dino Bekis, the head of Qualcomm’s mobile and connectivity group, says its latest chips move WiGig to a new version of that standard, dubbed 802.11ay, which he says will be able to reach double the speed at up to 100 meters away. The Wi-Fi Alliance says the new standard “increases the peak data rates of WiGig and improves spectrum efficiency and reduces latency.” 802.11ay is being used as an optional add-on to existing WiFi technology. In some ways, it will be competing with 5G.
It will almost certainly be very useful in VR, for purposes such as replacing the data cable that VR headsets like the Oculus or Vive currently require with a high-speed wireless link.
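A quick back-of-the-envelope calculation (our own arithmetic, not Qualcomm’s figures) shows why these data rates matter for untethered VR: even the 4K 60 fps video that chips like the XR1 target needs on the order of 12 Gbps uncompressed, so a 5-10 Gbps link still requires light compression.

```python
# Bandwidth needed for uncompressed video, in gigabits per second.
# Assumes 24 bits per pixel (8-bit RGB); a real headset link would
# also carry tracking data and use some compression.

def raw_video_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

uhd_60 = raw_video_gbps(3840, 2160, 60)   # 4K at 60 fps
ratio_over_11ay = uhd_60 / 10.0           # vs 802.11ay's ~10 Gbps
ratio_over_11ad = uhd_60 / 5.0            # vs 802.11ad's ~5 Gbps
```

Under these assumptions the stream needs roughly 1.2x an 802.11ay link and 2.4x an 802.11ad one, which is comfortably within reach of modest real-time compression, unlike ordinary WiFi.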
These are just a fraction of the many developments in XR across the space. Experiments with VR and AR are happening in many industries in many forms, seeking out the best ways of using and developing XR for entertainment, learning and many uses not even yet dreamt of.