You see the intricate details of its dry-stone walls and how each of the carved stones fits perfectly together. You walk around as if in the footsteps of 15th-century explorers.
Close your eyes again. This time when you open them, you stand ready to play tennis on the court of the French Open.
While at one time these different activities could only be experienced with extensive travel, immersive technologies are now bringing these experiences to life right in your living room.
Immersive technology is a growing platform through which we can experience the world around us – just like texting, video games, photos and videos. We can use this new medium to creatively interact with the digital and real worlds.
Currently, this technology comes in three main forms: virtual, augmented and mixed. Each will change your perception of reality in a slightly different way.
In virtual reality (VR), you are transported into an entirely virtual space and immersed in an artificial environment.
Your physical presence and environment are simulated, and your vision and hearing are often blocked, or occluded, from external perception. Instead, you see and hear a digital experience projected onto a screen. Everything you see and hear is created in the hardware.
“There are some virtual reality experiences where you don’t move at all,” says Michael Gjere, who leads a technical effort with these technologies at 3M. “There are others where you can walk around and enjoy the experience while interacting with your environment.”
For instance, you can take a virtual drive on the Pacific Coast Highway in California or take the driver's seat in a real NASCAR® race while being able to look around the interior of the car.
In an augmented reality (AR) experience, your present reality is enhanced. You can still see the real world around you, but a digital experience is overlaid on top of your environment – whether it’s through a mobile device or a headset.
You’ll see a real-world environment with elements supplemented, or augmented, by computer-generated sensory input, like sound, video, graphics or GPS data.
Augmented reality is different from virtual reality in that your vision and hearing are not occluded from external perception. AR enhances your current perception of reality, while VR creates a new reality.
With AR, you may be able to interact with the inside of a car engine or human heart and see how things work. Or, you may be able to see what a piece of furniture would look like in your home, or view a chair or sofa in different textures and colors before purchasing.
AR can also allow you to see more of your world, letting you navigate stores and museums with overlaid directions or figure out where to eat by scanning a restaurant to see food reviews. You may even be able to try on a new wardrobe without going into a store – all by seeing how clothing will fit on a virtual mannequin that has your same measurements.
Mixed reality is the newest addition to the immersive technologies. As in VR, your vision and hearing are occluded from reality – but your real environment can be viewed through a camera lens and integrated into the experience. Or, you can digitally alter your real environment on the device screen.
“For example, I can turn you into one of the Jetsons,” says Michael.
Typically, you view this experience through a device that uses a lens to merge your environment with the digital experience, allowing physical and digital objects to co-exist and interact in real time.
All of these experiences come with an emotional, psychological and physical reaction – one that can be so intense that it could be difficult to discern which reality is real. “You don’t realize how engaged you get into these experiences until someone bumps into you,” explains Michael.
But, that can all be ruined if you’re distracted by a clunky headset or feeling nauseous.
How do scientists make sure digital reality is as seamless as possible?
Jo Etter and Tim Wong, two product development engineers working with AR/VR technologies at 3M, explain.
“Right now, virtual reality headsets are really large,” says Jo. “We say that they’re kind of like having a shoebox on your head.”
Tim adds, “We think that smaller devices will be more comfortable. They can be made lighter weight, and we think that’s really what’s going to allow this market to grow.”
A team of 3M scientists uses various 3M technologies, such as microreplication and multi-layer optical films, to create solutions that enable smaller devices and more efficient optical components in AR and VR headsets.
The 3M team is working to create optical components for head-mounted displays that help provide more engaging images. The optics are designed with the understanding that, to make the experience as realistic as possible, you shouldn’t feel like you have a piece of equipment on your head.
“The components we’re developing increase efficiency and improve the profile of the devices,” says Jo. “They're designed to enable smaller headsets, similar to glasses or goggles.”
You may be surprised to know that these optical components are designed on a computer and brought to life through machine processes.
“We get to use a variety of lab equipment, such as laser cutters and milling machines, for fabrication. Then, we measure what we make and iterate to create the next generation of products,” says Jo.
She explains that scientists measure the properties of light passing through the lens of an optical material at different spots. The result can lead to higher resolution, a wider field of view and, ultimately, a better image.
Quality images are important, but not if they lead to a headache. The scientists know that wearing a comfortable headset extends beyond the physical equipment. How you feel while wearing it matters, too. The main goal is to view images from the display without experiencing eye strain.
"One of the problems with some optics is they can create headaches and eyestrain when people wear them,” says Tim. “3M scientists are exploring optical materials designed to let those optics be smaller, lighter weight, more compact, and higher-efficiency.”
The end result? Less eye strain.
So now that we’re comfortable, how do we make these experiences feel real?
That’s where Chris Brown comes in. He’s a human factors psychologist with experience in human cognition and perception. In his role at 3M, he uses video game development software to create interactive digital experiences.
“You can create any interactive 3D environment that you would want, helping you build environments that feel realistic,” says Chris.
In one of those environments, Chris created an urban downtown area, an emergency response area and a construction zone.
He says that materials, patterns and colors can make these environments feel realistic – and to do that, Chris gets to play with math.
“Making a material is basically matrix algebra in a visual form,” he says.
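To make that idea concrete, here is a minimal, hedged sketch of what “matrix algebra in a visual form” can look like: applying a material tint to an RGB color by multiplying the color vector by a 3×3 matrix. The function name and the matrix values are illustrative assumptions, not the actual shader math Chris uses.

```python
def apply_material(matrix, color):
    """Multiply a 3x3 material matrix by a 3-component RGB color vector."""
    return [sum(matrix[r][c] * color[c] for c in range(3)) for r in range(3)]

# A hypothetical "warm tint" material: boost red, slightly dampen blue.
warm_tint = [
    [1.10, 0.00, 0.00],
    [0.00, 1.00, 0.00],
    [0.00, 0.00, 0.85],
]

white = [1.0, 1.0, 1.0]
tinted = apply_material(warm_tint, white)
print(tinted)  # red channel above 1.0, blue channel below 1.0
```

In a real engine, this same multiplication happens per pixel on the GPU – which is why tuning a material often amounts to tuning matrix entries.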
One big challenge is making sure that the light seen in these environments looks and feels the same way it does in real life.
For the retro-reflective materials Chris created, the software calculates the angle at which the light comes in and the angle at which it reflects.
“You can tweak how bright the reflection is, what direction it comes in and how quickly it falls off,” says Chris. “Then, you can apply whatever kind of texture you want.”
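A rough sketch of how those knobs – brightness, direction and falloff – can be modeled: a retroreflector returns light back toward its source, so the reflected intensity is strongest when the viewer looks straight back along the beam and fades as the observation angle grows. The cosine-power falloff below is a common shading idiom used here as an assumption; it is not the specific model in Chris's software.

```python
import math

def retroreflection_intensity(light_dir, view_dir, peak=1.0, falloff=50.0):
    """Illustrative retroreflector: brightest when the viewer looks back
    along the incoming light, fading as the observation angle grows."""
    def norm(v):
        m = math.sqrt(sum(x * x for x in v))
        return [x / m for x in v]

    l, v = norm(light_dir), norm(view_dir)
    # Retroreflection peaks when the view direction points back along the light.
    retro = [-x for x in l]
    cos_angle = max(0.0, sum(a * b for a, b in zip(retro, v)))
    # `peak` sets how bright the return is; `falloff` sets how quickly it fades.
    return peak * cos_angle ** falloff

# Viewer looking straight back along the beam: maximum return.
head_on = retroreflection_intensity([0, 0, -1], [0, 0, 1])
# Viewer slightly off to the side: the return drops off quickly.
off_axis = retroreflection_intensity([0, 0, -1], [0.3, 0, 1])
print(head_on, off_axis)
```

Raising `falloff` narrows the bright cone, which is exactly the "how quickly it falls off" adjustment described above.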
In working closely with these technologies at 3M, Michael experiences the excitement around digital reality firsthand – but he says this technology still has a long way to go.
“Augmented reality is in its early teens right now. We’re trying to bring it into adulthood,” he says. “The first people to use the technology were hardcore gamers, and it’s only now trickling into enterprise and more common usage.”
Michael’s prediction is that within two years, these technologies will become universal.
"We all carry supercomputers in our pockets. Why can’t we use them to improve our experiences? That’s what we’re trying to do," says Michael. “Think about having a smartphone in your pocket 10 years ago. We couldn’t believe we could take pictures with it. We couldn’t believe we had GPS in our phones or that our phones could tell us where to park. It’s part of our nature now.”
And if his prediction proves true, Michael says we may be able to use digital reality to benefit many aspects of our lives. “You may be able to hold your phone over a car and see what it would look like if you wanted to buy it in red. You might even be able to pull up the MSRP while you’re at the lot and say, ‘I want to buy this car.’”
Michael adds that it’s important to collect analytics and data from these experiences so we can learn from them and move the technology forward.
“We have a lot of technological capabilities where we can track every single touch, click and glance. What happens in these technologies? We can gather data on that,” says Michael. “We can figure out things we didn’t know we didn’t know.”
Ultimately, Michael doesn’t want to know where the future is headed with this technology. He says, “I want to be surprised and delighted to see where it’s going to take us.”