Cats. Love them or loathe them, one thing is for sure: the Internet, that astounding creation that lays the sum of mankind’s knowledge, wisdom, experience, and expertise at the fingertips of anyone with even a dial-up modem, is chock full of them. From generating TGIF memes to peddling corporate affirmations, where there’s a marketing need there seems to be a feline for the job. But there is one cat who exerted significantly more influence on the development of serious science than the moggies riding Roombas or wearing Catwoman cosplay drag. And that is, of course, Schrödinger’s cat. In fact, in terms of the development of Virtual Reality and its arguably more useful sibling Augmented Reality, Schrödinger’s flea-collared friend might well have kicked it all off. Cats, quantum mechanics, and augmented reality in the cleanroom – oh my!
Today we’re diving into the alternate realities of superpositions of state and examining how they connect with AR/VR technology in the cleanroom, and this journey just about thrills the geek inside each of us.
And so it should, the quantum field long having been the preserve of the geek. When Schrödinger, one of the fathers of modern quantum mechanics, devised his famous cat-in-a-box thought experiment, he was perhaps a prototypical geek at a time when such a status – outside of the scientific establishment – was far from desirable. But we digress. His paradox of superposed states helped prompt later physicists, most notably Hugh Everett, to propose the Many Worlds interpretation – the concept that every observation results in the bifurcation of reality – which leads to the theory that we live in one of an infinite number of possible universes. When we make an observation – for example, that when a coffee mug drops to the floor it will smash – we rule out the opposite observation from our lived reality. But that doesn’t mean it goes away. In fact, in the Many Worlds theory, our observation of the coffee mug breaking apart just caused a splitting of reality into two different states. And while we experience a universe where it did drop and was smashed, there now also exists a reality where this did not occur. And in this way, every decision we make, every observation we note, every fork in the road we take over another births a new reality in which the alternate decision/observation/path was made or taken. And that number of realities quickly adds up.
It’s an interesting thought experiment and one that we could riff on for hours. But managing an infinite number of theoretical realities can get taxing, so let’s confine ourselves right now to looking at just two alternatives within our current, non-bifurcated universe – Virtual Reality and Augmented Reality. Suddenly these two hot topics seem far more tangible and approachable…
Within the scientific community, there can be few people who are unfamiliar with the concept and implementation of Virtual Reality. From the largely failed social simulation experiment that was Second Life to the multitude of multi-user role playing games requiring the kind of AV equipment that teenagers can only dream of, Virtual Reality – the space in which life exists on a pixel-based plane – is now a commonly understood phenomenon. Blurring the boundaries between the world as it is and as we would like it to be, virtual reality offers an escape from the quotidian, immersing the user in a wholly synthetic environment of 3D graphical and enhanced auditory input. It can blend computer-generated imagery (CGI) with genuine video footage with the aim of tricking the mind of the user into accepting the sensorial input as real. From training pilots on flight simulators to ensuring the safety of new construction, VR is a tool that allows for almost complete project customization alongside high efficiency and instant feedback.
And VR is used in a wealth of scientifically exciting ways. Take, for instance, NASA’s VR playground. Located in the Johnson Space Center’s labs in Houston, TX, these facilities allow future astronauts the opportunity to experience a spacewalk while still safely bound by Earth’s gravity.(1) Combining a graphical 3D simulacrum of the exterior of the International Space Station (ISS) with haptic feedback gloves (yes, the same sensory technology we already wear on our wrist in the form of the Apple Watch) and motion trackers, trainee spacewalkers can become familiar with the geometry of the outer structure of the ISS, make virtual contact with objects, and gain critical experience in how objects behave in microgravity. The project is ‘an immersive training facility that provides real time graphics and motion simulators integrated with a tendon-driven robotic device to provide the kinesthetic sensation of the mass and inertia characteristics of any large object (<500lb) being handled.’ In other words, it gives prospective astronauts a real-time, on-Earth experience of what they’ll be dealing with once outside of our own gravitational field.(2)
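To get a feel for why that kinesthetic training matters, here’s a toy sketch – entirely our own illustration, nothing to do with NASA’s actual simulator – of Newton’s second law at work in microgravity. With no gravity or friction, even a brief push leaves a massive object drifting indefinitely, which is exactly the intuition a trainee spacewalker has to build:

```python
def drift_after_push(mass_kg: float, force_n: float, push_s: float,
                     total_s: float, dt: float = 0.01) -> tuple[float, float]:
    """Euler-integrate position and velocity under a brief applied force.

    Returns (displacement_m, velocity_m_per_s) after total_s seconds.
    """
    x = v = 0.0
    steps = int(round(total_s / dt))
    for i in range(steps):
        t = i * dt
        # F = ma while the push lasts; afterwards the object simply coasts,
        # because in microgravity nothing slows it down.
        a = (force_n / mass_kg) if t < push_s else 0.0
        v += a * dt
        x += v * dt
    return x, v

# A gentle 50 N push, applied for 2 s, on a ~500 lb (~227 kg) object,
# observed over 10 s: the object is still moving at the end.
x, v = drift_after_push(227.0, 50.0, 2.0, 10.0)
print(f"displacement ≈ {x:.2f} m, velocity ≈ {v:.2f} m/s")
```

The point of the sketch: the object keeps its ~0.44 m/s of velocity forever unless an equal and opposite push stops it – intuition that is cheap to learn in simulation and expensive to learn on a spacewalk.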
So is Augmented Reality (AR) the same sort of deal? No, not exactly. First coined in 1990 by Thomas Caudell, a researcher at Boeing, the term Augmented Reality was born from Caudell’s work on head-mounted displays designed to guide workers assembling aircraft wiring harnesses. He developed the term to describe the experience of integrating an overlay of sensory input – in the form of usable data – on top of the matrix of actual lived reality. Still confused? It’s the experience of looking at an object or a scene and having all related online data within your field of view and available in real time.
And how is this achieved? In the past, whether sitting in front of a desktop computer or scrutinizing a tablet device, we have relied upon screens to mediate our interactions with the online world, and this mediation inevitably created distance from the data. But for several years already, tech behemoth Google has been busy shortening that distance by bringing the data closer to us. In 2013, Google released its first iteration of Glass, the wearable spectacles that combined a view of the ‘real world’ with an overlay of online functionality. Initially, it did not go well, with sub-par photos, ‘even worse navigation instructions and looking for hate in the eyes of small town passers-by [making] up the bulk of the Google Glass experience,’ according to Andrew Williams, writing in TechRadar.(3) But although the company stopped production of the eye-wear in that format in 2015, the project was never actually shelved.
And that was perhaps a fortunate decision. In myriad industries – aviation, construction, and engineering, to name just a small selection – having the ability to wear an AR device lowers, for employees, what Williams refers to as the ‘cognitive load.’ As an engineer or technician approaches a job, software that ‘reaches out, pulls down all the data, translates it into work instructions and makes that visible to the person’ streamlines the process to the point that the user saves time and resources along the workflow continuum.(4) And this sort of merging with the data is not a luxury. In many situations, having to look away from the task being performed in order to consult a screen is onerous, time-wasting, or in some cases dangerous. By reducing the distance between the user and the data, AR brings the user more fully into the experience. And given that AR-powered task lists can enable standardized and perfected workflows, the vagaries of the human decision-making process are effectively ruled out.
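To make that ‘pulls down all the data, translates it into work instructions’ idea concrete, here’s a minimal sketch in Python. Every function and field name below is hypothetical – it captures the shape of the flow Williams describes, not any vendor’s actual API:

```python
def fetch_job_data(job_id: str, source: dict) -> dict:
    """Stand-in for pulling a job record from a backend system."""
    return source[job_id]

def to_work_instructions(job: dict) -> list[str]:
    """Translate raw job data into the ordered steps shown in the headset."""
    steps = [f"{i}. {task}" for i, task in enumerate(job["tasks"], start=1)]
    # Safety notices are promoted to the top of the overlay, so the
    # technician never has to look away from the task to find them.
    if job.get("hazards"):
        steps.insert(0, "WARNING: " + ", ".join(job["hazards"]))
    return steps

# A hypothetical backend record for an electrical maintenance job.
backend = {
    "JOB-42": {
        "tasks": ["Isolate power", "Remove access panel", "Swap wiring harness"],
        "hazards": ["live circuit nearby"],
    },
}

for line in to_work_instructions(fetch_job_data("JOB-42", backend)):
    print(line)
```

The design point is the pipeline itself: data lives in a backend, gets translated into a standardized step list, and is rendered in the worker’s field of view – which is what makes the workflow repeatable regardless of who is wearing the headset.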
If this all still seems too theoretical let’s take a moment to examine some tangible projects. Over at NASA’s Goddard Space Flight Center, a research lab in Greenbelt, MD, augmented reality is being used to provide technicians with the ability to ascertain whether spacecraft parts will fit within the thermal-vacuum chamber before undergoing testing, thereby eliminating the potential for costly errors. Another in-house project gives technicians the opportunity to become familiar with operating the kinds of robotic arms leveraged on the ISS. In addition, across the country in Pasadena, CA, NASA’s Jet Propulsion Laboratory (JPL) houses Project Sidekick, an immersive, hands-free holographic and real-time view hybrid that maximizes efficiency for operation and maintenance of the ISS. Working in association with Microsoft, JPL’s team is developing the Investigative Immersive Visualization Capabilities (Sidekick) remote assistance device that allows high-definition augmentations to create a ‘reality experience in which holograms are mixed with the real world enabling new ways to communicate and work. The goal of Sidekick is to increase the efficiency of crew activities related to the daily execution of science, utilization and operations aboard the ISS.’(5)
And there’s also the question of collaborative engagement. One of the most useful facets of the VR/AR experience is the ability to bring together disparate team members to collaborate in real time. In an article published in Digital Trends, an online magazine aimed at the HENRY millennial (High Earner, Not Rich Yet), NASA engineer Thomas Grubb noted that ‘The collaborative capability is a major feature [in VR/AR]. Even though they may work at locations hundreds of miles apart, engineers could work together to build and evaluate designs in real-time due to the shared virtual environment. Problems could be found earlier, which would save NASA time and money.’(6)
But if you are not an astronaut in training, what benefit is an augmented reality facility? Purdue’s Envision Center just might hold a clue. As partners in the contamination control industry, we know that establishing and maintaining a cleanroom is an expensive proposition. Not only is the initial start-up cost prohibitive, but the maintenance, supplies, expense of staffing, and keeping it up-to-date in terms of equipment and protocols can take a significant bite out of an organization’s budget. Now suppose the organization in question is an academic institution offering pharmacy technician training. Although any student in that field would benefit from live, hands-on instruction and the opportunity to gain experience in a real cleanroom, not every university or vocational college can offer such an immersive experience. And this is precisely where the virtual – or indeed augmented – equivalent plays to its inherent strengths.
Leveraging a VR/AR format, an entire range of syllabi could be taught to cohorts of students following an always-available schedule across multiple time zones and countries. Best practices could always be followed, the potential for human error would be minimized, and the process and level of detail could be tailored to the needs of each course, syllabus, or student cohort. So how could it work? Let’s imagine that the students are being trained in how to work effectively within an aseptic or contamination-controlled environment. In addition to understanding the routines of proper personal hygiene and hand washing, correct garbing, and so on, they might also be learning about compounding pharmacies or nano-tech in the development of targeted drug therapies and require hands-on experience with laminar flow hoods, glove-boxes, air showers, fume hoods, HEPA filtration, or desiccator cabinets.(7) VR/AR could offer a workable alternative to expensive time in a real-world cleanroom.
And then there’s the question of maintaining the environment. Using an app, students could scan an area and learn the type and order of tasks for completion via a visual overlay. So, standing before a virtual laminar flow hood, the trainee would see both the equipment and an augmented overlay of current SOPs, a breakdown of recent cleaning history, an itemized list of tasks to be done or completed, and the contact information for the manager or learning leader responsible for oversight. And of course, all of this practice in manipulating the alternative technology would be shored up by an additional network of sensors connected with, and relaying data to and from, the Internet of Things (IoT). The potential for this technology truly is virtually limitless…
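As a thought experiment, here’s what the overlay payload for that virtual laminar flow hood might look like in code. Every name and field below is purely illustrative – a sketch of the idea, not any real training app’s API:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class OverlayPayload:
    """Everything the AR headset would render over a scanned piece of equipment."""
    equipment_id: str
    sop_steps: list[str]                      # current SOP, shown step by step
    cleaning_history: list[tuple[date, str]]  # (date, technician) entries
    open_tasks: list[str]                     # tasks still awaiting completion
    supervisor_contact: str                   # manager / learning leader

def build_overlay(equipment_id: str, registry: dict) -> OverlayPayload:
    """Look up the scanned equipment tag and assemble its overlay."""
    record = registry[equipment_id]
    return OverlayPayload(
        equipment_id=equipment_id,
        sop_steps=record["sop"],
        cleaning_history=record["cleaned"],
        open_tasks=[t["name"] for t in record["tasks"] if not t["done"]],
        supervisor_contact=record["contact"],
    )

# A hypothetical registry entry for a virtual laminar flow hood; in a real
# deployment this data would arrive from IoT sensors and a facility backend.
registry = {
    "LFH-01": {
        "sop": ["Sanitize hands", "Don gloves", "Wipe work surface front to back"],
        "cleaned": [(date(2024, 5, 1), "J. Smith")],
        "tasks": [{"name": "Replace pre-filter", "done": False},
                  {"name": "Log airflow reading", "done": True}],
        "contact": "lab-manager@example.edu",
    }
}

overlay = build_overlay("LFH-01", registry)
print(overlay.open_tasks)
```

Filtering completed tasks out at build time means the trainee only ever sees work that still needs doing – one small way the overlay standardizes the workflow rather than merely displaying data.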
There was a time in our not-too-distant past when the only reality we accepted was the one we could see, hear, taste, smell, or touch, devoid of the kinds of geekery that are now commonplace. Thanks to the work of forefathers like Schrödinger (and his cat) and modern pioneers such as Mark Zuckerberg, Facebook’s über-famous founder, Jony Ive, Apple’s creative genius, and Elon Musk, bad boy of global tech innovation, we are now more able than ever before to choose our own reality. To seamlessly blend the ‘real’ real with the virtual, and to wrap it all up in augmented detail. And that new reality, that custom blend, offers so very much more than we’ve ever experienced…if only we are willing to reach out and (virtually) grab it.
Do you work with virtual reality? Are you excited by the potential of AR? We’d love to know your thoughts!