by Ruwantissa Abeyratne
“The metaverse is best understood as the shift of computing and interaction from a device in your pocket into a virtual simulation.” ~ Matthew Ball
The word “metaverse” has been with us for decades, first introduced as a dystopian setting in Neal Stephenson’s 1992 science fiction novel “Snow Crash”. Yet it still eludes precise definition, and full comprehension, for many of us. Some have gone so far as to call the Metaverse a “3D Internet”. At its simplest, the Metaverse can be described as “a virtual-reality space in which users can interact with a computer-generated environment and other users”. Expanded, the definition becomes “a hypothetical iteration of the Internet as a single, universal and immersive virtual world that is facilitated by the use of virtual reality and augmented reality”.
Somewhere over Russia, en route to Italy. [Photo: Richard Gatley/Unsplash]
This new platform has already shown compelling results in the operating theatre: in 2021, neurosurgeons at Johns Hopkins Hospital successfully used augmented reality during live surgery.
Virtual reality and augmented reality are terms we are already familiar with. The difference between the two (AR and VR) is that AR overlays digital information on the physical world around us, presenting it in three dimensions and relieving much of the cognitive load of interpretation, which is why it is used across a range of commercial enterprises; VR, by contrast, immerses the user in an entirely computer-generated environment, which makes it an ideal application for entertainment. Put simply, AR keeps us in physical reality while VR replaces it with a virtual one. This is what makes AR valuable for the aviation industry. Writing in the Harvard Business Review, Michael Porter and James Heppelmann observe: “At Boeing, AR training has had a dramatic impact on the productivity and quality of complex aircraft manufacturing procedures. In one Boeing study, AR was used to guide trainees through the 50 steps required to assemble an aircraft wing section involving 30 parts. With the help of AR, trainees completed the work in 35% less time than trainees using traditional 2-D drawings and documentation. And the number of trainees with little or no experience who could perform the operation correctly the first time increased by 90%”.
The combination of AR and VR would be significant to both the air transport and airport industries. One could envision AR and VR together being used to simulate weather patterns, cloud formation and turbulence along a flight path, alerting the flight crew before takeoff. The authors of “System for synthetic vision and augmented reality in future flight decks” (Proceedings of SPIE – The International Society for Optical Engineering, June 2000) write: “Rockwell Science Center is investigating novel human-computer interface techniques for enhancing the situational awareness in future flight decks. One aspect is to provide intuitive displays which provide vital information and spatial awareness by augmenting the real world with an overlay of relevant information registered to the real world. Such Augmented Reality (AR) techniques can be employed during bad weather scenarios to permit flying in Visual Flight Rules (VFR) in conditions which would normally require Instrumental Flight Rules (IFR). These systems could easily be implemented on heads-up displays (HUD)”. This new vision of the flight deck folds AR into the presentation of weather information, surrounding air traffic and terrain.
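The phrase “an overlay of relevant information registered to the real world” boils down, computationally, to projecting a point from the aircraft’s surroundings onto the pilot’s display. The sketch below shows that step under simplifying assumptions (a pinhole model, with the traffic or terrain point already expressed in the pilot’s viewing frame); the function name, field of view and display resolution are illustrative and are not drawn from the Rockwell system.

```python
import numpy as np

def project_to_hud(target_xyz_m, fov_deg=30.0, hud_px=(1024, 768)):
    """Project a point given in the pilot's viewing frame (x right, y down,
    z forward, in metres) onto HUD pixel coordinates with a pinhole model.
    Returns (u, v) in pixels, or None if the point falls outside the HUD."""
    x, y, z = target_xyz_m
    if z <= 0:                                        # behind the aircraft: nothing to draw
        return None
    w, h = hud_px
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)     # focal length in pixels
    u = w / 2 + f * x / z                             # horizontal pixel position
    v = h / 2 + f * y / z                             # vertical pixel position
    if 0 <= u < w and 0 <= v < h:
        return u, v
    return None

# Example: another aircraft 200 m to the right, 50 m below, 3 km ahead.
symbol = project_to_hud((200.0, 50.0, 3000.0))
if symbol is not None:
    print(f"Draw traffic symbol at HUD pixel {symbol}")
```

In a real flight-deck system the target would first be transformed from geographic coordinates through the aircraft’s attitude and boresight calibration; the projection itself, however, is essentially the step sketched here.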
In the airport industry, this has already become a reality under the name of “The Digital Twin”. Hong Kong International Airport, for example, has its own Digital Twin, in which airport staff use a live 3D simulation to plan where passengers, gates and aircraft should be located and directed. Digital Twins are also in use at Amsterdam’s Schiphol, San Francisco International and Vancouver International Airport.
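As a rough illustration of the kind of planning decision such a live simulation supports, here is a hypothetical sketch of gate assignment against a simplified schedule; the flights, gates and turnaround buffer are invented for the example and do not reflect Hong Kong International Airport’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Flight:
    flight_no: str
    arrival_min: int     # scheduled on-block time, minutes after midnight
    departure_min: int   # scheduled off-block time

def assign_gates(flights, gates, buffer_min=15):
    """Greedily assign each flight to the first gate that is free, keeping a
    turnaround buffer between consecutive aircraft on the same gate.
    Returns a dict of flight number -> gate (None if no gate is available)."""
    free_at = {g: 0 for g in gates}        # minute at which each gate becomes free
    plan = {}
    for f in sorted(flights, key=lambda f: f.arrival_min):
        gate = next((g for g in gates if free_at[g] + buffer_min <= f.arrival_min), None)
        plan[f.flight_no] = gate
        if gate is not None:
            free_at[gate] = f.departure_min
    return plan

flights = [Flight("CX100", 480, 540), Flight("KA231", 500, 560), Flight("UO622", 545, 600)]
print(assign_gates(flights, gates=["G1", "G2"]))
```

A production twin would of course optimise against many more constraints (aircraft size, towing, passenger connections), but the decision it supports is of this shape.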
The Digital Twin is a virtual replica of every aspect of airport operations and performance, built “to maximise efficiency and increase capacity in a more timely and cost-effective way”, as an article in the magazine Passenger Terminal World reports. Its most valuable function is to alert the airport, around the clock, to problems that are about to arise and to flag them to operations staff, who can then head off the threat or operational difficulty before it materialises. It also highlights points that could inconvenience or delay passenger flows, helping the airport avoid congestion.
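Conceptually, this alerting function amounts to running the twin’s forecast and flagging whatever crosses an operational threshold. The toy sketch below assumes the twin already produces predicted waiting times per checkpoint; the checkpoint names and the 20-minute threshold are invented for illustration.

```python
def flag_congestion(forecast_wait_min, threshold_min=20):
    """Given a dict of checkpoint -> predicted wait (minutes) produced by the
    digital twin's simulation, return the checkpoints that should be flagged
    to operations staff before the congestion actually materialises."""
    return {point: wait for point, wait in forecast_wait_min.items() if wait >= threshold_min}

# Example forecast for the next hour (values are illustrative).
forecast = {"Security North": 27, "Security South": 12, "Transfer Desk": 22, "Immigration": 9}
for point, wait in flag_congestion(forecast).items():
    print(f"ALERT: {point} predicted at {wait} min wait -- open extra lanes or reroute passengers")
```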
Another useful purpose of the Digital Twin is that it can alleviate passenger stress. One example is offering a virtual run-through of the airport and flight experience ahead of the real thing, so that anxious passengers arrive better prepared. One group that particularly benefits from this is the autistic community.
The Digital Twin can also offer insights into the future. If an airport has an aspirational goal of net-zero carbon emissions by 2030, for example, it can model aircraft and ground-vehicle movements as well as other airfield activities, and feed those models into machine learning that points to the most efficient way the airport can be run. Even at the planning stage, the Digital Twin can offer the best design iteration, as Schiphol has done by applying building information management software to generate a 3D digital version of the physical and functional characteristics of its airport infrastructure.
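As a hedged illustration of how movement data might feed such a model, the sketch below fits a simple least-squares relationship between daily movements and measured airside CO2, then compares two operating scenarios; the figures are invented, and a real digital twin would draw on far richer data and models.

```python
import numpy as np

# Illustrative daily records: [aircraft movements, ground-vehicle hours] and
# measured airside CO2 in tonnes. In practice these would come from the
# airport's own operational and metering data.
X = np.array([[610, 950], [580, 900], [655, 1010], [700, 1100], [630, 980]], dtype=float)
y = np.array([820.0, 780.0, 880.0, 950.0, 845.0])

# Fit y ~= a*movements + b*vehicle_hours + c by ordinary least squares.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predicted_co2(movements, vehicle_hours):
    return coef[0] * movements + coef[1] * vehicle_hours + coef[2]

# Compare today's operation with a scenario that electrifies part of ground handling.
print("Baseline:", round(predicted_co2(650, 1000), 1), "t CO2")
print("Scenario:", round(predicted_co2(650, 700), 1), "t CO2")
```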
Our physical world is three-dimensional. Yet, far advanced as we are in the digital age, most of our work is still done through two-dimensional vision and applications. Whether we look at a computer screen or a smartphone, we do not have the true picture until we translate the two-dimensional information we receive into the three-dimensional practicality of the real world. This act of translation imposes a demand on our mental capacity and takes time to decipher practical reality; that demand on the brain is called “cognitive load”. AR greatly diminishes cognitive load by converting data obtained by two-dimensional methods into images and animations that instantly give us a picture of the real world. In their Harvard Business Review article “Why Every Organization Needs an Augmented Reality Strategy”, Porter and Heppelmann note: “[T]oday, most AR applications are delivered through mobile devices, but increasingly delivery will shift to hands-free wearables such as head-mounted displays or smart glasses”.
Dr. Abeyratne is the author of Aviation and the Carbon Trade, Aviation and the Environment, and Aviation and Climate Change: In Search of a Global Market Based Option.