Virtual reality, or virtual environment, systems are, in general, systems in which a user is interactively interfaced to a computer and engaged in a 3D visual task. The computer provides a virtual domain supporting 3D models of objects or complete environments and, given suitable transducers, the user can interact with the system in real time.
Features such as immersion, real-time interaction, 3D graphics and force feedback are well-known system concepts and have been used in flight simulation for over a decade.
Commercial airlines (e.g. Boeing) and the military (e.g. the US Army) have been using flight simulators for over twenty years. These simulators are used to train pilots in developing new skills, handling aircraft under unusual operating conditions and discovering the flight characteristics of a new aircraft.
In a flight simulator, such as the Boeing 747 simulator, the control panel in the cockpit is identical to the one in a real plane. Outside the windows are displays generated by sophisticated computers -- the so-called Computer Generated Images (CGI). When the trainee "takes off" in the simulator, he sees an identifiable airport and its surroundings. The simulation of Boeing Field, for instance, might show a fuel truck on the runway and Mount Rainier in the distance. The pilot also hears the rush of air around wings that are not actually there, and the clunk of non-existent landing gear retracting. Six hydraulic systems under the simulator tilt and shake the cockpit. The effect is so convincing that the trainee feels he or she is actually controlling a real plane!
Modern military and civilian aircraft are very expensive to construct. They require sophisticated engines, accurate navigation systems, reliable communication systems, elaborate safety systems, etc., which make the production cost of each aircraft enormous. Because of this high production cost, it is not a good idea to train would-be pilots in such aircraft -- hence the evolution of the flight simulator.
Although a flight simulator is itself expensive, it has proven to be the most cost-effective method for training pilots: it can be operated without risking the real aircraft, and it enjoys a long service life.
As would-be pilots are just trainees, they have little experience in operating an aircraft, so there is a high probability that they would crash and damage a real aircraft, endangering their own lives as well as creating a hazard to the general public. With flight simulation this problem does not arise, so simulators can be used to train pilots effectively.
For a flight simulator to be useful, it must be able to convince the pilot that he or she is actually inside the cockpit of a real plane. As a result, the flight deck of a specific craft, such as an Airbus A320, Concorde or a Boeing 767, is accurately reproduced to create the 'look' and 'feel' of the real thing. Furthermore, every instrument must function identically to its real-world counterpart. This implies that fuel gauges must react to the rate at which the imaginary engines consume fuel, which in turn must accurately reflect thrust and temperature characteristics.
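The fuel-gauge example can be sketched as a toy model. The function below is purely illustrative -- its linear fuel-flow law, names and constants are invented, not taken from any real simulator -- but it shows the basic idea of an instrument reading driven by the simulated engines' thrust setting.

```python
# Toy instrument model (hypothetical): the fuel-gauge reading falls at a
# rate determined by the simulated thrust setting. All constants invented.

def burn(fuel_kg, thrust_fraction, dt_s, idle_kgps=0.3, max_extra_kgps=2.0):
    """Fuel remaining after dt_s seconds at the given thrust setting (0..1)."""
    flow = idle_kgps + thrust_fraction * max_extra_kgps  # kg per second
    return max(fuel_kg - flow * dt_s, 0.0)

fuel = 10000.0
fuel = burn(fuel, thrust_fraction=1.0, dt_s=60.0)  # one minute at full thrust
print(fuel)  # 9862.0
```

A real simulator would replace the linear flow law with measured engine data, but the instrument would be driven in the same way: from the flight model's state, not from any real fuel.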
In real life we obtain motion cues from the ear's vestibular system, from visual data and from the sensation of touch. Our ears are vital for maintaining balance and indicating spatial orientation, while visual cues provide overwhelming data relating our speed to external objects. Civilian flight simulators attempt to reproduce motion effects by accelerating the cockpit environment with the aid of powerful hydraulic rams, while military simulators employ further devices such as g-suits that simulate inertial forces.
The visual system is the component responsible for creating and projecting the images of runways and airports needed for the various training exercises. The displays seen through the "windows" are generated by computer image generators.
Immersion is an important feature of virtual reality systems as it is central to the paradigm where the user becomes part of the simulated world, rather than the simulated world being a feature of the user's own world. In a flight simulator the immersion is achieved by a mixture of real hardware and virtual imagery.
Real cockpits are constructed from arrays of instruments, joysticks, levers, switches, buttons, sliders, etc., each with its own mechanical characteristics. Pilots are constrained to floor-mounted chairs, and during take-off and landing scenarios they are restrained by seat belts.
Despite automatic landing systems, the majority of landing approaches are made manually, so pilots depend upon what they see through the cockpit windows. This is where computer graphics complements the real cockpit environment to create the illusion of flying.
In order to achieve this, the flight simulator makes use of Rediffusion Simulation's WIDE system, which comprises three elements: the projectors, a translucent back-projection screen and a panoramic mirror.
Typically, three projectors form a seamless coloured image upon the back-projection screen, each projector covering a 50° horizontal field of view. The translucent back-projection screen is mounted above the cockpit, out of view of the pilots; the image on the screen is then seen as a reflection in the panoramic mirror.
With the correct juxtaposition of the projectors, screen and the mirror, the pilot and co-pilot see a virtual image several metres beyond the physical domain of the cockpit.
On a 3-channel system where the field of view is approximately 150°, the image is created to a definition of approximately 800 lines and the entire image contains in excess of 2 million pixels. The update and refresh rate is 50 Hz for day scenes, and 30 Hz for dusk/night scenes. The higher refresh rate is required for the bright daytime images to prevent flicker. By reducing the refresh rate for the dimmer night scenes extra time is made available to superimpose calligraphic light points upon the raster image. The light points are used to represent runway lights, stars, moving traffic and even particle systems for modelling snow.
To provide the pilot with a convincing illusion of flying, the simulator's display must have high resolution. As computer graphics developed and processing power improved, image fidelity increased; a modern image generator can create coloured images at an update rate of 50 Hz with a polygon count of some 1000 polygons per image. In flight simulation the virtual 3D world may encompass 100 square miles, which means that some objects will appear very small when viewed in perspective. Consequently, certain features of the environment are modelled at two or three levels of detail, and the image generator selects the appropriate level to keep the polygon count to a minimum.
There are times when the pilot's world must physically interact with the virtual 3D world. For example, when the aircraft makes contact with the virtual runway, the motion system must respond with a transient to simulate the undercarriage striking concrete obliquely at 200 knots.
To achieve this real-time interaction, the data are maintained in the computer system modelling the flight dynamics of the simulated craft, and are made available to the image generator at a rate of 30 Hz. As the update rate of the image generator is of the order of 50 Hz, the heading samples are interpolated to derive intermediate values.
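The 30 Hz to 50 Hz interpolation can be sketched as follows. The rates come from the text; the code itself is an illustrative linear interpolation (a real system would interpolate all state variables, not just heading, and must take the shortest way around the compass when heading wraps through north).

```python
# Illustrative resampling of 30 Hz flight-model heading samples to a 50 Hz
# display rate by linear interpolation. Names and structure are hypothetical.

def interpolate_heading(h0, h1, t):
    """Interpolate between two headings (degrees), taking the shortest way
    around the compass; t is in [0, 1]."""
    delta = (h1 - h0 + 180.0) % 360.0 - 180.0
    return (h0 + t * delta) % 360.0

def resample(samples, src_hz=30.0, dst_hz=50.0):
    """Produce display-rate headings from flight-model-rate samples."""
    frames = []
    n = int((len(samples) - 1) * dst_hz / src_hz)
    for i in range(n + 1):
        pos = i * src_hz / dst_hz            # position in source samples
        k = min(int(pos), len(samples) - 2)  # index of the earlier sample
        frames.append(interpolate_heading(samples[k], samples[k + 1], pos - k))
    return frames

print(resample([350.0, 10.0, 30.0]))  # a smooth turn through north
```

Note the modulo arithmetic in `interpolate_heading`: naive interpolation from 350° to 10° would swing the view the long way round through 180°, a visually catastrophic error.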
Once the image generator knows the heading of the plane, there is a delay of approximately 0.06 seconds before the pilot actually sees the image corresponding to this position. This transport delay arises from the time required to 'walk' through the hierarchical database and retrieve the relevant geometry; apply the perspective transformation and clip surfaces to the viewing volume; and render the image into a frame store.
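One plausible way to account for the 0.06-second figure -- an assumption on our part, not stated in the text -- is that the three pipeline stages (database traversal, transform and clip, rendering) each occupy one frame period at the 50 Hz rate:

```python
# Hypothetical accounting for the ~0.06 s transport delay: assume each of
# the three pipeline stages named in the text takes one 50 Hz frame period.

frame_period = 1.0 / 50.0  # 20 ms per frame at 50 Hz
stages = ["database traversal", "transform and clip", "render to frame store"]
transport_delay = len(stages) * frame_period
print(f"{transport_delay:.2f} s")  # prints "0.06 s"
```

Under this reading, the delay is three frames of pipelining rather than any single slow stage, which is why raising the frame rate is the only way to reduce it.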
Placing the simulator on a motion platform permits the pilot's virtual world to be subjected to some of the forces encountered when piloting a real plane. For example, on take-off the vibration introduced by the undercarriage riding over the runway can be simulated quite effectively, and even the dynamics of the plane's suspension can be included. When landing, the actual point of contact between the plane's undercarriage and the runway must be computed; when contact occurs, the platform must be driven with sufficient force to mimic the powerful interaction that takes place at this point.
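The touchdown check can be sketched very simply. The function below is a hypothetical illustration -- the gain and the linear jolt law are invented -- but it captures the idea: when the gear height reaches the runway, the vertical speed at contact sets the strength of the transient commanded to the platform.

```python
# Hypothetical touchdown detection: at the moment of contact, command a
# platform jolt proportional to the descent rate. Gain is illustrative.

def touchdown_transient(gear_height_m, vertical_speed_ms, gain=0.5):
    """Return a platform jolt command (m/s^2) at the moment of contact."""
    if gear_height_m <= 0.0 and vertical_speed_ms < 0.0:
        return gain * abs(vertical_speed_ms)  # harder landing, bigger jolt
    return 0.0

print(touchdown_transient(0.0, -2.0))  # firm landing: non-zero jolt
print(touchdown_transient(5.0, -2.0))  # still airborne: no jolt
```

A real simulator would compute the contact point per wheel against the runway geometry and feed the result through the undercarriage's suspension model, but the trigger condition is the same.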
The attitude of the platform is also used to simulate the forces resulting from accelerating and decelerating. By tilting the platform backwards, the pilot feels pushed back into his seat and imagines being accelerated forward, whereas tilting the platform forward simulates a sensation of decelerating. However, one of the most difficult tasks for the motion software is to keep the platform in a central quiescent position from which it can move in response to the pilot's next manoeuvre.
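Returning the platform to its central quiescent position is conventionally done with a "washout" filter -- a standard motion-cueing technique we assume here, since the text names the problem but not the method. The sketch below passes the onset of a commanded acceleration to the platform, then slowly decays the platform's motion back toward centre, ideally below the pilot's perception threshold.

```python
# Minimal first-order washout sketch (an assumed technique; all constants
# are illustrative): the platform responds to acceleration onsets, then
# drifts back toward its central quiescent position.

def washout(accels, dt=0.02, tau=2.0):
    """Return the platform position over time for a commanded acceleration
    trace; tau is the washout time constant in seconds."""
    pos, vel, out = 0.0, 0.0, []
    for a in accels:
        vel += a * dt            # integrate commanded acceleration
        vel -= (dt / tau) * vel  # slow decay washes velocity out
        pos += vel * dt
        pos -= (dt / tau) * pos  # and washes position back to centre
        out.append(pos)
    return out

# A 1 m/s^2 surge for half a second, then nothing: the platform moves,
# then drifts back to centre, ready for the pilot's next manoeuvre.
trace = washout([1.0] * 25 + [0.0] * 475)
print(max(trace), trace[-1])
```

The design tension is exactly the one the text identifies: the washout must be slow enough to be imperceptible, yet fast enough that the platform is recentred before the next manoeuvre demands travel.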
Force feedback is also introduced into the flying controls which are generally connected via cables to the surfaces controlling the plane's flight characteristics.
In a flight simulator the virtual worlds are based upon actual international airports, which are constructed from plans, maps, photographs and site visits. Although the airport is the central feature of the database, it is necessary to include surrounding detail up to a radius of approximately 10 miles. Creating such databases requires special software tools that can cope with features such as: scene complexity, level-of-detail management, textures, colour, animation sequences and hidden-surface removal strategies.
Flight simulation is also used in a military context. Here there may be more than one trainee in the virtual world, interacting with one another, so a more sophisticated environment is needed -- Advanced Synthetic Environments (ADS), which will integrate computer models, actual warfighting systems and weapon system simulators. Synthetic environments (SE) are simulations that represent activities at a high level of realism, from simulations of theatres of war to factories and manufacturing processes. These environments may be created within a single computer or a vast distributed network connected by local and wide area networks, augmented by super-realistic effects and accurate behavioural models. They allow complete visualisation of, and total immersion into, the environment being simulated. Entities within ADS will be distributed geographically and connected through a high-speed communications network. Warfighters will be able to train as they fight within this synthetic theatre of war. Synthetic environments will allow the military to train forces, develop doctrine and tactics, assess the status of operations, evaluate operational plans, conduct "what if" analyses on those plans and rehearse missions. This kind of Modelling and Simulation (M&S) technique will provide a cost-effective method for conducting joint exercises anywhere in the world.
For this kind of distributed virtual reality technology, however, concurrency and consistency are crucial: the more entities there are in the same environment, the greater the time delay becomes. If all users are to interact effectively, greater bandwidth and clever message-passing schemes are needed to accommodate more users in the same distributed virtual world. The US Army, for example, uses SIMNET for its networked simulators.
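One such clever message-passing scheme, used by SIMNET and its successor DIS protocols, is dead reckoning: each simulator extrapolates remote entities from their last reported position and velocity, and an entity only broadcasts a new state update when its true position drifts past a threshold. The class and numbers below are an illustrative sketch, not the actual protocol.

```python
# Illustrative dead-reckoning sketch (the real SIMNET/DIS protocols differ
# in detail): broadcast an update only when extrapolation error exceeds a
# threshold, so steady motion generates almost no network traffic.

class Entity:
    def __init__(self, pos, vel, threshold=1.0):
        self.true_pos, self.vel = pos, vel
        self.sent_pos, self.sent_vel = pos, vel  # last broadcast state
        self.threshold = threshold
        self.updates_sent = 0

    def step(self, dt, accel=0.0):
        """Advance the true state; broadcast only if extrapolation drifts."""
        self.vel += accel * dt
        self.true_pos += self.vel * dt
        self.sent_pos += self.sent_vel * dt  # what remote hosts extrapolate
        if abs(self.true_pos - self.sent_pos) > self.threshold:
            self.sent_pos, self.sent_vel = self.true_pos, self.vel
            self.updates_sent += 1           # one network packet

# Straight flight needs no updates; acceleration forces occasional ones.
e = Entity(pos=0.0, vel=100.0)
for _ in range(100):
    e.step(dt=0.1)                # 10 s of straight flight
straight_updates = e.updates_sent
for _ in range(100):
    e.step(dt=0.1, accel=5.0)     # 10 s of acceleration
print(straight_updates, e.updates_sent)
```

Because an aircraft flying straight and level generates no packets at all, the network load scales with how often entities manoeuvre rather than with the raw update rate, which is what makes large shared exercises feasible.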
With virtual reality techniques, the flight simulation industry provides a cost-effective way of training pilots, whether military or civilian. However, to make the simulation effective -- that is, to give the user the illusion of being in a real aircraft -- sophisticated computer systems are required to provide 3D graphics and force feedback and to handle real-time interaction.
With ever-developing computer technology, virtual reality and distributed virtual reality in flight simulation will continue to develop and improve -- providing higher display resolution and lower time delays, and becoming ever more effective in achieving their purpose.