You’re feeling drowsy. You’ve worked a long day on the construction site but you insisted on driving all the way home instead of staying in the B&B the company booked for you. The car lurches, rousing you with a jolt. But soon you’re nodding again, your vision is clouding over, sounds are getting muffled, until you wake with a start and… Bang! Too late to avoid that truck.
It’s frighteningly realistic, and it’s meant to be. Engineering firm Amey, which commissioned this Fatigue Simulator from Holovis, has found that workers who experience the shock of a near miss are more careful to avoid one in future. So it wanted to subject them to one, but in a perfectly safe environment – in this case a multi-sensory driver training simulator that combines 360-degree visuals with immersive audio, synchronised motion and gesture-based interactivity.
It is worth noting that, although we tend to call these things “simulators”, in a technical sense simulation is what is happening behind the scenes. “Simulation is the process of creating a mathematical model of a physical system, then running that model in a computer program to calculate how that system might behave under real conditions,” says Alan Prior, senior director for technical sales at Dassault Systemes.
What we actually experience in a driving or flight “simulator” is visualisation, “the process of creating images and animations of the model itself and the results of that simulation,” as Prior puts it.
Simulations don’t have to be visualised. When the Met Office runs its weather forecasting models it doesn’t need to generate thunder and lightning or replicate the experience of a hurricane. But visualisation (which may rely on sound and motion as much as visuals) enables live people to enter a simulation. This may be an end in itself, as in our example, or so that their reactions can be recorded and assessed, as in the testing of prototype designs.
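Prior's distinction is easy to see in code. Below is a minimal sketch of "simulation without visualisation": a mass-spring-damper model (a crude stand-in for one wheel of a vehicle's suspension) stepped forward in time. The numbers are illustrative assumptions, not figures from any real vehicle; the point is that the mathematical model runs and produces results whether or not anything is ever drawn.

```python
def simulate_suspension(x0=0.05, v0=0.0, m=300.0, k=20000.0, c=1500.0,
                        dt=0.001, steps=2000):
    """Return the displacement history of a mass-spring-damper,
    integrated with the semi-implicit Euler method."""
    x, v = x0, v0
    history = []
    for _ in range(steps):
        a = (-k * x - c * v) / m   # Newton's second law: F = ma
        v += a * dt                # update velocity first...
        x += v * dt                # ...then position (semi-implicit Euler)
        history.append(x)
    return history

xs = simulate_suspension()
print(f"start: {xs[0]:.4f} m, after 2 s: {xs[-1]:.6f} m")
```

A visualisation layer would take the same `history` and turn it into images, sound or platform motion; the model itself is indifferent to how, or whether, it is displayed.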
The end result can be very convincing. “A high degree of fidelity is needed for visualisation so that it creates the virtual environment with precise accuracy and depth, so people forget they’re in a simulator,” says Tom Smith, a simulation solutions architect at Holovis.
“Key users for this type of system are military organisations for training, where they expect to see dust storms, blades of grass and even the effect from the rotor down-wash of a helicopter, all in lifelike detail. Military simulated graphics are now showing an accuracy of 30cm to a metre, so the technology has to be able to display that.”
Super-realism comes at a price, however. “The question is how real do you need it to be?” says Frank Reynolds, European marketing manager at Antycip Simulation. “A classic flight simulator with a copy of the actual plane’s cockpit and a hydraulic platform, supported by high-fidelity, real-time visuals, can be almost as real as flying the actual plane, as all senses are being addressed.
“But that’s very expensive, so the simulation is normally built around the budget and need. A scaled down version can be a spherical 270-degree screen, with a warp and blended multichannel projection system, an office desk with generic flight controls, and a high fidelity software model of the plane.”
Pilots could then build up their experience relatively cheaply in the simpler simulator, and therefore derive more benefit when they graduated to the true-to-life version.
Nor is it always necessary to physically replicate the sensations the user would experience in real life, as racing driver Dean Stoneman explained after a spin in Ansible Motion’s Delta series simulation of the classic circuit at Spa, which combines a six-degrees-of-freedom motion platform and dedicated motion controller with high-end graphics and motion cueing.
“The physical feedback provided by the simulator was extremely realistic. But the engineers say the secret isn’t necessarily to try to replicate the G forces from a real car, but to coax me into behaving as if I were in a real car. Ansible Motion’s technique relies on modelling the human vestibular system (which controls balance and orientation) – effectively ‘mapping’ the desired sensations of movement to the physical motions required from the simulator.”
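One building block behind this kind of motion cueing is the "washout" filter. The sketch below is a first-order high-pass filter applied to a braking signal; it is only an illustration of the underlying idea, not Ansible Motion's actual (and far more elaborate) vestibular-model technique. Because the inner ear mainly senses changes in acceleration, the platform can reproduce the onset of braking and then quietly wash the cue out, returning towards centre without the driver noticing.

```python
def washout(accels, dt=0.01, tau=1.0):
    """First-order high-pass filter: y[n] = a * (y[n-1] + x[n] - x[n-1])."""
    a = tau / (tau + dt)
    out = [accels[0]]
    for i in range(1, len(accels)):
        out.append(a * (out[-1] + accels[i] - accels[i - 1]))
    return out

# A step input: the car suddenly holds 5 m/s^2 of braking.
signal = [0.0] * 10 + [5.0] * 300
cue = washout(signal)
# The platform cue spikes at the onset of braking, then decays toward
# zero even though the modelled deceleration is still being sustained.
```

The driver's vestibular system registers the initial jolt as sustained braking; the simulator only ever had to move a short distance.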
In an ultra-realistic simulator the user’s peripheral vision needs to be completely immersed so they feel as though they are actually inside the data set, says Smith. “Traditionally environments would be brought to life in full or partial domes featuring projection or within curved screens. But recently, as the quality of virtual reality (VR) has started to improve and with the advances in head-mounted displays (HMDs), some situations can be recreated outside this restricted space, allowing users more freedom to train individually or through networked experiences in a group.
“Airlines enhance the display even further with a collimated (perfectly parallel) field-of-view visual system as seen by the pilots. This is where an outside image of the runway and approaches to an airport is focused at infinity, using glass or mirrors to seamlessly blend the image coming from four or five projectors so that it appears infinitely distant.”
The key technical requirements of simulators are high contrast, resolution and brightness, so laser-based projectors are becoming the norm. Screen gain also needs to be optimised to match the projectors.
“As well as having the right resolution media and display solutions, the alignment is a crucial factor, as any overlaps or geometrical misalignment will instantly ruin the illusion,” says Smith. “Optical blending also helps to avoid the ‘birdcage’ effect, when overlapping light patterns aren’t blended correctly and lines can be seen. We use a GVBI Chronos system that puts a very fine mask in front of the projector with a series of tiny lines and holes, so when the two overlap only a certain percentage of light gets through.”
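Edge blending can also be done in software, and the sketch below shows the core arithmetic. Where two projectors overlap, each one's output is ramped down so the combined light stays constant; the ramp has to be computed in linear light and then encoded through the projector's gamma, otherwise a visibly bright band appears across the seam. The overlap width and gamma value here are assumptions chosen for illustration.

```python
def blend_ramps(overlap_px=100, gamma=2.2):
    """Complementary blend ramps for two projectors sharing an overlap zone.

    Weights are chosen in linear light, then encoded into the
    gamma-corrected signal each projector actually receives.
    """
    left, right = [], []
    for i in range(overlap_px):
        t = i / (overlap_px - 1)        # 0.0 -> 1.0 across the overlap
        w_left, w_right = 1.0 - t, t    # complementary weights, linear light
        left.append(w_left ** (1.0 / gamma))
        right.append(w_right ** (1.0 / gamma))
    return left, right

left, right = blend_ramps()
# Sanity check: the displayed light (signal ** gamma) from the two
# projectors sums to 1.0 at every pixel of the overlap.
assert all(abs(l**2.2 + r**2.2 - 1.0) < 1e-9 for l, r in zip(left, right))
```

Skipping the gamma step and ramping the signal values directly would make the summed light dip or peak mid-overlap, which is one way the telltale seam lines appear.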
Elaborate VR installations may use a CAVE, such as the one designed by Holovis for BAE Systems’ Aerospace Academy in Lancashire. This is a four-sided structure with volumetric projection on all four walls, where users wear a head-tracked device that immerses them in the virtual world, which appears to move according to their true perspective. The CAVE also allows groups to share the same virtual space without the restrictions of wearing full headsets. It is used to train people in aircraft assembly and maintenance without the risks associated with working on real planes.
Full, immersive VR is not always either necessary or appropriate, however. Augmented reality (AR), which overlays computer-generated elements on to the user’s view of the real world around them, is having a significant impact on training and maintenance, says Smith.
“AR is being used as an interactive guide: a person points a tablet device, or wears an HMD and looks at an item, and the system recognises real-life parts and mixes them with virtual elements. This could help troubleshoot issues, with the computer scanning the object, comparing what’s present and how it’s functioning against what should be there, and guiding people through procedures for changing consumables or conducting maintenance.”
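The compare-against-expected step Smith describes reduces to a diff between what the system recognises and a reference parts list. The toy sketch below invents part names and a "scan" result purely for illustration; in a real AR system the detected set would come from computer-vision recognition rather than a hand-written list.

```python
def inspect(expected, detected):
    """Return parts missing from the scan and parts that shouldn't be there."""
    missing = sorted(set(expected) - set(detected))
    unexpected = sorted(set(detected) - set(expected))
    return missing, unexpected

# Hypothetical reference list and scan result for one maintenance bay.
expected = {"oil_filter", "drive_belt", "coolant_cap", "air_filter"}
detected = {"drive_belt", "coolant_cap", "air_filter", "loose_bolt"}

missing, unexpected = inspect(expected, detected)
print("replace/refit:", missing)    # -> ['oil_filter']
print("investigate:", unexpected)   # -> ['loose_bolt']
```

The AR layer's job is then presentational: highlighting the missing filter's location in the technician's view and stepping them through the replacement procedure.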
We noted earlier that, although many simulators exist mainly for the user’s benefit, in terms of training and experience, they also enable users’ reactions to be recorded and assessed. The ability to measure live drivers’ responses to a car while it is still on the drawing board, for example, is hugely attractive to motor manufacturers, says Phil Morse, a technical liaison specialist at Ansible Motion.
Another key benefit is repeatability. “With a simulator you know the exact grip of the virtual surface across every square millimetre of its area, and you know exactly what the other vehicles will do,” says Morse. “Getting rid of all the ‘noise’ makes it easier to examine small changes that might be swamped by variability in the real world. Compare that with heading to a test track only to find the weather is different from the previous day or the surface is covered with frost.”
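Morse's repeatability point can be made concrete with a fixed random seed: a virtual test can be replayed under bit-for-bit identical conditions, so a small setup change shows up cleanly instead of being swamped by run-to-run noise. The "lap time" model below is invented purely to illustrate the idea.

```python
import random

def lap_time(setup_tweak, seed=None):
    """Toy lap-time model: a small deterministic effect plus random noise."""
    rng = random.Random(seed)
    base = 90.0 - setup_tweak          # the small change being evaluated
    noise = rng.gauss(0.0, 0.5)        # weather, surface, driver variation...
    return base + noise

# Real world: the noise differs on every run, masking a 0.1 s improvement.
# Simulator: the same seed gives identical 'conditions', so the 0.1 s
# difference between setups is recovered exactly.
a = lap_time(0.0, seed=42)
b = lap_time(0.1, seed=42)
print(f"difference under identical conditions: {a - b:.3f} s")
```

With half a second of noise per run, a real-world test would need many repeats to detect a tenth-of-a-second gain with any confidence; the deterministic replay needs two.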
Simulators enable more people to gain experience of rare or expensive kit, and can save huge sums of money – a one-hour training flight in a Eurofighter can cost £50,000 just for fuel, says Smith.
They are also less dangerous, not only for the user but the machine and its surroundings. “The Goliath crane used to build HMS Queen Elizabeth was simulated and visualised so users could both train on it before using the real thing, and plan the operations and movements of the crane to avoid unnecessary risks,” says Reynolds.
Although today visualisation tends to be the preserve of large organisations with deep pockets, before long we may all be using it. “It’s an odd feeling looking at a life-sized car, so realistic that you can see the reflections of the sky in the metallic paint, sitting inside it and seeing the images in the wing mirrors, then clicking a button on your handset to change the paint colour or the options on the dashboard,” says Prior. “This could be the future of car buying.”
One day it will be possible to link individual simulations into a whole virtual world. “The ability to network systems together in real time is one of the innovations shaping the future,” says Smith.
“In military simulation, for example, interlinking the different aspects of a mission, from flight simulation to ground level JTAC (joint terminal attack controller) and armed vehicles, is allowing full scale training missions to be run at very little cost and with no threat to life.”
If only we could fight all wars that way.