I recently spent a great morning with Professor Bob Stone of Birmingham University, where he heads up the Human Interface Technologies (HIT) Team. Bob has a background in Psychology and Human Factors and has been working in the areas of Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) since the 1980s.
I first met Bob as part of our work on the Mayflower 400 project. We were developing a digital map and working on a virtual exhibition of the New World Tapestry, while his team was working on a VR version of the Mayflower based on scale drawings and laser scans of the Mayflower II. This was a full-size replica of the original (based on the designs of similar ships) which was built in the UK and sailed to the USA in 1957. Mayflower II is now being restored in time for the 2020 commemoration and will be berthed at Plimoth Plantation in the USA. The idea of the VR Mayflower is to give people an experience of what life would have been like aboard. I was shown the latest virtual reality version of the Mayflower moored in Plymouth, with avatars of the passengers and crew on deck.
The ability to move around a reconstruction of Plymouth Harbour as it was before she set sail for America, and to walk along the quay, was really effective. I especially liked the attention to detail: the seagulls cast their shadows on the buildings as they circled noisily overhead. Yes, one for kids of all ages: look up at one of them and you risk being 'pooped' on your headset! (I did wonder whether, for the full VR effect, they should swoop down to steal some of the passengers' fish and chips, but as that great British meal was not created until the 1860s it is not something the Mayflower passengers and crew would have experienced.) Bob and his team are working on the challenge of making the avatars interactive so they can engage with the public.
The journey to America of the Separatists started on 22nd July 1620, when the Speedwell left Delfshaven bound for Southampton. The Mayflower sailed from Rotherhithe, probably in late July or early August, also bound for Southampton. Her passengers were people who had been recruited by Thomas Weston and the Merchant Adventurers to settle in America after buying a share in the joint stock company. In Southampton the Speedwell had to be repaired, as she had been leaking on the way over from Holland. Both ships were provisioned in Southampton prior to sailing for America, and experienced sailors were recruited locally for the journey. Whilst in Southampton an issue arose over funding and the terms of contract being imposed by Thomas Weston and the Merchant Adventurers, and some of the provisions already paid for had to be unloaded and sold to cover expenses.

Finally, on August 15th 1620, the Mayflower and Speedwell left Southampton bound for America. Although repaired in Southampton, the Speedwell's leak recurred, forcing a stop-off in Dartmouth for further repairs. Both ships then left and had travelled over 300 miles past Land's End when it was found that the Speedwell was leaking faster than they could pump her out, so they returned to England and put into Plymouth. It was agreed that the Speedwell was too unreliable to make the crossing, so as many passengers and provisions as possible were transferred to the Mayflower. Those who could not board her returned to Holland to try again later on different ships.

With 102 passengers and an estimated 30 or so crew, the Mayflower eventually left Plymouth on September 6th 1620. Sixty-six days later, on November 9th 1620, they sighted land in America: it was Cape Cod. Realising they were further north than they had expected (and further north than they had permission to settle), they tried to sail south, but rough seas forced them to search for a suitable settlement in the area where they had arrived. Whilst the men went ashore by rowing boat to explore for suitable locations, the women, children and younger servants would probably have stayed on board. By December 25th 1620 they had finally selected Plimoth as the location for their settlement and started work constructing buildings and fortifications against possible attack by wild animals and any unfriendly Native Americans.

The delay meant that the Mayflower was unable to return to England because of possible winter storms, so she had to over-winter in America and sail back in the spring. It would take until March 1621 before they had constructed enough buildings for everyone to disembark. By then almost half the passengers and crew had died of illness resulting from the conditions on board and the freezing cold winter. Many, especially the women and younger children, would have been living on board for over six months! On 6th April 1621 the Mayflower set sail for England and arrived in Rotherhithe, probably in early May 1621, in a crossing that took less than half the time of the outward journey.
I was also shown a number of the remote-controlled submersibles, surface vessels and UAVs that Bob and his team have designed over the years for various projects and experiments. The latest versions will be tested over the coming months as part of a study to evaluate a new Mixed Reality "remote science station" concept for the proposed Mayflower Autonomous Ship, which is being designed and built to sail to the USA in 2020. It is hoped that the sensing devices on the ship will send a continuous stream of data back to science stations located at numerous sites, allowing members of the public and schoolchildren to actively follow and experience the transatlantic crossing.
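To picture what a station consuming that stream might do, here is a minimal sketch in Python of a station-side consumer. The Telemetry fields, starting position and one-reading-per-second cadence are illustrative assumptions of mine, not details of the actual MAS feed.

```python
import random
import time
from dataclasses import dataclass

# Hypothetical telemetry record: the real MAS sensor payload has not been
# published, so these fields are illustrative assumptions only.
@dataclass
class Telemetry:
    timestamp: float
    lat: float
    lon: float
    sea_temp_c: float
    wind_kts: float

def simulated_feed():
    """Stand-in for the ship's live data stream: one reading per second."""
    lat, lon = 50.37, -4.14  # starting off Plymouth, UK
    while True:
        lat -= random.uniform(0.0, 0.005)  # drifting south-west across the Atlantic
        lon -= random.uniform(0.0, 0.02)
        yield Telemetry(time.time(), lat, lon,
                        sea_temp_c=random.uniform(8.0, 16.0),
                        wind_kts=random.uniform(5.0, 30.0))
        time.sleep(1)

if __name__ == "__main__":
    # A real science station would render these on a map; here we just print them.
    for reading in simulated_feed():
        print(f"({reading.lat:.3f}, {reading.lon:.3f}) "
              f"sea {reading.sea_temp_c:.1f} C, wind {reading.wind_kts:.0f} kts")
```

A real science station would presumably plot each reading on its MR map and show the sensor values alongside, rather than printing them.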
The science station is being built by Bob and his team based on the MR desktop command and control system they are developing for a number of defence projects. What was really fun was that I got to try it out. Based on our previous work developing videos for incident and emergency situation training, we could immediately see the value of an MR command and control system. Imagine a desktop table that converts into an interactive live 3D map. That view is driven by live video feeds from overhead Unmanned Aerial Vehicles (UAVs). Additional sensor data and remote video feeds come in from fixed or movable cameras on the ground, and status data is available via virtual display screens. When you don the virtual reality headset you have a 360° view of what is happening. You can physically move around the table, or zoom in and out by swiping your hands using HTC Vive handheld controllers. Bob and his team are also experimenting with the latest VR gloves. What was even more amazing was using the latest TPCAST untethered headset system, which meant I could move around physically without worrying about reaching the limit of the cable connecting the VR headset to the PC. The VR headset also gave me a constant virtual wall of information on displays that I could either monitor side by side or zoom in to see close-up.
What was even more exciting was that a multi-mode remote system is coming soon, so that I could share the 'incident management' table with others, all of us untethered. Each of us would see the same situation developing, but from our own point of view. Based on our many years' experience of filming incident training exercises, we can see how powerful and valuable this sort of system would be for training and also for live incident management. Commanders can see a situation across the whole area for themselves in real time, rather than relying on verbal reports from those on the ground or having to visit each area personally, using up valuable time and risking missing developments in an area already visited. 'Gold Command' can have incident commanders in the same room, all receiving remote feeds of the incident from teams on the ground, removing the issues and risks that communication lag and physical distance bring. Having a real-time 360° view and the ability to switch between different team views also provides insight and potential advance warning of how a situation is developing. This could enable resources to be deployed or re-deployed earlier, or additional assets to be called on or moved to other locations on standby, saving precious time in a rapidly developing situation.
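To make the multi-user idea concrete, here is a minimal sketch in Python of the state model such a shared table implies: one authoritative scene that every commander observes, with each untethered user holding only a private viewpoint. All names and fields here are my own illustrative assumptions, not the HIT Team's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Feed:
    """A live input to the table: a UAV camera, a ground camera or a sensor."""
    feed_id: str
    kind: str        # e.g. "uav_video", "ground_camera", "sensor"
    position: tuple  # (x, y, z) in the shared map frame

@dataclass
class SharedScene:
    """The single authoritative incident state, shared by every headset."""
    feeds: dict = field(default_factory=dict)

    def update_feed(self, feed: Feed) -> None:
        self.feeds[feed.feed_id] = feed  # latest report wins

@dataclass
class UserView:
    """Per-user state: the same scene, but a private camera pose and zoom."""
    user: str
    eye: tuple = (0.0, 2.0, 3.0)  # position around the virtual table
    zoom: float = 1.0

    def visible_feeds(self, scene: SharedScene) -> list:
        # Every user sees the same feeds; only the viewpoint differs.
        return list(scene.feeds.values())

scene = SharedScene()
scene.update_feed(Feed("uav-1", "uav_video", (120.0, 80.0, 60.0)))
gold = UserView("gold_commander")
silver = UserView("silver_commander", eye=(-3.0, 2.5, 0.0), zoom=2.0)
assert gold.visible_feeds(scene) == silver.visible_feeds(scene)
```

The design point is that the feeds live in one shared structure while camera poses are per-user, which is what lets everyone watch the same developing incident from their own angle.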
I was then shown a VR recreation of HMS A7, a submarine that was sadly lost at sea just before the outbreak of WWI with the tragic loss of all her crew. The reconstruction was done to preserve the records of her, but also to allow others to explore her virtually, as she is a Protected Wreck Site: special licences from Historic England or the Ministry of Defence are required to dive at such sites. Dive trails are being developed by Historic England to allow divers to better prepare for and complete dives safely at such sites. The provision of virtual reality dive trails would help many more members of the public explore the UK's underwater heritage, raising general awareness and enhancing education and research.
Having personally visited the British Museum's BP-sponsored exhibition 'Sunken Cities: Egypt's Lost Worlds' in 2016, I just wonder how much more informative and enjoyable it would have been if I had been able to 'join' the archaeologists underwater on one of their dives, or even see a 3D map of the site to better understand the layout of the city and its objects. Whilst the traditional audio guide I rented was helpful, access to interactive QR codes on a printed guide and on display signage would have given me even greater insight using my mobile phone or tablet. The idea of a subject matter expert, such as a researcher or curator, acting as a virtual 'personal information and education guide' has great appeal. (Note: Focus are developing a virtual exhibition of the New World Tapestry, which has been stored in the reserves at Bristol City Museum for the last 10 years. The virtual exhibition is still being curated and can be seen here.)
I was also able to go aboard a second submarine, in this case a Trafalgar Class boat: a virtual reconstruction of the interior, including all the instrumentation and controls. Using the keyboard and mouse I was able to walk along corridors, climb up and down ladders and open and close compartments and doors. There had been an accident on board a submarine, and the objective was to provide a virtual reality version so Royal Navy personnel could be trained in emergency response situations. Whilst in this case the interior was reconstructed virtually on a computer, it could also have been filmed with a 360-degree camera system (we use six GoPro 4s mounted in a special rig) and the resulting video stitched together.
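On the filmed alternative, the core post-production step is stitching the synchronised camera views into a single panorama. As a rough illustration of the idea (not our production workflow, and with hypothetical file names), OpenCV's high-level Stitcher can combine one frame from each of the six cameras:

```python
import cv2

# One synchronised frame from each camera in the rig (hypothetical file names).
frames = [cv2.imread(f"camera_{i}.jpg") for i in range(6)]

# OpenCV's high-level Stitcher matches features in the overlapping views
# and blends them into a single panorama.
stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print(f"Stitching failed with status {status}")
```

A production 360-degree pipeline would use rig calibration and dedicated stitching software, but the principle of matching features across overlapping views and blending them is the same.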
The ability to virtually recreate physical environments, and to enhance the visual view with sound effects and external stimuli such as temperature and smells, takes room-based training to another level. Whilst this can never fully replace physical training on specially designed rigs, it has many advantages in terms of the physical space needed, the number of people who can be trained, and the cost of creating and using it. This can be especially useful when multiple training scenarios are required, or when staff are at a distance from the training rigs or the rigs have limited throughput.
I was then able to experience a VR tour that Bob and his team have developed of the area around Wembury, Devon. It was really interesting to see the level of detail, and how even the level of light could be altered to simulate or match different times of day. Working with medical teams from the Queen Elizabeth Hospital in Birmingham, they have integrated the virtual reality programme with an exercise cycle that patients can use in their intensive care beds.
Feedback from patients and medical staff has indicated that it is a positive experience. The data will be evaluated so that its medical value can be assessed. The project was recently covered by ITV Central in a news report, which can be seen here.
The final set-up was, for me, the most impressive, because it showed the power and potential of MR. The training scenario was battlefield casualty management, with the casualty being treated on board a helicopter after evacuation. The inside of the helicopter passenger cabin has been physically recreated.
The cabin has been scanned and mapped to create a virtual version, which is then overlaid on top of the physical mock-up, with the computer-generated images displayed in the headset. This means you can reach out in the virtual world and touch the physical object that is really there: in your headset you see your virtual hand touching the virtual object whilst your real hand is touching the real object. Using a cable-free headset you can also move around the casualty and inside the helicopter. To add to the realism, the swaying and buffeting of the helicopter in flight was recreated through motion in the headset; whilst not actually moving, it felt as though you really were. That feeling was further enhanced by a rearward view, with additional virtual soldiers and the ground passing below as we 'flew' over it, while the forward view showed the pilot and co-pilot and the sky passing by.
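The key enabler in a set-up like this is registration: aligning the scanned virtual cabin with the physical mock-up precisely enough that a virtual touch lands on the real object. One standard way to compute such an alignment from a handful of matched reference points is the Kabsch algorithm; the sketch below, in Python with NumPy, is a generic illustration of that technique under my own assumptions, not the HIT Team's actual pipeline.

```python
import numpy as np

def rigid_align(virtual_pts: np.ndarray, physical_pts: np.ndarray):
    """Kabsch algorithm: find rotation R and translation t mapping the
    virtual reference points onto the matching physical ones. Both arrays
    are N x 3; row i of each is the same landmark in the two frames."""
    cv = virtual_pts.mean(axis=0)   # centroid of each point set
    cp = physical_pts.mean(axis=0)
    H = (virtual_pts - cv).T @ (physical_pts - cp)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ cv
    return R, t

# Hypothetical example: four landmarks on a cabin panel measured in both frames.
virtual = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 0, 1]], dtype=float)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0,              0,             1]])
physical = virtual @ R_true.T + np.array([2.0, -1.0, 0.5])

R, t = rigid_align(virtual, physical)
print(R @ virtual[1] + t)  # lands on physical[1]
```

Once R and t are known, every vertex of the virtual cabin can be transformed into the physical frame, so the headset renders the virtual panel exactly where your hand will find the real one.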
Using hand controllers I could examine ‘Steve’ the casualty and ensure the correct action was taken to address his battlefield injuries.
Steve was a realistic-looking dummy with eyes, teeth and a tactile finish that felt just like real skin. In addition, he had a simulated injury that enhanced the feeling that this was a real casualty recovery situation. It was just like a training exercise we filmed for the Department of Health in 2008 involving the Casualties Union: in that case real people had special-effects injuries, whereas here 'Steve' had the simulated injuries. Bob and his team are exploring just how practical and valuable MR is when used in training.
Stop press: 24th July 2017 – Bob and the Birmingham University HIT team have just won a top academic health science award ‘in recognition of their innovative research and development into the use of Virtual and Mixed Reality technologies for civilian and military rehabilitation and for military defence medic training.’