A partnership to demonstrate the art of the possible in the world of robotic simulation.


The three-way collaborative project will use new and emerging technologies at the forefront of the field, and brings together an extremely experienced team bridging key areas of games/software development and healthcare. It enables Iceberg Creative (formerly Dorset Creative) to work in partnership with the renowned Orthopaedic Research Institute (ORI) at Bournemouth University and highly acclaimed professors from BU’s Creative Technology Department. Over the next three years, the team will develop and evaluate a prototype simulator of hip surgery, using a robotic arm, for training purposes.

Robotics is revolutionising medicine. In total hip replacement (THR) surgery, robotic-arm assistance has transformed the accuracy of the intervention, producing significant improvements in patient-reported outcomes. But highly qualified THR surgeons are often unfamiliar with robotics technology, and access to training is expensive and sporadic.

This year, the UK is expecting around 143,000 total hip replacement surgeries. We currently have about 1,780 active orthopaedic surgeons. About 420 of these perform fewer than 50 THR surgeries a year, leaving the rest needing to perform about 100 surgeries a year each to cover the remaining workload. With an ageing population, we are in dire need of more THR surgeons in the UK (globally, it’s a similar story). Yet it currently takes nine years and approximately £560k to train someone to be qualified to perform a THR surgery, and more for robotic-assisted surgery. The time is lengthy because it takes practice, requiring: 1. repeated opportunities with non-complex patient cases; 2. an expert surgeon and a novice available at the same time and location each time; and 3. the cost of prepping the theatre for a ‘teaching’ surgery session. In robotic-arm-assisted surgery, highly expensive equipment sits in ‘downtime’ during teaching, while patient waiting lists grow ever longer and NHS budgets are already stretched.
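
As a back-of-envelope check of the figures above (all numbers are the approximate UK figures quoted in the text, with the low-volume group assumed to perform at most 50 THRs each):

```python
# Rough workload check using the approximate figures cited above.
total_surgeries = 143_000      # expected UK THR surgeries this year
total_surgeons = 1_780         # active orthopaedic surgeons
low_volume_surgeons = 420      # those performing fewer than 50 THRs a year
low_volume_cap = 50            # assumption: they perform at most 50 each

remaining_surgeons = total_surgeons - low_volume_surgeons
remaining_workload = total_surgeries - low_volume_surgeons * low_volume_cap
per_surgeon = remaining_workload / remaining_surgeons

print(remaining_surgeons)   # 1360
print(remaining_workload)   # 122000
print(round(per_surgeon))   # 90, i.e. roughly 100 THRs per surgeon per year
```

The result (around 90 surgeries per remaining surgeon per year) is consistent with the "about 100" quoted above.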

Immersive simulators deliver a high return on investment for training because they allow widespread access and unlimited repetition. Research demonstrates that VR and other immersive training for surgery produce better-quality learning: heightened knowledge retention and increased confidence in self-perceived ability. But progress on surgical-grade immersive training has come unstuck against the limitations of existing technology, specifically the ability of graphics and haptics to keep up with what is required for a surgeon to be ‘fooled’ into believing the simulator is real enough to forget they are in a simulation. The behaviour of skin is difficult to simulate, and blood and other liquids demand intensive special-effects work that only produces a convincing result when the user cannot interact with it. Haptics are nowhere near advanced enough yet to create life-like simulations of the forces used in orthopaedics. Any transfer from simulator training to real surgery, where a patient’s life is at stake, needs to be seamless; the surgeon must feel as confident performing in reality as they did when ‘qualifying’ on the simulator. Flip that around and, essentially, surgeons need to be fooled that reality is just the same as “the game”.

Cue Iceberg Creative’s insatiable technical curiosity and lateral thinking! The need for more robotics-qualified surgeons is desperate. We must use the existing technology, with its limitations, to create a product which simply works, demonstrating that it is already possible to qualify surgeons in robotics for THR. By carefully studying the surgical procedure (we sent our technologists to observe live surgeries and have scrutinised many gory videos!), Iceberg Creative has created an immersive training simulator for orthopaedics which can absolutely create an illusion for the surgeon.

In early 2019, Iceberg Creative noticed that in robotic-assisted THR surgery, after dislocation of the joint and removal of the degenerated femoral head (stages no different from manual THR surgery, and surgical skills in which a surgeon is already expert), the surgeon’s view is focussed primarily on either medical imaging, such as CT scan data, or the anatomical landmarks of the exposed bone. Both of these visual inputs are straightforward to recreate in photo-realistic modelled computer graphics. The surgeon uses robotics hardware such as a probe, which is easy hardware to recreate for authentic touch, giving rise to kinaesthetic memory. Rather than teaching surgical techniques a surgeon is already familiar with (where current immersive surgical training fails), we concentrated on teaching only the parts involving assistance by the robot, so that we don’t break the illusion with distracting unrealisms while still giving a fully immersive experience.

In THR with assistive robotics, the surgeon places optical markers on the hip for the robot’s image guidance. Without very high accuracy here, the surgery thereafter is inaccurate, compromising every subsequent stage of the operation and leading to heavily compromised post-operative patient outcomes (and, likely, a need for revision surgery). This landmark placement is the vital part of training for robotic surgery, and our simulator exactly mimics what the surgeon experiences in real surgery.
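
To give a feel for why marker accuracy matters so much, here is a purely illustrative calculation (the 100 mm baseline and 2 mm displacement are hypothetical numbers, not measurements from the system): a small placement error in one marker translates directly into an angular error in the robot’s frame of reference.

```python
import math

# Hypothetical values, for illustration only: two optical markers 100 mm
# apart, with one marker displaced 2 mm from its intended position.
baseline_mm = 100.0
displacement_mm = 2.0

# Angular error induced in the registered frame.
angular_error_deg = math.degrees(math.atan2(displacement_mm, baseline_mm))
print(f"{angular_error_deg:.2f} degrees")  # ~1.15 degrees
```

Every downstream cut and implant placement inherits that angular error, which is why this step dominates the training need.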

To create this technology, we needed to capture the visual elements of the simulation, such as correct and incorrect placement and selection of markers, including orientation, location and depth. We also needed to understand the geometry of the markers and the bone, the typical posture adopted by the surgeon, tool changes, and the overall workflow. We needed to record real surgery taking place, giving us these reference points, multi-sensory information and an understanding of the process. We created a recording system using a pair of Kinect sensors, designed not to interfere with the robot’s infrared (IR) navigation; sessions were captured in Kinect Studio and rendered in Unity to produce a 3D reconstruction. This succeeded in calibrating and detecting tool marker points, giving us the data we needed.
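
The production pipeline used Kinect Studio and Unity; as a language-agnostic sketch of the calibration step, the two Kinects’ coordinate frames can be aligned from corresponding marker points with the standard Kabsch algorithm (shown here in Python/NumPy for illustration, not the project’s actual C# code):

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rigid transform (R, t) mapping point set P onto Q."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)              # cross-covariance of centred sets
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic check: recover a known rotation about z plus a translation.
rng = np.random.default_rng(0)
P = rng.random((6, 3))                     # marker points seen by Kinect A
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.05])
Q = P @ R_true.T + t_true                  # same points seen by Kinect B

R, t = kabsch(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

With the transform recovered, both sensors’ observations can be expressed in a single frame before rendering in Unity.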

The simulator had to differentiate trainee capability, from novice to expert; otherwise the speed of qualification to ‘certified competence’ would not change and the sector’s problem would remain. Our simulator trains the surgeon at the right level using AI (eye tracking, hand tracking and interaction analysis), speeding up the overall training process and, more essentially, working as an objective examination tool.
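
One simple, objective signal of the kind such tracking makes possible (an illustrative example, not the project’s actual assessment model) is tool-tip path length: experts typically complete a task with shorter, more economical motion than novices.

```python
import math

def path_length(points):
    """Total distance travelled by the tracked tool tip (sequence of 3D points)."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

# Hypothetical trajectories for the same task.
novice = [(0, 0, 0), (1, 2, 0), (0, 1, 1), (2, 2, 2), (3, 3, 3)]
expert = [(0, 0, 0), (1, 1, 1), (2, 2, 2), (3, 3, 3)]

print(path_length(expert) < path_length(novice))  # True
```

Metrics like this, combined with gaze and interaction data, can be scored automatically and repeatedly, which is what makes objective examination feasible.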

For software, we are using Unity3D and C#. The hardware splits into: 1. visualisation technology, a VR headset for immersion (a Lenovo Explorer, a Windows Mixed Reality headset chosen because it works with low-specification graphics cards), which captures the user’s head orientation; and 2. interaction technology, a Phantom Omni for tool position and haptic feedback plus a Leap Motion controller for natural hand interaction, letting the user interact with tools in the virtual space and receive force feedback so a believable illusion can occur. A Windows 10 PC with a minimum of 8 GB of RAM handles the processing, as it supports Windows Mixed Reality. We set to work in June 2019 and will complete the simulator during the first half of 2020.