
Pioneering Research Programmes

TECHNOLOGY FUTURES | 8 February 2017


Dr Karen Rafferty
Deputy Head of School | School of Electronics, Electrical Engineering and Computer Science

You are walking across the floor of a huge factory. In front of you there is an array of robots working on a production line. You look around, studying the rest of the space – and then you take the headset off.

Now you are in a white room on the top floor of the Ashby at Queen's University. This is the home of virtual reality (VR) and robotics research within the Energy, Power and Intelligent Control Cluster (EPIC) at the School of Electronics, Electrical Engineering and Computer Science. And just to reinforce it, there are even a
couple of large Iron Man lookalikes standing to attention against the wall.

The fourth industrial revolution

Robotics, intelligent control and virtual reality make up one of the UK’s highest strategic priority areas and are a focus for Industry 4.0 – what’s being called the fourth industrial revolution. This is the design of the ‘smart factories’ of the future, where cyber-physical systems monitor processes and make decisions based on a vast array of sensors.

It is also the target area for one of the new Pioneer Research Programmes (PRP) at Queen’s – Intelligent Autonomous Manufacturing Systems – led by Professor Seán McLoone and in which Dr Karen Rafferty, Deputy Head of School, is playing a key role.

She says, ‘You can use VR to create optimised factory designs, working out whether everything’s positioned properly, maximising efficiency, planning for expansion and rehearsing the response to every conceivable problem. But VR and its applications can help us make all sorts of great advances in the way we live and work.’

A Queen’s graduate, Karen came to the world of virtual and augmented reality over time.

Her PhD research project, sponsored by the Civil Aviation Authority, was in the simulation and performance assessment of airport landing lighting, using what's known as environmental sensing – extracting useful information from the environment.

When she became a Queen's lecturer, her interest moved towards VR – ‘but that’s also about environmental sensing. With VR you’re immersed in a virtual environment. The visuals are very impressive – like a computer game – but we also want to be able to replicate touch, and even taste and smell are on the menu. That’s when I got into haptics – being able to feel and interact, which adds to the sense of realism.’

As well as working towards the objectives of the PRP, Karen sees many other exciting applications. In the medical world, she and her colleagues are developing simulators for training doctors to carry out keyhole surgery and maximise the benefits of novel hi-tech treatments. They are working with surgeons at the Royal Belfast Hospital for Sick Children and with ophthalmologists on techniques for eye surgery.

And there are opportunities to link in robotics. ‘A surgeon might have a very slight hand tremor, which might cause a problem with a certain task, but robots are very precise and you could pass the task to them – although I'm not sure we’ll ever get to the stage where robots will take over completely. And of course there is a body of research that wonders – if we train surgeons using VR and robotics, is there a danger that they may become detached from the responsibility of making mistakes?’

Within the PRP, Karen is researching autonomous intelligent decision-making. ‘What can we use a robot for that would be better than using a human being? We’re trying to figure out – can robots learn and observe and, essentially, think?’

In VR, she says, there are new opportunities all the time. She and her team are developing assisted living devices, aids that will help older people and others to remain in their homes for longer, rather than going into specialist care.

‘There are a lot of gains to be had. People can be trained in it, learn in it and have fun in it. There is also huge potential for using it to create empathy and understanding, where you can step into someone else’s shoes.

‘And just imagine walking through the park with a small wireless head-mounted display that tells you all about the plant or the tree that you’re looking at. It just gets better and better.’
