(Reuters Video) Eye-tracking wheelchair helps the severely disabled steer new course - "move around simply by looking to where they wish to travel" - [2:27]
http://www.reuters.com/video/2014/07/02 ... Innovation
https://www.youtube.com/watch?v=ZckE7FMWrw8
Wednesday, Jul 02, 2014 - 02:26
Scientists in London have developed an algorithm-based decoder system that enables wheelchair users to move around simply by looking to where they wish to travel.
The researchers at Imperial College London say the system is inexpensive and easy to use and could transform the lives of people who are unable to use their limbs.
Jim Drury has more.
Algorithms working with inexpensive software could help quadriplegics steer wheelchairs simply by looking in their desired direction of travel.
An Imperial College London team says their newly devised system can read eye movements to tell if a person is merely gazing or wants to move.
Co-designer and student Kirubin Pillay says it's simple to use.
SOUNDBITE (English) KIRUBIN PILLAY, STUDENT, IMPERIAL COLLEGE LONDON, SAYING: "At the moment I'm just moving forward by looking to the floor, but exactly at points on the floor that I would like to go to, and the wheelchair is responding. So if I look right, slightly towards Will, I'll move there, and if I look left as well I'll move there as well; it just responds to my gaze and my desired location that I would like to go to."
Visual information detected by cameras trained on both eyes is analysed by algorithms within 10 milliseconds and translated into instructions for movement almost instantaneously, says researcher William Abbott.
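Abbott's description amounts to a real-time loop: estimate the point on the floor the user is looking at, then translate that point into drive commands. The team's own decoder is unpublished (see below), so the following Python sketch is only a generic illustration with assumed names and speed limits: it maps a gaze target expressed in the wheelchair's frame to forward and turning speeds for a differential-drive base.

# A minimal sketch, NOT the Imperial team's decoder (whose details are unpublished):
# turn a gaze point on the floor ahead of the wheelchair into a drive command.
# All names, limits, and scaling factors here are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class DriveCommand:
    linear_speed: float   # metres per second
    angular_speed: float  # radians per second

def gaze_to_command(gaze_x: float, gaze_y: float,
                    max_linear: float = 0.8,
                    max_angular: float = 1.0) -> DriveCommand:
    """Convert a gaze target (metres, wheelchair frame: x forward, y left)
    into forward and turning speeds for a differential-drive wheelchair."""
    heading = math.atan2(gaze_y, gaze_x)    # angle from straight ahead to the gazed point
    distance = math.hypot(gaze_x, gaze_y)   # how far away the gazed point is
    # Turn harder the further off-axis the target is, clamped to the chair's limit.
    angular = max(-max_angular, min(max_angular, 2.0 * heading))
    # Drive faster for distant targets, and slow down while turning sharply.
    linear = max_linear * min(distance / 2.0, 1.0) * max(0.0, math.cos(heading))
    return DriveCommand(linear, angular)

if __name__ == "__main__":
    # Looking roughly 2 m ahead and slightly to the right.
    print(gaze_to_command(2.0, -0.3))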
SOUNDBITE (English) WILLIAM ABBOTT, RESEARCHER, IMPERIAL COLLEGE LONDON, SAYING: "We actually move our eyes upwards of three times a second, so there's huge information there. So essentially we track the pupil of the eye and, via a calibration process, we relate that to where the subject's looking in the world around them."
Patients with severe paralysis from multiple sclerosis or spinal cord injury are usually still able to move their eyes, because the eyes are connected directly to the brain.
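The calibration step Abbott mentions, relating the tracked pupil to points in the world, is commonly handled in eye tracking by having the user fixate a handful of known targets and fitting a regression from pupil coordinates to gaze coordinates. The Python sketch below shows one such generic approach, a second-order polynomial fitted by least squares; it is not the Imperial team's own method, and all names and numbers are illustrative.

# A minimal sketch of a standard eye-tracking calibration (not the team's method):
# the user fixates a few known targets, and a least-squares fit maps pupil-centre
# pixel coordinates to gaze coordinates.
import numpy as np

def design_matrix(pupil_xy: np.ndarray) -> np.ndarray:
    """Second-order polynomial features of the pupil centre (px, py)."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def calibrate(pupil_xy: np.ndarray, target_xy: np.ndarray) -> np.ndarray:
    """Fit coefficients mapping pupil positions to known fixation targets."""
    A = design_matrix(pupil_xy)
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs  # shape (6, 2): one column for gaze x, one for gaze y

def pupil_to_gaze(pupil_xy: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """Map new pupil positions to estimated gaze points."""
    return design_matrix(pupil_xy) @ coeffs

if __name__ == "__main__":
    # Nine fixation targets on a 3x3 grid and synthetic pupil readings (fake pixels).
    targets = np.array([[x, y] for y in (0.0, 0.5, 1.0) for x in (0.0, 0.5, 1.0)])
    pupils = 40.0 * targets + np.array([320.0, 240.0])
    C = calibrate(pupils, targets)
    print(pupil_to_gaze(np.array([[340.0, 250.0]]), C))  # roughly [[0.5, 0.25]]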
For now, the team is keeping details of its decoding technology secret, but Pillay says it's an improvement on existing eye tracking systems.
SOUNDBITE (English) KIRUBIN PILLAY, PHD STUDENT, IMPERIAL COLLEGE LONDON, SAYING: "Current tracking software often uses a screen-based system where you have a screen open and you look at locations on the screen.
The problem with that is that it's very simplistic and also diverts the users' attention from the outside world, and therefore there's more risk of not noticing obstacles or other things in the way."
While the technology has been designed for the disabled, team leader Dr Aldo Faisal, from Imperial's Brain and Behaviour Lab, says it has much wider application.
SOUNDBITE (English) DR ALDO FAISAL, PROJECT LEADER, IMPERIAL COLLEGE LONDON, SAYING: "You could use it maybe one day to drive your car, you could use it to operate a robot, you may be able to use it to fly planes or drones or spaceships with this type of technology."
Tests on able-bodied volunteers found they steered through crowded buildings faster and with fewer mistakes than when using other eye tracking technologies.
Trials on disabled patients are about to start, and the team hopes its system could be commercially available within three years.