How you could control your world with just your fingertips

Image caption: High-frequency sound waves can give us the sensation of touch (image source: Ultrahaptics)

The QWERTY typewriter was introduced in the 1870s, and since then tapping on a keyboard or screen has become the standard way to interact with digital technology. But this isn't always convenient or safe, so new "touchless" ways to control machines are being developed.

Imagine being out for a jog, headphones on, and wanting to turn up the volume without breaking your stride. Or receiving a "new message" alert on your phone while driving and wanting to activate the text-to-speech function without taking your eye off the road.

These are scenarios where touchless control would come in handy.

"Today we interact with computers and devices not just at our desks but in a variety of different contexts - while on a run, on the subway, or in a car," explains Dr Sean Follmer, an expert in human computer interaction at Stanford University.

"With mobile computing devices like smartwatches or even, in the future, augmented reality glasses, we no longer have large surfaces on which to place keyboards or mice, so we need to create new input devices and technologies that can allow us to interact while we are on the go."

One of these technologies is radar.

Image caption: Google envisages being able to scroll through digital menus and make selections using finger gestures (image source: Google)

Most of us might associate radar with air traffic control and military defence - firing radio waves at an aircraft and measuring the time they take to bounce back can reveal location and speed.
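
The arithmetic behind that idea is straightforward. As a rough, hypothetical sketch (not how any real radar's signal processing is implemented, and with made-up timing numbers), the round-trip time of a reflected pulse gives the range, and the change in range between successive pulses gives the speed:

    # Minimal sketch: range and speed from radar round-trip times.
    # Illustrative only - real radar systems use much richer signal
    # processing (such as Doppler analysis) than this simple difference.

    SPEED_OF_LIGHT = 299_792_458.0  # metres per second

    def range_from_echo(round_trip_seconds: float) -> float:
        """Distance to the target: the pulse travels out and back, so halve it."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2

    # Two echoes measured one second apart (invented numbers)
    r1 = range_from_echo(100e-6)  # ~14,990 m away
    r2 = range_from_echo(99e-6)   # ~14,840 m away
    speed = (r1 - r2) / 1.0       # ~150 m/s towards the radar

    print(f"range: {r2:.0f} m, closing speed: {speed:.0f} m/s")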

Google's Project Soli has adapted the concept and effectively miniaturised it.

Mini-radars fitted into a range of devices, such as smartphones, kettles, radios, or car dashboards, could enable users to activate and control them using gestures alone.

The tech is so precise it can differentiate between the subtlest hand gestures, such as thumb and index finger rubbing or pinching.

Now the St Andrews Computer Human Interaction research group (SACHI) has honed the tech further, using machine learning to make it suitable for object recognition.

Prof Aaron Quigley, SACHI chair, says: "A major issue we had to solve is that the energy that comes back from the objects we want to track is a remarkably complex signal.

"We need to train our system to recognise the objects from these thousands of overlapping signals and we solve this using advanced AI [artificial intelligence] algorithms."

The mini-radar tech could be applied to recognising objects and materials, says lead researcher Hui Shyong Yeo, opening up the potential for assistive technology for blind and disabled people.

Recycling centres might be able to sense and process different types of materials automatically, he adds, or home security systems could detect if objects have been stolen.

Image caption: Virtual buttons that we can feel could be useful when driving or operating machinery (image source: Ultrahaptics)

Sound is also being trialled as a means of touchless gesture control.

Bristol-based Ultrahaptics uses ultrasound signals - sound waves at frequencies above the range of human hearing - to create the feeling of touch in mid-air.

This so-called haptics technology makes it feel like you're pressing a button or turning a dial when it's just your fingers experiencing highly targeted sound vibrations.
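
The focusing principle is, broadly, that of a phased array: each emitter's output is delayed so that all the sound waves arrive at the chosen point in step, reinforcing one another there and largely cancelling elsewhere. A minimal sketch of that geometry (with an invented emitter grid and focal point, not Ultrahaptics' hardware or algorithms) might look like this:

    # Broad-strokes sketch of phased-array focusing: compute per-emitter
    # delays so every ultrasound wavefront reaches the focal point together.
    # The 4x4 grid and focal point are invented for illustration.
    import math

    SPEED_OF_SOUND = 343.0  # metres per second in air

    emitters = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
    focal_point = (0.015, 0.015, 0.20)  # 20 cm above the centre of the grid

    travel_times = [math.dist(e, focal_point) / SPEED_OF_SOUND for e in emitters]

    # Fire the nearest emitters last, so all wavefronts arrive in phase.
    delays = [max(travel_times) - t for t in travel_times]

    for emitter, delay in zip(emitters, delays):
        print(f"emitter at {emitter}: delay {delay * 1e6:.2f} microseconds")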

One of the biggest challenges so far has been accessing enough computing power to keep up with the technological developments, says co-founder Dr Tom Carter.

"Our first prototype took 20 minutes to complete one computation on the most expensive PC we could buy - this meant that if you moved your hand you had to wait 20 minutes for the haptics to update," he says.

"Not exactly interactive!

"[The time delay now is] 10 microseconds on a very small, cheap processer like those you find in your mobile phone.".

Image caption: Ultrahaptics co-founder Tom Carter thinks touchless controls are useful in vehicles (image source: Tom Carter)

But why do we really need touchless controls?

One of the areas where he sees the tech taking off is inside vehicles.

"Touch screens increase driver distraction," says Dr Carter. "You cannot feel the controls so you have to take your eyes off the road and look.

"With our technology, users can perform gestures in the air and receive tactile feedback to let them know that the system has recognised what they have requested."

While Ultrahaptics uses sound to create physical sensations on the skin, sound can also be used in a similar way to radar - to detect gestures.

Elliptic Labs says its software can turn existing speakers and microphones into ultrasound sensors that then enable users to select a music playlist or take a selfie, say, using a simple mid-air hand gesture.

Image caption: Elliptic Labs chief executive Laila Danielsen thinks all speakers will be gesture sensitive (image source: Elliptic Labs)

"Our virtual smart sensor platform simply utilises the microphone and speaker already on a device to gather its ultrasound data," says chief executive Laila Danielsen.

Ultrasound signals can have a range of up to 5m (16ft) and can be generated with relatively little power, says Ms Danielsen. She thinks that within a few years every device with a speaker and a microphone will use ultrasound for at least one gesture.

"The most basic gesture a user can do is simply enter or leave a room," she says. "We expect appliances such as lights to turn on or off depending on a person's presence."

While consumers and the tech industry may currently be in thrall to voice control, thanks to the growth of virtual assistants and the increasing sophistication of speech-recognition software, voice has its limitations, argues Stanford University's Dr Follmer.

"Voice works well for entering text or making discrete - such as on/off - selections. However, for spatial or continuous control it can be complicated. Smoothly changing the volume could be done more easily with a virtual slider or knob."

Voice control isn't optimal if you're in the middle of a conversation, say, or in a meeting or a library.

Dr Follmer anticipates that gestural interaction will continue to grow, particularly in the field of augmented and virtual reality where a physical mouse or keyboard is less useful.

Touchless control would also be useful in hospitals to help prevent the spread of germs, he concludes.

The QWERTY keyboard - whether real or virtual - will always have a role to play, but touchless is undoubtedly on its way.