Could ultrasound transform how we use our mobile devices?
'The future of touch is sound,' a recent Reuters article claims. But what does this mean exactly, and how might it impact the workplace?
According to the article, ultrasound technology - which we normally associate with unborn babies and cancer treatment - could transform the way we interact with our smartphones and tablets. Combine this with a new way of using ultrasonic waves that's currently being developed by a startup company in the UK, and things start to look even more interesting.
British startup Ultrahaptics is partnering with Jaguar Land Rover to develop invisible, air-based controls which drivers can feel and adjust - for example, to turn down the radio or adjust the car's temperature - using only hand motions.
The technology means that users don't actually have to touch anything. As Tom Carter, co-founder and CTO of Ultrahaptics explains, "the controls find you in the middle of the air and let you operate them."
This type of sensor technology isn't entirely new - think of Nintendo's Wii, for example - but this approach does take things a step further.
The implementation of ultrasound technology has the potential to move our interaction with devices from two dimensions to three. Some companies, such as Japanese startup Pixie Dust Technologies, are exploring how to combine mid-air haptics with tiny lasers to create visible holograms of the controls.
When applied to the workplace, this could result in the ability to display and manipulate computer data in mid-air, using only our hands.
Of course, we've been promised this sort of sci-fi technology before, not only in movies such as Iron Man and Minority Report, but by researchers as well. In Japan, Hiroyuki Shinoda - known as the father of mid-air haptics - has been developing the concept of ultrasound tactile displays since the '90s.
But while the idea is still a work in progress, a mid-air gesture interface that combines touch and visuals could well be appearing in offices of the future.