For the past two years or so, Microsoft has been teasing the development of a new way to navigate mobile phones and tablets: 3D Touch.
The technology essentially detects the presence of one or more fingers hovering over a device’s screen and causes the operating system or open application to react. Moving fingers over the screen could potentially scroll content, such as a web page, and hovering a finger over a menu option could bring up multiple sub-menus or even video controls. Several reports from over a year ago even suggested that Microsoft was going to incorporate this technology into Windows Phone’s Live Tiles. In that case, finger movements would expand the Live Tiles to display options or sub-folders. In a way, it acts very similarly to how gestures work on Microsoft’s Xbox One video game console with the Kinect camera.
Microsoft was apparently planning to launch a Windows Phone, codenamed McLaren, that would use 3D Touch, but the plans were cancelled. The tech giant was, however, awarded the patent for hover-sensitive 3D Touch back in July 2015, which probably explains why Apple’s recent devices use pressure-sensitive 3D Touch (which requires physical contact with the screen and responds to different levels of pressure) instead, though that’s just speculation on our part.
For a while it looked like Microsoft had given up on using 3D Touch in its products, but this week Microsoft Research released a new video that demonstrates how the technology (now referred to as “Pre-Touch Sensing”) can be used and explicitly states that the company is still researching it for possible use in future products.
Most of the uses (see the video above) appear to be more refined versions of what we’ve heard before, but this video is also one of the first looks at Pre-Touch Sensing in use on what looks like a working smartphone.
What’s interesting in this video is that it also demonstrates new operating system elements that adapt depending on how many fingers are detected and how the device is being held. And while it’s impressive how many ways Pre-Touch Sensing can change the user experience, it admittedly also looks potentially confusing for users who simply want to touch the screen or point at something, only to be greeted with a plethora of menu pop-ups. A lot of user testing will definitely need to be done before this is implemented in a commercial product.
In a newly published paper, Microsoft principal researcher Ken Hinckley shared some of his thoughts on the evolution of touch screens. “It uses the hands as a window to the mind,” he says. “I think it has huge potential for the future of mobile interaction and I say this as one of the very first people to explore the possibilities of sensors on mobile phones, including the now ubiquitous capability to sense and auto-rotate the screen orientation.”
Would you like to see Pre-Touch Sensing used in a future Windows phone or even a Surface Pro or Surface Book? Share your thoughts with the community in the comments below.