While searching for something entirely different, I came across an article about the future of interaction design. It was written in November 2011 and it still holds up fairly well. In fact, I am becoming a “finger sliding” person myself. Even while typing this, I am using the capacity of my hands far less than when I write with a good old pen. And yes, typing is faster (typewriters have been around for ages, hence the QWERTY keyboard layout), but oops, I am losing my grip here!
There is much to say about this, yet this post is more of a reminder to myself about the article and an inspiration for my future work in the field of interaction design. Click on the link and enjoy.
The next time you make breakfast, pay attention to the exquisitely intricate choreography of opening cupboards and pouring the milk — notice how your limbs move in space, how effortlessly you use your weight and balance. The only reason your mind doesn’t explode every morning from the sheer awesomeness of your balletic achievement is that everyone else in the world can do this as well.
With an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?
Right after I posted this, my Facebook wall was invaded by news of Google’s radar-based wearable, Project Soli. As TNW put it, “Project Soli wants to make your hands and fingers the only user interface you’ll ever need.” TechCrunch described it in more detail: “What it does is let you control devices using natural hand motions, detecting incredibly fine motions accurately and precisely, even through materials (you could install the sensor beneath a table, for instance). It does this using radar (I’ll get to that later) and allows you to manipulate tiny or huge displays with equal accuracy, thanks to the lack of a need for touch point sizing constraints. Haptic feedback is included, since your hand naturally provides it itself – your own skin offers friction when you touch fingertip to fingertip. Soli then is designed to reimagine your hand as its own user interface.”
Yet again, I am not very impressed. I do not want my hands and fingers to be the only user interface, or rubbing my thumb against my index finger to be the only interaction I have. In fact, an interface is defined as “a boundary across which two systems communicate,” so reimagining my hand as its own user interface just does not click for me.
My interest still lies in tangible, physical interactions, so even though Soli may open up new areas and offer a better approach than the “finger on screen” interaction, I cannot help feeling it is just replacing “screen” with “air” or “other fingers.” That being said, with further exploration it could turn into something interesting, but it just is not there yet for me.
Let’s see what the future has in store for us ;)