Gesturing the Future: Time to Redesign our Design Tools
Sometimes, the way we’ve been working for decades just feels right — even if it’s wrong.
The mouse and the keyboard as the default input system was ideal when most of our computer interactions were confined to text input and text output. (Remember the dreadful black screen with the blinking command prompt?) When computers got more powerful and design software entered the realm of 3D, the mouse and the keyboard hobbled along, despite ergonomic risks associated with this antiquated method.
But the emergence of touch screens, mobile devices, and — perhaps most important — gesture-aware hardware like Kinect suggests the dawn of a new era. In the near future, the way we inspect a detailed 3D assembly of a car or create an animation sequence may look a lot more like an exercise routine or a dance move.
In the video report below, you’ll see Autodesk research strategist Brian Pene remotely controlling an automotive design model (rendered in Autodesk Showcase) from a few feet away. It’s the same technology that lets you control a dancing avatar or a karate fighter from across the room in game titles like Dancemasters or Mortal Kombat (both available for Xbox 360 and Kinect).
Halfway around the world from Pene’s California home office, Russian developers from 3DiVi are also working to enable gesture-based computing, not just for desktop machines but for Android OS devices — all within the small power envelope of mobile hardware.
Cutting-edge, experimental technology usually costs a lot more, because the pioneers have to underwrite the cost of initial risk. But not so with gesture-based computing. The new usage paradigm can be powered by the Leap Motion controller, sold at Best Buy for less than $100.
My father was recently introduced to Windows 8 on a touch-enabled all-in-one consumer PC. Judging from the agility with which he has been poking at the YouTube shortcut and Skype app, I can tell he’s never going back to the mouse (which puts an added burden on his arthritic fingers).
Gesture-based computing is the focus of my feature article for the October issue of DE, called “Your Body is the Mouse; Your Hands are the Pointers.”
If you’re a design software developer working to incorporate Kinect-style gesture recognition, I’d like to hear about your work.
If you’re in the design and engineering profession, I’d like to hear your thoughts on how you feel gesture computing might improve the way you work.
Please leave me a message in the comments or contact me at kennethwong [at] deskeng.com