Multi-touch is definitely one of the exciting technologies out there. Here is a fascinating video showing the multi-touch capabilities in SpaceClaim. Today a bunch of CAD bloggers (including myself) were sent emails pointing to this video showing something called 3D head-tracking. Basically, you navigate around a 3D model by moving your head. And since your head is connected to your torso by virtue of something called a neck, you end up waltzing around your computer while your butt is planted firmly in your chair. Sean Dotson joked on Twitter: “head tracking: yeah, but one sneeze and your model is destroyed.”
Personally, I find all of this quite interesting. Obviously multi-touch is great for tablet PCs, where you are expected to move your hands around anyway. The SpaceClaim video actually shows geometry creation, not just 3D navigation. But this 3D head-tracking thing does sound (and look) a bit crazy. The thing is, most of these technologies, in one way or another, try to replace our good old mouse. And while doing so, we are expected to move our hands (or, in the case of head-tracking, our bodies) in a manner that we are not used to or comfortable with.
The thing with technology, especially this kind, is that it can be applied in many ways to many things. Using it for 3D navigation on a 2D computer screen may not be the best application. Imagine a room full of people gyrating around their computers while seated. That would give the term "back-breaking work" a whole new meaning. But using this in settings where the user is standing may make more sense. We are already seeing a lot of touch and multi-touch technologies used in TV newsrooms.
I am not sure why, but a lot of these technologies start by trying to kill the need for the mouse. And maybe one of them will actually end up doing that. But here is what I think. The mouse is merely a tool to point to a certain area of interest on the computer screen. Touch screens are the same. The same can be said about this head-tracking stuff. So when we try to replace the mouse, we are actually trying to find a better tool to do the same thing – point to a certain area of interest. But we are already doing that with our eyes. I mean, even before we move the mouse, our hands, or whatever contraption we are using to point, we have already moved our eyes to look at that particular area of interest on the screen. So if the computer could directly track our eyes, there would be no need for the tool at all. Both hands could stay on the keyboard (or whatever contraption replaces it) and mouse clicks could be achieved by pressing a key, quite similar to the left and right buttons on laptop trackpads. I am tempted to suggest that a blink of an eye could serve as a click. But that would require you to control your blinking until you have found something on the screen to click on, which would be nothing short of sheer physical and mental torture.
Using our eyes as a pointing device may sound like a crazy idea. But it is already being done, mainly for disabled users (read this). Of all the mouse-killer technologies out there, I am most interested in eye-tracking, mainly for the same reason we all love the mouse so much: it allows us to work comfortably. Of course, using eye-tracking with multi-touch applications would require me to be a highly accomplished cross-eyed human being. But I guess I can use my hands for those.
So for people looking to kill the mouse, the answer may very well be “looking”.