Multi-touch is to keyboards what the GUI was to command line
I remember when GUI OSes first came out (yes, I'm that old). DOS users like my dad dissed them because they could do things faster with the command line. Frankly, they could, but the GUI was clearly the way of the future.
I believe we are witnessing the same scenario play out with the inevitable Mac tablet. I just hope Apple still has the vision to push forward.
Unlike the GUI vs. command line war, where ease of use won out over speed, I think multi-touch can eventually beat the keyboard in both areas. Touchscreen input improves with better software, and I can already think of a butzillion ways to make it better.
If my analogy is off, give me a better one.
Comments
If my analogy is off, give me a better one.
Multi-touch is to mice what the GUI was to command line.
If my analogy is off, give me a better one.
In this case I think there are no perfect analogies. We're in new territory. It's pretty obvious that in 20 years most people won't have physical keyboards on their computers, even desktop computers (which will still have a place in the future). A customizable OLED desktop keyboard (like I've been saying, even before that Optimus mockup) is too cool and too powerful not to happen down the line. It will be very affordable to make, too. The mouse is still, and will be, a damn good tool for some time though, I think.
Multi-touch is to mice what the GUI was to command line.
Maybe a multi-touch trackpad is comparable to a mouse, but a multi-touch screen is way beyond the capabilities of a mouse.
What distinguished GUIs from the command line is that they relied on recognition rather than recall. Instead of the users needing to remember cryptic commands off the top of their head, GUI users are presented with recognizable objects that communicate their functionality. The pre-existing knowledge required of users was greatly reduced.
While it will definitely become commonplace, I see it as being used alongside keyboards, not replacing them. The tactile feedback of physical keys is incredibly valuable. This is because of the nature of the human brain and sensory input. No amount of technology will negate the benefits of tactile feedback. Touch screen interaction is useful, but non-optimal for certain tasks such as rapid text entry in an environment where dictation isn't possible.
Maybe a multi-touch trackpad is comparable to a mouse, but a multi-touch screen is way beyond the capabilities of a mouse.
I agree.
I guess I misunderstand your analogy. I see multi-touch as an improvement over the mouse in the way a user interacts with a computer, in the same way the GUI was an improvement over the command line.
Also, it is important to look at why the GUI was an improvement. As mentioned previously, it has to do with recognition vs. recall. Another advantage is bringing the physical-object metaphor into computing. The metaphor is valuable because it implies a set of valid interactions.
Multi-touch doesn't bring with it any such new metaphor or philosophy of interaction. Instead, it is simply a more convenient way of manipulating on-screen objects. The interaction is still pretty much the same, but more efficient. I wouldn't go so far as to call multi-touch more intuitive. It allows more expressive and complex interaction, but anything beyond basic multi-touch interactions is non-obvious and must be learned.
This isn't meant as a criticism of multi-touch, but rather as an honest analysis of its significance compared to the command-line/GUI paradigm shift.
How would you touch type on a flat surface with no distinguishing features between two keys? You'd have to look at the keyboard while you were typing, which is a major step back in usability.
To me that is what holds back multi-touch. How do you efficiently and easily enter text and data?
Entering commands and triggering tasks is easy with MT. Big icons and buttons are great for this.
Until Apple and others can find a way to minimize text entry or find another way to do it (voice recognition?) then I think conventional methods of interacting with computers will continue to flourish.
When you get down to it, the keyboard is a pretty good way to work with text. It's been around for a long time for a reason.
Multi-touch has no bearing on keyboard usage in my opinion. We should really be discussing the effects it will have on mouse usage, not keyboard usage...
By that reasoning text-entry is holding back mouse usage as well. How do you efficiently enter data and text with a mouse?
Keyboards and mice go together.
Many implementations of MT forego the keyboard, MS Surface and the iPhone being the examples that come to mind.
I know that MS and HP are pushing MT with keyboards. Who knows how that will turn out?
Keyboards and mice don't inherently go together, and multi-touch isn't inherently keyboardless. The iPhone relies heavily on a virtual keyboard, and Surface isn't terribly useful at the moment.
Basically, I'm rejecting the notion that multi-touch is a replacement for the keyboard.
Who knows how it will turn out? Well nobody is certain but there have been a couple decades of research and commercial products relying on touch screen interfaces. The tradeoffs are pretty well known. Touch screen interfaces didn't sneak up on us. Rather, they were sitting there waiting for someone to do something actually useful with them at an affordable price. Those actually useful things aren't generally expected to involve alpha-numeric data entry.
The one exception to this is devices that are too small to have a physical keyboard without sacrificing other things like screen size. For the most part this is limited to pocket computers (like the iPhone).
What do I see as the future of keyboards? Physical keys with embedded displays so that the labels can be customized for the context of the task at hand. This combined with a multi-touch screen, a mouse and voice interaction, is how I envision computing for the next few decades if not forever.
Some things evolve to a certain point and then stop. My favorite examples? Doorknobs and door handles. They haven't changed in over a thousand years.
Keyboards and mice don't inherently go together and multi-touch isn't inherently keyboardless.
Keyboards and mice do go together in the GUI paradigm. Without the mouse we're back at the command line.
I agree that multi-touch isn't inherently keyboardless, but that does seem to be where it's heading at the moment, IMO. The iPhone has a virtual keyboard, which I don't count as a keyboard per se. One implementation of MT and keyboards is the HP touchscreen PC. I've seen it and I'm pretty unimpressed. It doesn't seem to offer any advantages over the classic GUI interface. YMMV.
Give it 5 years for multi-touch on the desktop to really mature... Hopefully by then the keyboard is just a blank slate on which you do gestures and "type", with tactile/haptic as well as visual feedback, and there is no need for a separate mouse; say the right-hand side (or left) is the space where you "mouse". Only Apple can pull it off within 5 years, but hopefully when the economy turns around fully in a few years, more money can start to go back into R&D, so maybe a small company will come in and "revolutionize" desktop input... Maybe in 5 years there are no more desktops, just tablets that dock into screens, and these screens have fold-out slates for multi-touch tactile/haptic input, etc.
Keep in mind the desktop will always be around, but it will likely be quite a bit smaller a market than the portable, given that costs will come down as the years go by and usability on portable computers will go up.
Wet blanket mode:
Basically, I'm rejecting the notion that multi-touch is a replacement for the keyboard.
Well it really can't be for data entry. Voice recognition is the keyboard replacement. Of course, folks have been saying that since the '80s, and it's been consistently 5 years away for the last 25 years.
Voice coupled with multitouch can replace keyboard and mouse. Even if you stay with the WIMP metaphor. When? Oh, 5 years.
What do I see as the future of keyboards? Physical keys with embedded displays so that the labels can be customized for the context of the task at hand. This combined with a multi-touch screen, a mouse and voice interaction, is how I envision computing for the next few decades if not forever.
Nah. Why? You can have direct manipulation on a small MT display on the keyboard for task actions. Customized labels on physical keys are a concept that faded along with keyboard templates on function keys. As you say, the biggest advantage of the keyboard vs. other input devices is the data-entry part.
Specialized function keys are so far away from the home row that most folks have to look at them anyway to hit the right one. So looking at a trackpad display isn't any more onerous and you can still have sufficient physical cues to hit certain portions with accuracy without looking.
Or perhaps we're saying the same thing? I'm thinking not so much on the Optimus keyboards as much as trackpad displays.
Some things evolve to a certain point and then stop. My favorite examples? Doorknobs and door handles. They haven't changed in over a thousand years.
Well yes and no. Notice any doorknobs or door handles on automated sliding doors? Yes, they haven't replaced traditional doors but the interaction is different.
I wouldn't be so optimistic about voice interfaces. A great deal of computer usage occurs in an environment in which it isn't desirable. I suppose we could enclose all office cubicles so that work places don't become noisily chaotic. Also, there are many times when you don't want to be overheard during data input. The obvious reason is privacy but there are others. For instance, customer service phone staff needs to be able to interact with their computer while not interfering with a human conversation.
As for customizable key labels on physical keyboards: the only reason they aren't popular is cost, not because they're a bad idea. Putting a display on each key is prohibitively expensive (right now).
Similarly, GUI dropdown menus are the reason why keyboard templates/overlays disappeared. Overlays in the top desk drawer were inconvenient. It was more convenient to put application specific commands in GUI menus. These never get lost and make switching between multiple applications much easier.
And the door-knob analogy...
Automatic doors? They haven't changed in a thousand years either. They're analogous to gates operated from gate houses. There weren't any handles/knobs on these doors either.
While the analogy is somewhat stretched, it was meant to demonstrate that human characteristics and limitations define the optimal human interface for some tools or devices, no matter how advanced technology gets.
I only kid slightly, you know; all too often when I'm working, my GUI is just used as a window manager to tile my terminals. If I didn't have the full GUI, screen would work nearly as well :-p
Now, *displaying* my work, on the other hand: GUI all the way!
I guess my point here is that both definitely still have their place
I wouldn't be so optimistic about voice interfaces.
I'm sarcastically optimistic about voice interfaces. I think it will perpetually be 5 years away.
A great deal of computer usage occurs in an environment in which it isn't desirable. I suppose we could enclose all office cubicles so that work places don't become noisily chaotic. Also, there are many times when you don't want to be overheard during data input. The obvious reason is privacy but there are others. For instance, customer service phone staff needs to be able to interact with their computer while not interfering with a human conversation.
Sure, but the lack of tactile reference and feedback using virtual (multitouch) keyboards is a hindrance to data input speeds. Is it solvable? Sure, with some kind of surface that can deform to form key ridges.
I don't have the bookmark handy for the paper on multitouch keyboards vs. physical keyboards, but the error rates go up and the typing speed drops.
As for customizable key labels on physical keyboards: the only reason they aren't popular is cost, not because they're a bad idea. Putting a display on each key is prohibitively expensive (right now).
Yes, it is a function of cost vs. utility. It's not a bad idea at low cost... but the low-cost/low-tech solution was paper templates on the function keys, which used to be popular. If the utility were high, we'd still have those.
The ZBoard gaming keyboard is another lower cost/lower tech alternative to the Optimus for gaming that didn't pan out.
Similarly, GUI dropdown menus are the reason why keyboard templates/overlays disappeared. Overlays in the top desk drawer were inconvenient. It was more convenient to put application specific commands in GUI menus. These never get lost and make switching between multiple applications much easier.
True.
While the analogy is somewhat stretched, it was meant to demonstrate that human characteristics and limitations define the optimal human interface for some tools or devices, no matter how advanced technology gets.
Sure. Technology changes a whole lot faster than human evolution.