A future without a keyboard and mouse

Posted in Future Apple Hardware, edited January 2014
The keyboard and mouse have been the standard input methods for as long as computers have been around, but I have never felt they were very natural. When you communicate with another person face to face, imagine how slow it would be if you had to type every phrase.

They clutter up workspaces when you need papers on your desk. They constrain the form factors of devices like laptops. People have to be taught how to use them.

The capacitive touch screen that the iPhone popularised made it clear that computers can exist perfectly well without hardware keyboards and mice, and now there's a new device that can take this further:

https://live.leapmotion.com/about/

[VIDEO]

It creates a 3D virtual space accurate to 1/100th of a millimetre.

This allows full gestures: you would be able to type on a virtual keyboard (good for multi-language layouts), draw with pressure sensitivity, and so on, all with just a tiny device.

It doesn't have to get rid of tactile devices, but none of them would have to be connected. Imagine a gamepad with no electronics inside, just dummy buttons; the 3D sensor would know which button you press. The same goes for a keyboard. An iMac could have the sensor fitted on the front and you would just sit a dummy keyboard on the table. It would significantly cut the cost of Apple's wireless keyboards and remove the wake-up delay over Bluetooth.
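
Here's a rough sketch in Python of how that could work: the sensor resolves a fingertip position against a dummy keyboard whose key positions are known in advance. The layout, sizes, and function names are all made up for illustration, not from any actual SDK.

[CODE]
# Hypothetical sketch: resolving a fingertip position against a dummy
# keyboard with no electronics, only known key positions.
# Coordinates are in millimetres within the sensor's 3D volume.

KEY_SIZE = 16.0  # assumed key pitch in mm

# Assumed layout: key name -> (x, y) of the key centre on the desk plane
DUMMY_KEYS = {
    "A": (10.0, 40.0),
    "S": (26.0, 40.0),
    "D": (42.0, 40.0),
    # ...rest of the layout
}

def key_under_finger(x, y, z, press_depth=2.0):
    """Return the key being pressed, or None if the finger is hovering.

    z is the fingertip height above the desk in mm; a press is assumed
    to be anything within press_depth of the surface.
    """
    if z > press_depth:
        return None  # still hovering, no press registered
    for key, (kx, ky) in DUMMY_KEYS.items():
        if abs(x - kx) <= KEY_SIZE / 2 and abs(y - ky) <= KEY_SIZE / 2:
            return key
    return None

print(key_under_finger(25.0, 41.0, 1.0))  # -> S
[/CODE]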

This could add to the iPad's abilities too, as it could sense your intent before you touch the screen. It could handle hover gestures in browsers and, for most things, save you from smudging up the screen.

If you prefer tactile feedback, they could print a keyboard pattern on the Smart Cover and you would just tap on it for the sensor to know which keys you are pressing. Alternatively, a virtual keyboard could show your finger locations with selection states as you hover your hands around, letting you see which key you are about to press; a poke gesture activates it, and if you make a mistake, a swipe gesture deletes the character.
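
As a sketch of how the poke and swipe could be told apart, here is a toy classifier working on fingertip movement between two sensor frames; all the thresholds are invented for illustration.

[CODE]
# Hypothetical sketch: separating a "poke" (press the hovered key) from
# a "swipe" (delete the last character) using fingertip motion between
# two frames. All thresholds are made up for illustration.

def classify_gesture(dx, dy, dz):
    """dx/dy are lateral movement in mm, dz is downward movement in mm."""
    if dz > 5.0 and abs(dx) < 3.0 and abs(dy) < 3.0:
        return "poke"   # mostly downward: activate the hovered key
    if abs(dx) > 20.0 and dz < 2.0:
        return "swipe"  # mostly lateral: delete the last character
    return "hover"      # anything else: keep showing the highlight

print(classify_gesture(0.5, 1.0, 8.0))   # -> poke
print(classify_gesture(35.0, 0.0, 0.5))  # -> swipe
[/CODE]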

Apple seems to be wondering how to spend its hoard of cash; I'd say this technology would be a good start. It could make Macs as fun to use as iOS devices.

Comments

  • Reply 1 of 7
    Marvin · Posts: 15,486 · moderator
    There are some other videos showing real usage on laptops and desktops:

    [VIDEO]

    [VIDEO]

    [VIDEO]

    The obvious flaw people are quick to point out is having to hold your hands up while moving things, but if the 3D volume were positioned over a surface like a laptop base, it would just act like a giant trackpad. Apple could perhaps use its laser perforation on the surface, with an array of LEDs behind a sheet of e-ink-like material. A user could assign a pattern to the e-ink sheet and it wouldn't need to be refreshed constantly.

    Although you lose the tactile feedback, you can put anything on the surface. The Adobe suite can have its palettes on it; games can show their arrays of icons. The sensor can do predictive text too: it can tell which key you are about to press and show that character on screen before you press it, so you know you are on the right key.
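
    As a sketch, that preview only needs a nearest-key lookup on the hovering fingertip. The key positions here are hypothetical, just to show the idea.

[CODE]
# Hypothetical sketch: previewing the key a hovering finger is nearest
# to, so the character can be shown before it is actually pressed.

import math

KEYS = {"Q": (0, 0), "W": (19, 0), "E": (38, 0)}  # assumed centres, mm

def predicted_key(x, y):
    """Return the key whose centre is nearest the hovering fingertip."""
    return min(KEYS, key=lambda k: math.hypot(x - KEYS[k][0],
                                              y - KEYS[k][1]))

print(predicted_key(17.0, 2.0))  # -> W, shown on screen before the press
[/CODE]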

    [URL=http://forums.appleinsider.com/image/id/165320/width/600/height/446][IMG]http://forums.appleinsider.com/image/id/165320/width/600/height/446[/IMG][/URL]

    The big advantage tactile devices have always had is that you could touch the device and then activate it. With the 3D sensor, you get those two states over any surface, and many more besides, since you have continuous depth. You would be able to draw with your finger hovering over the surface of the laptop base, just like on a Wacom tablet. A Wacom pen has 2048 levels of pressure sensitivity; the Leap would give you that in about 2cm of depth.
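
    The arithmetic works out: at 1/100th of a mm resolution, a band of about 20.5mm of depth gives 2048 distinguishable steps. A sketch of the depth-to-pressure mapping, with the band size assumed:

[CODE]
# Hypothetical sketch: turning hover depth into pressure. At 0.01mm
# resolution, a 20.48mm band of depth gives 2048 distinct steps, the
# same count a Wacom pen reports.

RANGE_MM = 20.48  # assumed usable depth band above the surface
LEVELS = 2048     # target pressure levels, matching a Wacom pen

def pressure_from_depth(z_mm):
    """Map fingertip height above the surface to a 0..2047 pressure level.

    z_mm = RANGE_MM (top of the band) -> 0 (no pressure)
    z_mm = 0 (touching the surface)   -> 2047 (full pressure)
    """
    z_mm = min(max(z_mm, 0.0), RANGE_MM)  # clamp to the band
    return int((1.0 - z_mm / RANGE_MM) * (LEVELS - 1))

print(pressure_from_depth(20.48))  # -> 0
print(pressure_from_depth(10.24))  # -> 1023
print(pressure_from_depth(0.0))    # -> 2047
[/CODE]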

    No more capacitive stylus, or any stylus: it lets you use your fingers to draw accurately, but if you had to, you could just pick up any old stick you like.

    It can do facial recognition too. Just put your face in the beam and it can sample the exact shape of it, distinguishable from anyone else's. Pick up an iPad and it can load your user account based on your face sample, maybe even your hand.
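
    As a toy sketch of the idea, a sampled set of depth points could be matched against stored user scans; the data and tolerance here are invented, and a real device would capture far more points.

[CODE]
# Toy sketch: matching a sampled 3D face shape against stored users.
# Each "scan" is a tiny list of depth samples in mm. Data and
# tolerance are invented for illustration.

USER_SCANS = {
    "alice": [41.0, 38.5, 42.2, 55.1, 60.3, 54.8],
    "bob":   [39.2, 36.0, 40.1, 58.4, 63.0, 57.9],
}

def identify(scan, tolerance=1.0):
    """Return the user whose stored scan is closest, within tolerance mm."""
    best_user, best_err = None, float("inf")
    for user, stored in USER_SCANS.items():
        err = sum(abs(a - b) for a, b in zip(scan, stored)) / len(stored)
        if err < best_err:
            best_user, best_err = user, err
    return best_user if best_err <= tolerance else None

print(identify([41.1, 38.4, 42.0, 55.3, 60.1, 54.9]))  # -> alice
[/CODE]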

    They are saying Winter 2012 for this shipping, so it's not as far off as most of the other tech you read about. Even if Apple doesn't do anything with it directly, you can be sure end users will. Someone will find a way to use it with an iPad, even with external power if need be, stuck to the screen or something.

    iPhone/iPad 2013
    Tactile screen + 3D sensor + Rogue graphics + 128GB ReRAM
  • Reply 2 of 7
    tallest skil · Posts: 43,388 · member
    The mouse's time has come, certainly.

    But as good as I am with a touchscreen keyboard, I believe there will always be a legitimate desire among some for a physical one.

    Though seeing the functions of the keys change on a per-application basis will be a MIGHTY tempting reason to forgo the physical.

    It'd be like this, only not just a concept, not thousands of dollars, and built into the machine:

    [IMG: optitact-text.jpg]

  • Reply 3 of 7
    mike fix · Posts: 270 · member
    I like physical input devices. As I work, my hands rest on the keyboard or mouse. I would hate to have to keep lifting my hands, or keep them elevated except when inputting. This would also lead to all sorts of new carpal tunnel issues.

  • Reply 4 of 7
    junitor · Posts: 8 · member
    I can't imagine what those professional StarCraft 2 players' hands will be like: damn tired at 600 swipes per minute = intense exercise = needing to go to the gym to train up your muscles.

    But I would love to have these.

  • Reply 5 of 7
    dobby · Posts: 797 · member
    They have a woman who can control a mechanical arm via thought, so shouldn't a few finger clicks and wrist shakes be easier to put straight to the computer?

    http://www.sciencedaily.com/releases/2012/05/120516140002.htm

  • Reply 6 of 7
    doctorgonzo · Posts: 529 · member
    Vertical touch screen interfaces fail the ergonomics test. I believe even Steve Jobs dismissed them due to "gorilla arm".

  • Reply 7 of 7
    Marvin · Posts: 15,486 · moderator
    Quote from doctorgonzo: "Vertical touch screen interfaces fail the ergonomics test. I believe even Steve Jobs dismissed them due to 'gorilla arm'."

    It's only vertical in the demo. Imagine the 3D volume sitting just above the desk surface: you would be able to rest your arms on the desk and just move your fingers around. It can map a single finger's movement range to the entire screen. 1mm of movement gives 100 steps of accuracy, so you could move a cursor across a 27" display from left to right, pixel by pixel, by moving your finger about 2.5cm.
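
    The numbers check out: 2560 pixels across a 27" display at 100 steps per mm needs 25.6mm of travel. A sketch of that absolute mapping, with the range and names assumed:

[CODE]
# Hypothetical sketch: mapping a small absolute finger range onto a
# 2560px-wide 27" display. At 0.01mm resolution, one step per pixel.

SCREEN_W_PX = 2560      # 27" iMac horizontal resolution
FINGER_RANGE_MM = 25.6  # assumed comfortable finger travel
RESOLUTION_MM = 0.01    # sensor accuracy

print(round(FINGER_RANGE_MM / RESOLUTION_MM))  # -> 2560 steps available

def cursor_x(finger_x_mm):
    """Map finger position (0..25.6mm) to a pixel column (0..2559)."""
    finger_x_mm = min(max(finger_x_mm, 0.0), FINGER_RANGE_MM)
    return min(int(finger_x_mm / FINGER_RANGE_MM * SCREEN_W_PX),
               SCREEN_W_PX - 1)

print(cursor_x(12.8))  # -> 1280, the middle of the display
[/CODE]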

    You can even program gestures for letters or words. Right now we have every key mapped out, but that's not really necessary. Consider a calculator: instead of it having any physical form, you just look at the app on screen. Your hand can rest on the table like a closed fist with only the index finger extended. As you move it inside a small range, the keys highlight, and you do a small tap on the desk to press the buttons. That's less strain than a physical numpad.

    A keyboard can be simulated using grids. Split the alphabet into 3 groups of 9, and your first 3 fingers can press the keys while the thumb and small finger switch grids; the left hand can deal with modifiers, or you could even split the grids between the hands. The visual representation can be on-screen.
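
    A sketch of that grid scheme in Python, with the grouping and gesture assignments assumed:

[CODE]
# Hypothetical sketch of the grid keyboard: the alphabet split into
# three 9-key grids; three fingers pick within a grid, the thumb and
# small finger cycle which grid is active.

GRIDS = [
    list("ABCDEFGHI"),
    list("JKLMNOPQR"),
    list("STUVWXYZ_"),  # spare slot left over (space, say)
]

active = 0  # index of the grid currently shown on screen

def switch_grid(direction):
    """Thumb gesture = +1 (next grid), small finger = -1 (previous)."""
    global active
    active = (active + direction) % len(GRIDS)

def press(finger, row):
    """finger 0-2 picks the column, row 0-2 the row, within the grid."""
    return GRIDS[active][row * 3 + finger]

print(press(1, 0))  # -> B on grid 0
switch_grid(+1)
print(press(1, 0))  # -> K on grid 1
[/CODE]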

    It will take a lot of experimentation to get something that feels natural and comfortable, just like the iOS interface did, but it's just a matter of time.

    Physical inputs are obviously used for things like boot modes, but there's no reason why gestures can't work there too. If your computer has crashed and won't boot, you can put your hands together like you are praying for it to work, and it will know you need help and load recovery mode. Raise the middle finger for single-user mode as you tell the computer who's boss.

    It makes input contextual, which is one of the most important things the iPad has done. If you open a video editing program, the keyboard is gone, as it should be. In FCPX, you would be able to pan around with two fingers; if you need to cut, a single downward finger movement slices at the current location. You don't need to think about mapping fixed inputs to software they weren't designed for.
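
    A sketch of that contextual lookup: the same gesture resolves to a different action depending on the frontmost app. The app names are real; the gesture names and mappings are invented for illustration.

[CODE]
# Hypothetical sketch of contextual input: one gesture, different
# meanings depending on the frontmost application.

GESTURE_MAP = {
    "FCPX": {
        "two_finger_drag": "pan around the timeline",
        "one_finger_down": "cut clip at current location",
    },
    "Safari": {
        "two_finger_drag": "scroll the page",
        "one_finger_down": "click the hovered link",
    },
}

def resolve(app, gesture):
    """Look up what a gesture means in the current app, if anything."""
    return GESTURE_MAP.get(app, {}).get(gesture, "ignored")

print(resolve("FCPX", "one_finger_down"))    # -> cut clip at current location
print(resolve("Safari", "one_finger_down"))  # -> click the hovered link
[/CODE]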

    At first, people will supplement standard inputs with gestures, but as people get used to them and children grow up with them, the old ways will move towards obsolescence for the simple reason that they are a fixed subset of a more complex input system.

    You know, we might even end up talking to our computers without speaking, if they can do facial scanning that accurately. We would just mime the words and the computer could do a form of 3D lip-reading.