Inside the multitouch FingerWorks tech in Apple's tablet

The hype and anticipation surrounding the Apple event later this week have observers searching for clues as to exactly what the company might deliver. One element of the anticipated tablet's software side relates to Apple's 2005 acquisition of multitouch technology and expertise from FingerWorks.



This article traces the evolution of the software side of tablets and the technology behind multitouch interfaces. The hardware side of historical tablet products was profiled earlier in The inside track on Apple's tablet: a history of tablet computing.



The infusion of technology developed by FingerWorks meshed with Apple's own existing research into touch-based interfaces as an alternative to the keyboard. Since the origins of the personal computer in the mid-1970s, the conventional keyboard has always been its primary interface. But investigation into finger-touch methods of computer input was underway at least as early as 1982, when Nimish Mehta at the University of Toronto published research involving cameras placed behind a translucent panel that could record multiple touch points of a user's hands.



Mice and then trackballs were added to provide pointer-centric navigation within the graphical environment that the Macintosh popularized in the 1980s. The keyboard nevertheless remained essential to personal computing, in many cases being more efficient than pointing at menu items with an alternative device.



The stylus takes on the keyboard: 1990s



In the 1990s, the idea of stylus-based "pen computing" questioned whether the keyboard was still absolutely necessary, particularly in mobile devices. Apple's 1993 Newton MessagePad offered an external keyboard accessory, but it was primarily designed to be used with its stylus, relying on a series of pen gestures and handwriting recognition for text input. Apple also prototyped a PowerBook-based tablet system called the PenLite in 1993, but did not release it to avoid affecting Newton sales.



The Newton's advanced ink technology was criticized and mocked for its initial inaccuracy, a problem Apple largely corrected in its Newton 2.0 release. By then, however, many were convinced that its ink recognition technology wasn't really feasible. When Palm launched the Pilot in 1996, it used a simplified alphabet system called Graffiti that greatly reduced recognition errors, although it required users to learn a new handwritten input system of simplified letter forms.



Palm's Graffiti input software had premiered on the Newton, but Apple's PDA platform failed to reach critical mass in sales. The popularity of Palm's Graffiti-powered Pilot PDAs initially seemed to suggest a rosy future for stylus input as a keyboard alternative. While compact external keyboards were also available for the Pilot, much of its portable appeal came from its pen.



In 1997, Apple released a new Newton form factor: the eMate, which paired a stylus with a conventional keyboard in a mini-laptop design. The device was aimed at education at a time when schools were unlikely to spring for full-powered, full-priced notebooks for every student. Before the market had much time to evaluate it, however, the entire Newton line was pulled in early 1998 as Apple worked to focus on its most promising platforms in an effort to return to profitability.



Microsoft belatedly attempted to deliver its own alternative to Palm's PDAs by morphing its unsuccessful Handheld PC product line (clamshell mini-laptops with keyboards) into stylus-based Palm PC PDAs around 2000 (they were later renamed Pocket PC after Palm objected to the name). The company also attempted to resurrect Newton-style freehand handwriting recognition by licensing technology from Apple spinoff General Magic in 1998.



Microsoft licensed its Pocket PC operating system (built on the Windows CE kernel) to a variety of hardware manufacturers over the decade of the 2000s, but the PDA category failed to materialize as a significant market. Since 2001, the company has also marketed a stylus-based tablet version of its desktop Windows platform (based on the Windows NT/XP kernel), which has likewise been unsuccessful outside of a few niche markets. Bill Gates' 2001 prediction that "within five years [the stylus-driven Tablet PC] will be the most popular form of PC sold in America" simply never came true.










Touch takes on the keyboard: 2000s



The venerable keyboard survived unscathed through all the pen-centric hype of the 1990s, when the enthusiastic suggestion that mainstream users would migrate away from typing and back towards the older convention of writing with a stylus simply didn't pan out. Instead, the use of keyboards began to expand and morph into the unrestricted touch-based interfaces long envisioned by science fiction writers.



Rather than improving the efficiency of keyboard typing by shifting toward a supposedly more "natural" input system that mimicked the ancient writing instrument (as pen computing was widely expected to do in the 90s, and as Gates predicted for the following decade), the stylus only gave users a slower, clumsier way to interact with their devices.



Vast sums invested in developing the technology to read handwritten text had failed to solve many of the problems associated with typing on a keyboard: the physical stress of writing was not much of an improvement over banging on a keyboard; the input speed was much lower; and any benefits in physical size were somewhat nullified by the inconvenience of a stylus that had to be stowed and was easy to lose.



Just as Palm's stylus-driven PDAs began reaching the height of their popularity, Apple launched a mobile device that similarly avoided any use of a keyboard. Rather than relying on stylus input, however, Apple's new iPod used a mechanical scroll wheel that made navigating through its menus quick and easy. It was not well suited to entering any large amount of text, but it became a very popular way to pilot through large music collections.



Starting in 2002, Apple's successive iPod designs used solid state (non-mechanical) touch-sensitive click wheels. Apple had previously pioneered the use of touch-sensitive trackpads in its 1994 PowerBook 500, which was the first notebook to use a solid state pointing device rather than a mechanical trackball or joystick. Nearly a decade later, the company was now making touch-sensitivity the primary user interface for a new class of mobile devices.



Meanwhile, Palm and Microsoft began adapting their stylus-driven PDA operating systems to serve as mobile phones. Palm's Treo line converted the conventional Pilot PDA into a device with a BlackBerry-like mini keypad and a stylus-driven screen. Microsoft's licensees developed a variety of devices with different combinations of mini-keyboards, full-sized sliding keyboards, and stylus-driven screens. Microsoft's original definition of its "Windows Smartphone" actually described a device without a touchscreen at all, navigated entirely by physical buttons.







Apple eyes FingerWorks: 2005



Early smartphone users commonly grew savvy enough at typing with their thumbs to simply ignore the unwieldy stylus provided to tap at the screen, particularly when entering text. As stylus use rapidly fell out of favor, new keyboard technology was released by a startup called FingerWorks. Its devices were essentially trackpads designed to respond to multiple touch points at once, enabling both keyboard-like chording and intuitive gestures similar to those used by the Newton, but performed by finger touch rather than a stylus.



FingerWorks was led by John Elias and Wayne Westerman, two pioneering multitouch researchers who had worked together at the University of Delaware. After founding the company in 1998, the pair produced a series of multitouch trackpad devices, from the full TouchStream keyboards to the iGesture Pad, a multifunction, programmable mouse-replacement peripheral.



The company gained positive reviews among an enthusiastic niche of power users and people with repetitive strain injuries, who reported that FingerWorks' large, low-impact trackpad devices enabled them to avoid the taxing pain associated with using mechanical keyboards. However, the company continued to struggle to reach mainstream users, up until its assets were mysteriously bought out by an unnamed buyer in 2005.







It was later revealed that FingerWorks' technology and founders had become part of Apple, after lawsuits against the Mac maker referenced its acquisition of FingerWorks. Additionally, a series of new patents filed by Elias and Westerman were assigned to Apple.



Over the next year and a half, research within the company adapted FingerWorks' multitouch ideas from an opaque trackpad surface to the transparent layer of a capacitive touchscreen, enabling the kind of direct multitouch manipulation demonstrated by the iPhone in January 2007.










Steve Jobs kills the stylus



During Apple's year and a half of multitouch development, Jeff Han at New York University's Courant Institute of Mathematical Sciences demonstrated his own independent research into multitouch user interfaces at TED in February 2006. Han's demonstration quickly spread public interest in multitouch interfaces.



After seeing the iPhone's debut, Han reportedly said, "The iPhone is absolutely gorgeous, and I've always said, if there ever were a company to bring this kind of technology to the consumer market, it's Apple. I just wish it were a bit bigger so I could really use both of my hands."







At the iPhone's introduction, Steve Jobs boldly dismissed the stylus-driven interfaces that Microsoft's Gates had hailed just half a decade earlier as no longer worth pursuing. "Now, how are we going to communicate this?" Jobs said of the iPhone. "We don't want to carry around a mouse, right? So what are we going to do? Oh, a stylus, right? We're going to use a stylus. No. Who wants a stylus? You have to get 'em and put 'em away, and you lose 'em. Yuck. Nobody wants a stylus. So let's not use a stylus."



"We're going to use the best pointing device in the world. We're going to use a pointing device that we're all born with -- born with ten of them. We're going to use our fingers. We're going to touch this with our fingers. And we have invented a new technology called multi-touch, which is phenomenal. It works like magic. You don't need a stylus. It's far more accurate than any touch display that's ever been shipped. It ignores unintended touches, it's super-smart. You can do multi-finger gestures on it. And boy, have we patented it."



"So we have been very lucky to have brought a few revolutionary user interfaces to the market in our time. First was the mouse. The second was the click wheel. And now, we're going to bring multi-touch to the market. And each of these revolutionary interfaces has made possible a revolutionary product: the Mac, the iPod and now the iPhone."







Competitors react to Apple's touch



Microsoft responded to the iPhone by demonstrating its own multitouch system: a camera-driven, tabletop appliance called the Surface, designed to act as a large information kiosk that can respond to multiple touch points and to specially barcoded objects placed on it. However, the company has continued to sell a stylus-oriented interface for its ill-fated Tablet PC devices and Windows Mobile smartphones, with promises of a touch-based upgrade repeatedly delayed by technical problems. Windows Mobile 7 with iPhone-like touch features is now expected no sooner than early 2011, a full four years after the iPhone's debut.



Despite far more limited resources and failing fortunes, Palm was able to deliver its own multitouch device in the Palm Pre just two years after the iPhone. Palm's development team benefited greatly from an infusion of Apple talent, from executive Jon Rubinstein to iPhone engineers looking to work on new projects outside of Apple.



In addition to Palm and Microsoft, other companies have also found it difficult to follow in Apple's footsteps without infringing upon its patented technology. RIM found its development of the iPhone-like BlackBerry Storm to be both challenging and problematic, while Google has cautiously worked to take its Android "Windows Mobile-killer" and modify it to work more like the iPhone, without running into multitouch disputes with Apple. The original Android prototypes were Windows Mobile-like devices with lots of physical buttons; years later, they look more and more like the iPhone, with expanded use of touch interface features.



The future of touch



Outside of smartphones, Apple has also applied its multitouch technology in MacBook trackpads and the new Magic Mouse. Both are rather conservative implementations of multitouch gestures which don't require much specialized training from users. For its tablet and future trackpad devices, Apple may introduce a new layer of sophistication in multitouch gestures. Patent filings suggest the possibility of a new interface that manipulates objects represented in a deep three dimensional space.



It's also possible Apple may release an advanced keyboard along the lines of FingerWorks' original TouchStream, presenting a flat touchpad with zero-force, multitouch input. The company has steadily rolled out multitouch trackpad enhancements for its MacBook line, but it has a long way to go before it matches the fancy gestures (with the potential to learn programmable functions) that FingerWorks supported in its iGesture Pad and TouchStream keyboards. FingerWorks' devices could enter modes suited to specific applications, such as games, Maya or Photoshop, or to specific uses, including general desktop control, search, text selection and styling, and browsing functions.



Many critics initially assailed the iPhone's virtual keyboard, but the smartphone's subsequent popularity suggests tremendous potential for new applications of multitouch interfaces that augment or even replace the conventional mechanical keyboard. In addition to helping users avoid RSI damage, touch-sensitive input allows for a complex vocabulary of gestures, the typing speed of a keyboard, the pointing accuracy of a mouse, and a customizable degree of complexity that scales from the needs of basic users to very advanced, specialized functionality.



The advantages of touch-driven interfaces are clear, and suggest lots of potential for future applications in both mobile devices and desktop systems. Apple certainly isn't alone in working to productize and deliver new technology in the category of multitouch devices, but the future of touch interfaces may make a big leap next week with Apple's expected tablet introduction.

Comments

  • Reply 1 of 161
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by AppleInsider View Post


    At the iPhone's introduction, Steve Jobs boldly announced that stylus-driven interfaces that Microsoft's Gates had hailed just a half decade earlier were no longer worth investigating. "Now, how are we going to communicate this?" Jobs said of the iPhone. "We don't want to carry around a mouse, right? So what are we going to do? Oh, a stylus, right? We're going to use a stylus. No. Who wants a stylus? You have to get em and put em away, and you lose em. Yuck. Nobody wants a stylus. So let's not use a stylus."



    We're going to use the best pointing device in the world. We're going to use a pointing device that we're all born with -- born with ten of them. We're going to use our fingers. We're going to touch this with our fingers. And we have invented a new technology called multi-touch, which is phenomenal. It works like magic. You don't need a stylus. It's far more accurate than any touch display that's ever been shipped. It ignores unintended touches, it's super-smart. You can do multi-finger gestures on it. And boy, have we patented it."



    I’ve already started exercising my eye rolling in preparation of some posters crying foul on Apple for offering a stylus accessory despite the fact that the primary input for the tablet will be finger-based with the stylus being for select app features for particular users, like drawing diagrams in class.





    Start workout: superior rectus muscle intorsion with left extorsion with right and 1 and 2 and 3 and, superior rectus muscle extorsion with right and intorsion with left and 1 and 2 and 3 and… Also, don't forget to drink plenty of fluids on Wednesday
  • Reply 2 of 161
    ireland Posts: 17,798 member
    A part of me feels we won't see a stylus.
  • Reply 3 of 161
    Quote:
    Originally Posted by solipsism View Post


    I've already started exercising my eye rolling in preparation of some posters crying foul on Apple for offering a stylus accessory despite the fact that the primary input for the tablet will be finger-based with the stylus being for select app features for particular users, like drawing diagrams in class.



    I agree, there is nothing like a sharp pencil, or the stylus equivalent.

    Larry
  • Reply 4 of 161
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by Ireland View Post


    A part of me feels we won't see a stylus.



    As the de facto input method, absolutely not. As an included accessory, I doubt that, too.



    But there is a need for a stylus if this tablet is going to be marketed across the board, like I think it is. A stylus for signatures, drawing in many various situations, and even replicating the annotations that we do in textbooks with a simple stylus that can change from a highlighter, underliner, strikethougher(?), etc.



    They do have a fairly recent patent for a capacitance stylus.
  • Reply 5 of 161
    nagromme Posts: 2,834 member
    I expect a stylus to be optional, or even left for third parties, as a compromise.



    P.S. The fingerworks stuff is really cool... but strangely enough, I think that middle Windows Mobile phone with the sideways flip keyboard is kind of cool too.
  • Reply 6 of 161
    solsun Posts: 763 member
    Quote:
    Originally Posted by AppleInsider View Post




    Outside of smartphones, Apple has also applied its multitouch technology in MacBook trackpads and the new Magic Mouse. Both are rather conservative implementations of multitouch gestures which don't require much specialized training from users. For its tablet and future trackpad devices, Apple may introduce a new layer of sophistication in multitouch gestures.



    I'd love to be able to connect the tablet via Bluetooth to act as a full touch control surface for my desktop Mac Pro.
  • Reply 7 of 161
    tbell Posts: 3,146 member
    My step dad just bought a 27" iMac from the Apple store. He signed for it on an iPod Touch using his finger.



    Quote:
    Originally Posted by solipsism View Post


    As the de facto input method, absolutely not. As an included accessory, I doubt that, too.



    But there is a need for a stylus if this tablet is going to be marketed across the board, like I think it is. A stylus for signatures, drawing in many various situations, and even replicating the annotations that we do in textbooks with a simple stylus that can change from a highlighter, underliner, strikethougher(?), etc.



    They do have a fairly recent patent for a capacitance stylus.



  • Reply 8 of 161
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by TBell View Post


    My step dad just bought a 27" iMac from the Apple store. He signed an iPod Touch using his finger.



    How did the signature compare to using a typical writing instrument?
  • Reply 9 of 161
    mactripper Posts: 1,328 member
    Front mounted camera/web cam/finger pointer tracker combined with fancy software.



    The onscreen pointer then can be as big or as little as you wish, push in for bigger, out for smaller.



    Now imagine how you hold a pen to sign your signature, extend your index finger out just a wee bit and you get the idea.





    Could be the next great thing.
  • Reply 10 of 161
    ireland Posts: 17,798 member
    Quote:
    Originally Posted by solipsism View Post


    As the de facto input method, absolutely not. As an included accessory, I doubt that, too.



    But there is a need for a stylus if this tablet is going to be marketed across the board, like I think it is. A stylus for signatures, drawing in many various situations, and even replicating the annotations that we do in textbooks with a simple stylus that can change from a highlighter, underliner, strikethougher(?), etc.



    They do have a fairly recent patent for a capacitance stylus.



    Don't get me wrong, I see the need for it and want to see it, but you just don't know with Apple. Just look at the Magic Mouse, it's terrible.
  • Reply 11 of 161
    ireland Posts: 17,798 member
    Quote:
    Originally Posted by solipsism View Post


    How did the signature compare to using a typical writing instrument?



    I can tell you it wasn't the same, I've seen videos of people signing the touch and you can just tell it's not the same thing. But I suppose for signing your name it would do.
  • Reply 12 of 161
    ireland Posts: 17,798 member
    Quote:
    Originally Posted by MacTripper View Post


    Front mounted camera/web cam/finger pointer tracker combined with fancy software.



    The onscreen pointer then can be as big or as little as you wish, push in for bigger, out for smaller.



    Now imagine how you hold a pen to sign your signature, extend your index finger out just a wee bit and you get the idea.





    DONE, no rolling of eyes necessary.





    Next question please...



    No one asked you a question.
  • Reply 13 of 161
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by Ireland View Post


    I can tell you it wasn't the same, I've seen videos of people signing the touch and you can just tell it's not the same thing. But I suppose for signing your name it would do.



    That’s what I’d think. I’ve used Autograph for the Mac trackpad and it works okay for a signature but it’s not as natural as using a stylus/pen and it’s simply not effective for any intricate drawing or anything requiring a fine point.
  • Reply 14 of 161
    A stylus would be highly useful for using Photoshop etc. Even better would be a Wacom-style pressure sensitive one.



    However for general use the ol' digits are far more comfortable. What will probably be missing from the tablet is tactile feedback. AFAIK Nokia is developing technology that would let you use a virtual keyboard and still feel the buttons, by having the touchscreen make the surface feel like a real button based on what is shown on screen.
  • Reply 15 of 161
    mactripper Posts: 1,328 member
    Quote:
    Originally Posted by kasakka View Post


    A stylus would be highly useful for using Photoshop etc. Even better would be a Wacom-style pressure sensitive one.



    However for general use the ol' digits are far more comfortable. What will probably be missing from the tablet is feedback to touch. Afaik Nokia is developing a technology that would allow you to use a virtual keyboard but feel the buttons by having the touch screen able to make the surface feel like a real button based on what is shown on screen.





    Is that the same Nokia that's suing Apple (and vice versa) and now is giving away free turn-by-turn GPS with their phones?



    http://www.computerworld.com/s/artic..._TomTom_Garmin
  • Reply 16 of 161
    A couple weeks ago AppleInsider reported that some who had seen the tablet were claiming that we would be surprised by the input method. Others have suggested that the tablet will make extraordinary use of multi-touch. A week ago an AI post suggested that iPhones may come with touch-sensitive panels on the back cover. This article quotes Steve Jobs emphasizing that one of Apple's keys to success is introducing intuitive new interfaces.



    Speculating and crossing fingers, I suggest that the key innovation in the iPad will be multi-touch back navigation, in particular "back-typing." The key problem with tablet computers up to this point is input method. You have to hold the tablet, and input data, at the same time. A stylus (while great for some purposes) is lousy for text input. Thumb typing is slow and awkward at best. Back-typing could solve the input problem, and completely change the character and usefulness of a tablet. If Apple can make it feel intuitive and easy, they will have a monumental success.



    Pick up a smallish, thin hardback book (whatever size you think the iPad will be). Hold it between your palms. Amazingly, all of your fingers are free to tap on the back of the book, while your thumbs are free to tap on the front. This does not work with an iPhone because it's too small. All we need now is a typing method that makes use of these freely tapping fingers. It will need to be a new method, probably one that does not require tapping in 26 different spots to choose letters, but there is no principled reason this can't be done. The thumbs could operate space and shift and choose between sets of letters, the fingers would merely tap. I'm convinced this could be at least as fast, and as natural, as standard typing.



    I don't believe that Apple would introduce a device that didn't provide some truly unique hardware related property. This is the one I'm voting for.



    ***



    On the fourth page of this discussion I posted a reply to the worry that typing requires lots of tactile/visual feedback. I don't think many people read all the way to the fourth page, so I decided to cheat and insert the same comment here:



    When I think about back-typing, I think of it as a completely new form of typing. Using a standard keyboard requires a lot of visual and/or tactile feedback because it depends on the position of your fingers as they choose between 30-50 keys. But if the device is smart enough, back-typing would not depend on the position of your fingers, but would require nothing more than tapping in place.



    Here's one possibility: 8 single fingers give you 8 different symbols, but combinations of those 8 fingers give you as many as 256 different symbols! It would be fairly easy to come up with a set of taps that corresponds to all the standard keys. (This style of input is often called chording.) Thumbs will operate things like space, shift, delete and return. Especially in the learning process, a visual display will be important. I have no clear ideas about what this display would look like, but it could be rather small and out of the way since you don't have to touch it directly. In fact, it's mainly just reminding you which finger tap combinations activate which symbols. The main point is that the tactile feedback of individual keys just isn't necessary in this system. Indeed, once the system is mastered you could even turn off the display.
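    To make the chording arithmetic concrete, here is a minimal, hypothetical sketch in Python. The finger labels and the sample chord table are invented purely for illustration and come from neither FingerWorks nor Apple: each finger is one bit in an 8-bit mask, so there are 2^8 = 256 patterns, 255 of which have at least one finger down.

    ```python
    # Hypothetical sketch of chord-based "back-typing" input.
    # Each of the 8 fingers is one bit in a chord bitmask: 2**8 = 256
    # patterns, of which 255 involve at least one finger down.

    FINGERS = ["L4", "L3", "L2", "L1", "R1", "R2", "R3", "R4"]  # left pinky..index, right index..pinky

    def chord_mask(fingers_down):
        """Encode the set of fingers currently touching as an 8-bit mask."""
        mask = 0
        for finger in fingers_down:
            mask |= 1 << FINGERS.index(finger)
        return mask

    # Invented example table: single-finger chords for the most common letters,
    # two-finger chords for the next ones, and so on.
    CHORD_TABLE = {
        chord_mask(["L1"]): "e",
        chord_mask(["R1"]): "t",
        chord_mask(["L2"]): "a",
        chord_mask(["R2"]): "o",
        chord_mask(["L1", "R1"]): "n",
        chord_mask(["L1", "L2"]): "s",
    }

    def decode(fingers_down):
        """Return the symbol assigned to a chord, or None if it has no assignment."""
        return CHORD_TABLE.get(chord_mask(fingers_down))

    print(decode(["L1"]))          # e
    print(decode(["L1", "R1"]))    # n
    print(2 ** len(FINGERS) - 1)   # 255 possible non-empty chords
    ```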



    Yes, this would require learning. (And I don't doubt that a standard touch keyboard will also be available for those who don't want to master this skill.) But the thing is to see the potential of this form of typing. It would allow me to sit in my favorite chair and type! I could lie in bed and type! I could pace the house and type! Laptops are really very awkward unless you have a table in front of you, and even then they are imperfect. To hold a screen in your hands and type on it as fast as you can tap your fingers–that would be stunning!



    Easy, rapid text input is an essential key to making a universally useful tablet. Let's face it, language is the medium of communication with each other and with computers. Inputting language into a device as easily and rapidly as possible is essential for most tasks. It's the one element that is so far missing from all tablet-style devices. If the iPad does not solve this problem it will still be an excellent all-purpose media-consuming device for print, video, music and gaming. But, if it does solve the text input problem it will replace laptops. Indeed, it could be the biggest computing revolution since the mouse.
  • Reply 17 of 161
    .



    Apparently no one posting, yet, can either type for real with all 8 fingers aka 'touch typing' like they taught in school, or play the Piano



    There is no substitute for "touch" of the fingers on some kind of "keyboard" - can talk all you want about thumbs on Blackberry, et al, but NOTHING can match the speed and accuracy of typing for real on a keyboard by a real typist. ty tyvm



    So now what do we do with iPhone or iTablet ?



    Interesting concept about a "vibrate/feeling" feedback from the Touch Screen, so can know where the fingers are without having to look - is what we typists do with the 'dots' on the F and J Keys so we can always find the "Home Row" - and since keyboards are of similar size/shape we're able to know where all the other keys are without having to look - again, same as the Piano



    But that darn Touch Screen - all flat and glass, no 'dots' - how will that problem be solved ?



    Humm, well, reckon we'll find out next week (smile)



    But it better have some kind of no-brainer sensory feedback method for the fingers



    Has served us well for 100s of years



    Why do you think they call them "keys" in the first place ?



    (smile some more)





    BC
  • Reply 18 of 161
    Quote:
    Originally Posted by Ireland View Post


    Just look at the magic mouse, it's terrible.



    Well you obviously don't know how to use it.
  • Reply 19 of 161
    Quote:
    Originally Posted by Ireland View Post


    A part of me feels we won't see a stylus.



    I agree. It might be a third-party device.



    As I was replying to this, I noticed that your 'tag line' disappeared. What is 'Unicorn'?
  • Reply 20 of 161
    isaidso Posts: 750 member
    Quote:
    Originally Posted by Ireland View Post


    I can tell you it wasn't the same, I've seen videos of people signing the touch and you can just tell it's not the same thing. But I suppose for signing your name it would do.



    So what's your point? People's signatures are different when they sign on a hard screen vs. paper anyway. Hell, it's different even on a sheet of paper depending on whether a person is using any of a variety of different pen tips, or a pencil.