Palm to Apple: We'll "vigorously" defend our IP, too


Comments

  • Reply 101 of 113
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by DocNo42 View Post


    I'm just curious - do you have an example of someone else using a pinch for zooming before Apple debuted the iPhone?



    Prior art for pinch here: http://www.billbuxton.com/multitouchOverview.html



    · An early front projection tablet top system that used optical and acoustic techniques to sense both hands/fingers as well as certain objects, in particular, paper-based controls and data.

    · Clearly demonstrated multi-touch concepts such as two finger scaling and translation of graphical objects, using either a pinching gesture or a finger from each hand, for example.



    1991 Xerox PARC (where else?)



    Pinching was also in the 1992 Sun "Starfire" movie made by Bruce Tognazinni (formerly Apple GUI guru)
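    For anyone curious what "two finger scaling" actually computes, it boils down to the ratio of the fingers' separation now versus their separation when the gesture began. A minimal Python sketch; the function names are made up for illustration and are not code from any of the systems above:

    ```python
    import math

    def distance(p, q):
        """Euclidean distance between two touch points given as (x, y)."""
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def pinch_scale(start_a, start_b, cur_a, cur_b):
        """Scale factor for a two-finger pinch: ratio of the current
        finger separation to the separation at gesture start."""
        d0 = distance(start_a, start_b)
        d1 = distance(cur_a, cur_b)
        return d1 / d0 if d0 else 1.0

    # Fingers start 100 px apart and spread to 200 px: a 2x zoom.
    print(pinch_scale((0, 0), (100, 0), (0, 0), (200, 0)))  # 2.0
    ```

    Real implementations add thresholds, smoothing, and an anchor point (usually the midpoint between the fingers) so content zooms around the pinch rather than the screen origin.
    
    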



    Quote:
    Originally Posted by THT View Post


    It's good thing that the Minority Report UI has nothing to do with the iPhone, no? [sarcasm]I mean, come on, the iPhone requires that you use the device without a glove, while in Minority Report, you had to put on a glove. [/sarcasm] The Minority Report style interface, and the like one in Johnny Mnemonic, are crazy ravings from science fiction writers/directors who have basically zero understanding of user interface design. Nobody in their right minds would create UIs like that.



    Actually the producers of Minority Report based the UI on some actual UI research...the concept art for variations on how the UI worked is interesting to look at and, if I recall correctly, was done by someone in the field, but I forget her name.



    They opted for a vertical presentation in the movie but there were some desk designs reminiscent of the Sun Starfire concepts. In any case, the "ravings" are part of most current multi-touch UI designs.



    So much for "zero understanding of user interface design", but it scores high on the irony meter coming from the winner of the "least understanding of multi-touch UI history" award. Minority Report came out in 2002...a year after MERL had shown DiamondTouch at trade shows, and a decade after the Starfire concept movie and the Xerox PARC digital desk. Even so, it has been an inspiration for a generation of UI designers (young ones, but hey).



    There is definitely a strong "direct manipulation" school of thought within the UI community that's been around a couple decades. So pray tell, what makes you an expert on UIs?
  • Reply 102 of 113
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by TenoBell View Post


    All of the business people I see are only using BlackBerrys. Clearly Palm's current market share shows that it is not a common business device.



    When I started at my current job in 2003 a Palm was standard issue. I haven't SEEN a Palm device in mainline use in years. Everyone has a crackberry. No Treos.



    My Palm V is lost in a drawer somewhere, and while I have a Tungsten, I actually haven't used it for anything useful in quite a while. Palm really killed itself, and I dunno that the Pre will save it. The Pre, shipped in 2007, would have. Mid 2009? WTF? It is freaking WebKit + bridge on top of Linux.
  • Reply 103 of 113
    tht Posts: 6,018 member
    Quote:
    Originally Posted by vinea View Post


    Prior art for pinch here: http://www.billbuxton.com/multitouchOverview.html



    · An early front projection tablet top system that used optical and acoustic techniques to sense both hands/fingers as well as certain objects, in particular, paper-based controls and data.

    · Clearly demonstrated multi-touch concepts such as two finger scaling and translation of graphical objects, using either a pinching gesture or a finger from each hand, for example.



    1991 Xerox PARC (where else?)



    Pinching was also in the 1992 Sun "Starfire" movie made by Bruce Tognazinni (formerly Apple GUI guru)



    Real artists ship.



    Reading up on the digital desk, the pinch they had wasn't a real-time pinch as seen in the iPhone or Jeff Han/Perceptive Pixel demos. Users of the digital desk's sketching application could copy-n-paste some object where the pasted object could be pinched for scaling the object. The object was represented as a black rectangle.



    I watched the Starfire video, and you'll have to tell me when the pinching happens, because I didn't see it. No doubt Tog thought of such things, though.



    If anything, it's all about slow evolution. What you see in the iPhone is unique. It uses some of the same multi-touch conventions that are in these demos, but the iPhone implements them differently and on a totally different device. And real artists ship. You have to give Apple some kudos for shipping. They found a real, useful application for a finger-based touchscreen UI.



    Quote:

    Actually the producers of Minority Report based the UI on some actual UI research...the concept art for variations on how the UI worked is interesting to look at and, if I recall correctly, was done by someone in the field, but I forget her name.



    They opted for a vertical presentation in the movie but there were some desk designs reminiscent of the Sun Starfire concepts. In any case, the "ravings" are part of most current multi-touch UI designs.



    So much for "zero understanding of user interface design", but it scores high on the irony meter coming from the winner of the "least understanding of multi-touch UI history" award. Minority Report came out in 2002...a year after MERL had shown DiamondTouch at trade shows, and a decade after the Starfire concept movie and the Xerox PARC digital desk. Even so, it has been an inspiration for a generation of UI designers (young ones, but hey).



    There is definitely a strong "direct manipulation" school of thought within the UI community that's been around a couple decades. So pray tell, what makes you an expert on UIs?



    I don't care for appeal to authority arguments. If you believe the Minority Report UI is a good UI, you should be able to communicate that.



    The Minority Report UI is utter crap. Why? People would get tired using the thing. This is the same reason why desktop touchscreens really don't work. One can only use them for short periods of time. Since that is so, what's the use? It can be the emergency TV remote I've been looking for when my TV remote is lost and I need to change the channel? What application is it going to have? Some type of shortened sign language for communicating with the computer? That'll be a pretty steep learning curve, and tiring compared to just typing. The user interface conventions used in such a manner are pretty looking, but they are still very far away from being useful.



    Moreover, I'd submit the tabletop touch UIs such as the digital desk and MS Surface are not that useful and have limited markets only.
  • Reply 104 of 113
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by THT View Post


    Real artists ship.



    Reading up on the digital desk, the pinch they had wasn't a real-time pinch as seen in the iPhone or Jeff Han/Perceptive Pixel demos. Users of the digital desk's sketching application could copy-n-paste some object where the pasted object could be pinched for scaling the object. The object was represented as a black rectangle.



    I watched the Starfire video, and you'll have to tell me when the pinching happens, because I didn't see it. No doubt Tog thought of such things, though.



    It's a very minor part of Starfire. But someone asked for prior art on pinch and there's 20 years worth of multi-touch prior art.



    Han certainly did not develop the pinch, nor did Apple.



    Quote:

    If anything, it's all about slow evolution. What you see in the iPhone is unique. It uses some of the same multi-touch conventions that are in these demos, but the iPhone implements them differently and on a totally different device.



    Everything is slow evolution. Anyone who tells you different is selling you something. Is the form factor different? Sure. But the underlying interactions are part of a couple decades' worth of slow evolution.



    Quote:

    And real artists ship. You have to give Apple some kudos for shipping. They found a real, useful application for a finger-based touchscreen UI.



    I do give Apple plenty of kudos for shipping, just like with the mouse and the GUI. The point, however, is that even though Apple made it practical, the iPhone did not spring forth fully formed out of the brow of Jobs like some modern-day Athena.



    Also, if you want to be pedantic, Xerox did ship, and many of their concepts were based on research by Engelbart, who created hypertext, GUIs, and the mouse at SRI in the '60s:



    http://sloan.stanford.edu/MouseSite/1968Demo.html



    If you don't think Engelbart was a real artist then you are a real ass.



    Quote:

    I don't care for appeal to authority arguments. If you believe the Minority Report UI is a good UI, you should be able to communicate that.



    The converse is also true. Rather than simply attacking the MR folks as clueless in UI design, you could have communicated why. Instead, it was you who appealed to authority (your own) by simply declaring it to be so.



    Quote:

    The Minority Report UI is utter crap. Why? People would get tired using the thing.



    First, the UI elements used can be applied either vertically or horizontally. Second, you know there are these things called whiteboards and chalkboards that have successfully been used in the past without excessively tiring out their users...



    Quote:

    This is the same reason why desktop touchscreens really don't work. One can only use them for short periods of time. Since that is so, what's the use?



    Empty assertion. Why would touch-screen work surfaces not be usable for long periods of time? The gestures need be no more energetic than mouse usage or moving physical paper around. Do you get tired from moving paper around your desk?



    The advantage of direct manipulation of objects can be seen in the improvements in workflow for artists doing direct manipulation on graphical tablets like the Cintiq vs mouse manipulation.



    The challenge for multitouch UI designers is to more closely map visualization of user information to direct manipulation objects. For some domains, this is easy. For others there will be little benefit. If you are creating text documents then there's little gain in moving away from the keyboard.



    Even so, when you watch the Starfire concept you can see where it can be helpful to many knowledge workers when it comes to information retrieval, manipulation and management.



    Quote:

    It can be the emergency TV remote I've been looking for when my TV remote is lost and I need to change the channel? What application is it going to have? Some type of shortened sign language for communicating with the computer? That'll be a pretty steep learning curve, and tiring compared to just typing. The user interface conventions used in such a manner are pretty looking, but they are still very far away from being useful.



    These sound exactly like the same objections to GUIs. Typing is easier, there will be a steep learning curve, etc.



    Direct manipulation has a lower cognitive load than abstracted manipulation. This is why GUIs are faster than command-line interfaces. There is plenty of literature documenting this fact. If you really care, I can get you some citations.



    Quote:

    Moreover, I'd submit the tabletop touch UIs such as the digital desk and MS Surface are not that useful and have limited markets only.



    So far I haven't seen anything more than repeated, unsupported assertion from you. Enjoy eating crow sometime in the next decade.



    Vinea
  • Reply 105 of 113
    tht Posts: 6,018 member
    Quote:
    Originally Posted by vinea View Post


    I do give Apple plenty of kudos for shipping, just like with the mouse and the GUI. The point, however, is that even though Apple made it practical, the iPhone did not spring forth fully formed out of the brow of Jobs like some modern-day Athena.



    Also, if you want to be pedantic, Xerox did ship, and many of their concepts were based on research by Engelbart, who created hypertext, GUIs, and the mouse at SRI in the '60s:



    http://sloan.stanford.edu/MouseSite/1968Demo.html



    If you don't think Engelbart was a real artist then you are a real ass.



    I'm not saying they aren't "artists". It was a way of saying there is a wide gulf between a lab product and a shipping product. Xerox shipped a bunch of Alto/Star systems, but ultimately failed because they never got the business/product part of the business. They had such a head start too, yet through various misunderstandings of market conditions, they were never able to capitalize on any of their ideas.



    It's also a way of saying that there is too much credit given to the idea-generators in these patent-dispute type discussions. It was just as much an invention for Apple to productize multi-touch into the iPhone as it was for researchers to come up with the idea and demo it. In fact, Apple probably spent 10x the combined resources of all these researchers making it a product, and the researchers did zero to really help Apple out.



    Quote:

    The converse is also true. Rather than simply attacking the MR folks as clueless in UI design, you could have communicated why. Instead, it was you who appealed to authority (your own) by simply declaring it to be so.



    We're having the conversation now. Are you going to invite the MR folks into the forum so they can contribute, or are you going to do it? And yes, it was utter crap. They added a lot of things to make it look cool: vertical presentation, airplane-sized gesturing motions, gloves with lights on the fingertips, and some strange visualizations of a person's "thoughts". Through all that, I can't see what problem it can be used to solve.



    Quote:

    First, the UI elements used can be applied either vertically or horizontally. Second, you know there are these things called whiteboards and chalkboards that have successfully been used in the past without excessively tiring out their users...



    Sure. Lots of whiteboards and chalkboards around. I'd submit that if people use them for a long time, their arms would get tired. It's got another thing in common with an MR-style UI: it's nice for collaborative environments, a limited market.



    Quote:

    Empty assertion. Why would touch-screen work surfaces not be usable for long periods of time? The gestures need be no more energetic than mouse usage or moving physical paper around. Do you get tired from moving paper around your desk?



    Are we talking about MR, an MS Surface table, a touchscreen PC, or something like the multi-point gestures Apple uses on the MB/MBP? If a person's arm has to be supported entirely by their shoulder, I submit it will have limited markets only.



    Quote:

    The advantage of direct manipulation of objects can be seen in the improvements in workflow for artists doing direct manipulation on graphical tablets like the Cintiq vs mouse manipulation.



    The challenge for multitouch UI designers is to more closely map visualization of user information to direct manipulation objects. For some domains, this is easy. For others there will be little benefit. If you are creating text documents then there's little gain in moving away from the keyboard.



    A mouse UI is a direct manipulation UI. Graphical tablets just replace the mouse with a stylus. The biggest advantage with a touch UI is having multiple points of contact. You can start adding certain gestures, but the more you add, the more it becomes a vocabulary the user has to learn.
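    To illustrate the vocabulary point: even telling two basic two-finger gestures apart takes explicit bookkeeping. A rough sketch, assuming just two tracked touches; the function name and pixel threshold are made up for illustration, not taken from any system discussed here:

    ```python
    import math

    def classify(start_a, start_b, cur_a, cur_b, tol=10.0):
        """Crude two-finger gesture classifier.

        Returns 'pinch' if the finger separation changed by more than
        tol pixels, 'pan' if both fingers moved together, else 'none'.
        """
        sep0 = math.dist(start_a, start_b)   # separation at gesture start
        sep1 = math.dist(cur_a, cur_b)       # separation now
        if abs(sep1 - sep0) > tol:
            return "pinch"
        moved = (math.dist(start_a, cur_a) > tol
                 and math.dist(start_b, cur_b) > tol)
        return "pan" if moved else "none"
    ```

    Every additional gesture a system supports grows this decision tree, which is exactly the learning-curve cost being described.
    
    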



    Quote:

    Even so, when you watch the Starfire concept you can see where it can be helpful to many knowledge workers when it comes to information retrieval, manipulation and management.



    I don't see it.



    Quote:

    These sound exactly like the same objections to GUIs. Typing is easier, there will be a steep learning curve, etc.



    Nope. The objections to the GUI from CMD-line folks were more about the capability to communicate with the computer, while GUIs were for "non-computer" folks. I don't think anyone complained that typing was easier and therefore GUIs were harder.



    I suppose you could say I lack imagination here, but I'm thinking not. I'm not seeing much of a cost/benefit ratio for an MR-style UI outside of limited collaborative markets. What problem is it trying to solve?



    Quote:

    Direct manipulation has a lower cognitive load than abstracted manipulation. This is why GUIs are faster than command-line interfaces. There is plenty of literature documenting this fact. If you really care, I can get you some citations.



    Sure. But how does it address ergonomics issues? Or any of the other issues such as unseating an entrenched technology?



    DVORAK-format keyboards are supposedly superior, yet they're never going to replace QWERTY. You could make a case for T9 being the next input method never to change, considering that more people may know how to use it than QWERTY. Even in the Starfire video, they had a 5/10-button chording keyboard (one button per finger): something that is better ergonomically, but will never displace QWERTY because of the learning curve.
  • Reply 106 of 113
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by THT View Post


    I'm not saying they aren't "artists". It was a way of saying there is a wide gulf between a lab product and a shipping product.



    It appeared to be a way of denigrating the foundational work of researchers, without which we wouldn't have GUIs or multitouch. Technology takes a solid decade at minimum to move from an initial working prototype to the mass market.



    Quote:

    Xerox shipped a bunch of Alto/Star systems, but ultimately failed because they never got the business/product part of the business. They had such a head start too, yet through various misunderstandings of market conditions, they were never able to capitalize on any of their ideas.



    Xerox was one of those companies with a pocket of brilliance mired in a sea of management. Novell is another where good ideas and technology go to stagnate.



    But that doesn't mean that the work done at Xerox PARC wasn't what gave us modern UIs. Without it, all of those exploratory concepts would have had to be redone by Apple, and engineers and researchers are two different kinds of folks.



    Quote:

    It's also a way of saying that there is too much credit given to the idea-generators in these patent-dispute type discussions. It was just as much an invention for Apple to productize multi-touch into the iPhone as it was for researchers to come up with the idea and demo it. In fact, Apple probably spent 10x the combined resources of all these researchers making it a product, and the researchers did zero to really help Apple out.



    This is like saying that too much credit is given to folks like Galileo, Newton, Bohr, and Einstein for generating ideas, because the CERN folks spent 100x the combined resources of all those researchers building a collider, and those guys did zero to really help CERN out.







    Quote:

    We're having the conversation now. Are you going to invite the MR folks into the forum so they can contribute, or are you going to do it? And yes, it was utter crap. They added a lot of things to make it look cool: vertical presentation, airplane-sized gesturing motions, gloves with lights on the fingertips, and some strange visualizations of a person's "thoughts". Through all that, I can't see what problem it can be used to solve.



    Gloves with lights on the fingertips were for detection of...you guessed it...fingertips. The vertical presentation was a sop to moviemaking, because it allowed you to see Cruise while he did his thing. Same for the airplane-sized gestures, but the gestures themselves of picking, sizing, and throwing objects away are consistent with gestures developed for real UIs. The strange visualizations of a person's thoughts were part of the plot. Complaining about that is like complaining about lightsabers in Star Wars not making sense.



    Quote:

    Sure. Lots of whiteboards and chalkboards around. I'd submit that if people use them for a long time, their arms would get tired. It's got another thing in common with an MR-style UI: it's nice for collaborative environments, a limited market.



    A "limited" but very large market. Much of what knowledge workers do is collaborative, or at least involves presentation and discussion of information. Hence the term PowerPoint warriors. And we have these things called conference rooms that always seem to have a good number of folks in them.



    Quote:

    Are we talking about MR, an MS Surface table, a touchscreen PC, or something like the multi-point gestures Apple uses on the MB/MBP? If a person's arm has to be supported entirely by their shoulder, I submit it will have limited markets only.



    We're talking about multi-touch work surfaces. More in line with MS Surface than MR, but MR does have its place in multi-user collaborative settings. In portable computers this would be expressed as multi-touch tablets and touchpads.



    Quote:

    A mouse UI is a direct manipulation UI.



    Yes, I was being imprecise with language. Although arguably not all mouse interactions are strictly direct manipulation if folks get anal about it.



    Quote:

    Graphical tablets just replace the mouse with a stylus.



    No, not in the context of drawing on a Cintiq. There is a level of abstraction in translation of mouse movements to screen action that is not present in directly drawing on the screen.



    Quote:

    The biggest advantage with a touch UI is having multiple points of contact. You can start adding certain gestures, but the more you add, the more it becomes a vocabulary the user has to learn.



    The objective of direct manipulation of objects in a MT environment is to provide natural metaphors and gestures that would not require memorization of many new computer specific gestures to interact with the data.



    Quote:

    Nope. The objections to the GUI from CMD line folks was more about the capability to communicate with the computer, while GUIs were for "non-computer" folks. I don't think anyone complained that typing is easier so therefore GUI's were harder.



    Review the objections of the time. Perceptually, CMD-line interfaces appear faster (and easier) to the user because the time spent in cognition is "lost" to human perception.



    Quote:

    I suppose you could say I lack imagination here, but I'm thinking not. I'm not seeing much of a cost/benefit ratio for an MR-style UI outside of limited collaborative markets. What problem is it trying to solve?



    Providing closer interaction with computers than the WIMP interface does. Whether it will ultimately be successful is still to be seen, but there is potential there. It also depends on the concepts following WIMP. If zoomable user interfaces replace the desktop metaphor of folders, then multi-touch interaction becomes better than mouse interaction.



    Quote:

    Sure. But how does it address ergonomics issues? Or any of the other issues such as unseating an entrenched technology?



    DVORAK-format keyboards are supposedly superior, yet they're never going to replace QWERTY. You could make a case for T9 being the next input method never to change, considering that more people may know how to use it than QWERTY. Even in the Starfire video, they had a 5/10-button chording keyboard (one button per finger): something that is better ergonomically, but will never displace QWERTY because of the learning curve.



    That some new concepts do not make it does not imply that no new concepts make it; otherwise we'd still be using command-line interfaces. For the next UI we need a few things: resolution independence, a new UI metaphor (perhaps a ZUI, perhaps something else), and additional user input methods that provide even more direct manipulation of user information. Perhaps something else as well.



    It's taken 40 years for Engelbart's concepts to fully mature. Multitouch is hitting the marketplace about 20 years after inception and now has mainstream acceptance. We have a good decade before we can call it a bust on the desktop.
  • Reply 107 of 113
    tht Posts: 6,018 member
    Quote:
    Originally Posted by vinea View Post


    It appeared to be a way of denigrating the foundational work of researchers, without which we wouldn't have GUIs or multitouch. Technology takes a solid decade at minimum to move from an initial working prototype to the mass market.



    You're reading into it too much. There's a fine line in semantics, and I'm not saying that the authors of these demos and concept products are stupid or aren't artists, not in a way of attacking their character. But there's a vast gulf between shipping and demonstrating.



    This is an Apple forum. You've been here awhile and know your computer history. "Real artists ship" is a saying originated by Steve Jobs to motivate the Mac team to get to the finish line. Art, that is, technology products, in the lab is great and all, but it's not art until consumers view it.



    I know this very, very well, in a variety of ways, and most of the time it is a failure to ship or finish. I'd rather say finish, actually. In my business, it's quite rare to finish the job because of many different issues, many of them non-technical. It's not that different in computing. Apple spent $1+ billion on Taligent/Copland. Am I saying they weren't artists, or were stupid, in that personal, character way? Absolutely not. But yes, real artists ship.



    Apple at that time (1992-1996) should have had that saying permanently etched in their heads. Instead, they were dicking around.



    Quote:

    Xerox was one of those companies with a pocket of brilliance mired in a sea of management. Novell is another where good ideas and technology go to stagnate.



    All people are smart. It's just that all people are not smart together. Humans are generally distributed the same across cultures, status, money, etc., so there is no reason the sea of management wasn't brilliant in their own way. Perhaps the researchers just weren't that good at communicating their vision for making money to management. (Jobs is really good at understanding how to make money from research vision, hence his uniqueness.)



    Quote:

    But that doesn't mean that the work done at Xerox PARC wasn't what gave us modern UIs. Without it, all of those exploratory concepts would have had to be redone by Apple, and engineers and researchers are two different kinds of folks.



    Engineers and researchers certainly are. Researchers are free of constraints. Engineers have to ship and have all the constraints in the world.



    Quote:

    This is like saying that too much credit is given to folks like Galileo, Newton, Bohr, and Einstein for generating ideas, because the CERN folks spent 100x the combined resources of all those researchers building a collider, and those guys did zero to really help CERN out.



    I don't think your analogy really fits. I'm thinking of it more as an idea for a book and the actual production of a book.



    Quote:

    Gloves with lights on the fingertips were for detection of...you guessed it...fingertips. The vertical presentation was a sop to moviemaking, because it allowed you to see Cruise while he did his thing. Same for the airplane-sized gestures, but the gestures themselves of picking, sizing, and throwing objects away are consistent with gestures developed for real UIs. The strange visualizations of a person's thoughts were part of the plot. Complaining about that is like complaining about lightsabers in Star Wars not making sense.



    So, are you saying the MR-UI is stupid or not? If the MR-UI was a sop to movie making, then how is it a good UI?



    You don't need lights or gloves to detect fingertips. Cameras do it just fine with bare hands today. And yes, I still have qualms about the gestural concepts being talked about - pick up with the hand and move around, multi-point zoom & rotate, etc. - being good UI conventions for computers.



    Obviously, it works to a point (Apple has really limited what one does) for handhelds, where the ergonomics fit and where mice/styli don't work that well or aren't applicable.



    Quote:

    A "limited" but very large market. Much of what knowledge workers do is collaborative, or at least involves presentation and discussion of information. Hence the term PowerPoint warriors. And we have these things called conference rooms that always seem to have a good number of folks in them.



    I'm a PowerPoint warrior. I don't see this helping me much. Heck, most people barely even touch the capabilities PowerPoint had in 1995, let alone 2007. The vast majority of PowerPoint charts are simply digital versions of transparency slides done 20 years ago.



    The art of presentation is still very much driven by the speaking capabilities of the presenter, where just understanding what to communicate is still the essence of the problem. It's not the tool.



    Quote:

    We're talking about multi-touch work surfaces. More in line with MS Surface than MR, but MR does have its place in multi-user collaborative settings. In portable computers this would be expressed as multi-touch tablets and touchpads.



    Hey, we're somewhat in agreement since MR-UI is not being talked about as useful UI here!



    Quote:

    No, not in the context of drawing on a Cintiq. There is a level of abstraction in translation of mouse movements to screen action that is not present in directly drawing on the screen.



    Don't know what you are getting at.



    Quote:

    The objective of direct manipulation of objects in a MT environment is to provide natural metaphors and gestures that would not require memorization of many new computer specific gestures to interact with the data.



    Natural is a rather vague term. Natural as in "that's what we do with real objects" has its place, but it is often not the case, as the digital world doesn't have many physical-world analogs. Apple itself still implements dragging physical media to the trash can to eject. That ain't no natural metaphor. It is a useful convention, so nobody cares.



    Quote:

    Review the objections of the time. Perceptually, CMD-line interfaces appear faster (and easier) to the user because the time spent in cognition is "lost" to human perception.



    I wasn't interpreting "These sound exactly like the same objections to GUIs. Typing is easier, there will be a steep learning curve, etc." to mean this. Easier to me means ease of use, and obviously a GUI would be easier to use. Faster? Well, like all things, it depends on what you are doing. Simple file operations, as Tog did testing on? Sure, it's true. But we don't live in a world where all we do is the simple stuff. Everything is a tradeoff.



    Obviously, nobody argues about it anymore, as GUIs today combine both (like Mac OS X and other Unix-like systems), so it's all moot anyway.



    My recollection was always that the CLI was hard because people had to learn commands and the CLI "language"; the GUI was easy because everything was easier to do; but the CLI was better because, once learned, it offered you the power of a programmer through its scripting capabilities, while with the GUI you were limited to what the programmer offered in the OS or apps.
  • Reply 108 of 113
    jeffdm Posts: 12,954
    Quote:
    Originally Posted by THT View Post


    This is an Apple forum. You've been here awhile and know your computer history. "Real Artists Ship" is a known saying originated by Steve Jobs to motivate the Mac team to get to the finish line. Art, that is, technology products, in the lab is great and all, but it's not art until consumers view it.



    ...

    But yes, real artists ship.



    It's a good motivating tool, but a bunch of people believing a platitude doesn't make it true, and I doubt it would hold legal water; the legal definition of prior art still stands regardless of whether it holds up to someone's playground definition.



    I really don't see anything about your argument that's convincing: you take up some odd definition of a word (I could even say a definition made to be propaganda) and then use that definition to make a point, ignoring any other possible definition, even ones that are more widely accepted. The premise is shaky at best, so the reliability of the conclusion is shaky too. I think it's quite dangerous to define art as something you would only find in a store, site or catalog, which is how your argument seems to boil down.



    There's clearly a value to shipping, but the value I see is commercial, not a validation of some definition of artistic merit.
  • Reply 109 of 113
    vinea Posts: 5,585
    Quote:
    Originally Posted by THT View Post


    You're reading into it too much. There's a fine line in semantics and I'm not saying that the authors of this demos and concept products are stupid or aren't artists,



    Right. Saying that they aren't "real artists" is exactly the same as saying that they aren't artists. That's not a "fine line in semantics"...that's a direct quote.



    Quote:

    not in a way of attacking their character. But there's a vast gulf between shipping and demonstrating.



    Yes, there is a vast gulf but one virtually impassable without going through the research lab demo part. Name one product that sprouted fully formed like Athena from Zeus' brow.



    Quote:

    All people are smart. It's just that all people are not smart together. Humans are generally distributed the same across cultures, status, money, etc., so there is no reason the sea of management wasn't brilliant in their own way.



    No, not all people are smart. As far as the brilliance of Xerox management, I'd say that the lackluster performance of the corporation during that period indicates less than brilliance on the part of the management of the time.



    Quote:

    Perhaps the researchers just weren't that good at communicating to management their vision for making money. (Jobs is really good at understanding how to make money from research vision, hence his uniqueness.)



    Or perhaps you're simply into denigrating researchers again?



    Quote:

    Engineers and researchers certainly are. Researchers are free of constraints. Engineers have to ship and have all the constraints in the world.



    Riiight. Researchers have no constraints whatsoever and never have deadlines.



    I'm an engineer (loosely speaking...I am not a PE) and I have supported research and I can tell you from personal experience that the engineering discipline is easier.



    Quote:

    I don't think your analogy really fits. I'm thinking of it more as an idea for a book and the actual production of a book.



    It fits because the fundamental techniques are developed during research. You can't build a usable multi-touch device without the fundamental research any more than you can build a linear accelerator without the fundamental research.



    Quote:

    So, are you saying the MR-UI is stupid or not? If the MR-UI was a sop to movie making, then how is it a good UI?



    I'm saying that the MR-UI is based on real UI concepts tweaked for hollywood. I'm saying that many of the gestures used are also used in real MT interfaces today. In any case, Han's interface is in use, it is vertical and uses some of the gestures in the MR-UI.



    Quote:

    You don't need lights or gloves for detection of finger tips. Cameras do it just fine with the bare hands today.



    Not as reliably or as easily.



    Quote:

    And yes, I still have qualms about gestural concepts being talked about - pick-up with hand and move around, multi-point zoom & rotate, etc. - being good UI conventions for computers.



    Good UI conventions for what he was doing on a largish surface. On a smaller work surface the gestures can be more subtle.



    Quote:

    Don't know what you are getting at.



    It's very simple. On a Cintiq you're literally drawing on the monitor because the digitizer is built into it. It's as close as pen/pencil/brush on paper as you can get in the digital medium. You look directly at where your hands are drawing on the surface.



    Likewise, direct manipulation of objects under your fingers on a desktop is more direct than manipulation of those objects via mouse. This does require a reasonably sized work surface.



    Quote:

    Natural is a rather vague term. Natural as in "that's what we do with real objects" has it's place, but it is often not the case as the digital world doesn't have many physical world analogs.



    Natural in the same sense of drawing directly on the screen with a stylus.



    Quote:

    Apple itself still implements dragging physical media to the trash-can to eject. That aint no natural metaphor. It is a useful convention, so nobody cares.



    That's a less than natural metaphor. However, dragging objects to the trash to get rid of them is a natural metaphor. For multi-touch that might be flicking them to one part of your work surface, while flicking them to another part puts them away for later use.
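A flick convention like that could be prototyped with nothing more than the release velocity of a drag. Here's a minimal sketch in Python; the speed threshold and the "trash"/"shelf" targets are invented purely to illustrate the idea, not any shipping gesture system:

```python
def flick_target(release_vx, release_vy, min_speed=800.0):
    """Interpret the release velocity (pixels/sec) of a dragged object:
    a fast leftward flick discards it ("trash"), a fast rightward flick
    files it ("shelf"), and anything slower is an ordinary drop.
    The threshold and target regions are hypothetical."""
    speed = (release_vx ** 2 + release_vy ** 2) ** 0.5
    if speed < min_speed:
        return "drop"           # too slow to count as a flick
    return "trash" if release_vx < 0 else "shelf"
```

For example, `flick_target(-1000, 0)` reads as a discard flick, while a slow release like `flick_target(100, 50)` is just a drop.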



    Quote:

    Obviously, nobody argues about it anymore as GUIs today combine both (like in Mac OS X and other Unix-likesystems) so its all moot anyways.



    OSX does not combine both which is why it is a superior unix and the most popular desktop unix ever. You CAN open a shell in OSX. Few folks ever do and the UI sure isn't designed around it unlike in other unix/linux platforms.



    That is the primary failing of Gnome and KDE as desktop environments.
  • Reply 110 of 113
    tht Posts: 6,018
    Quote:
    Originally Posted by JeffDM View Post


    It's a good motivating tool, but a bunch of people believing a platitude doesn't make it true, and I doubt it would hold legal water; the legal definition of prior art still stands regardless of whether it holds up to someone's playground definition.



    I really don't see anything about your argument that's convincing: you take up some odd definition of a word (I could even say a definition made to be propaganda) and then use that definition to make a point, ignoring any other possible definition, even ones that are more widely accepted. The premise is shaky at best, so the reliability of the conclusion is shaky too. I think it's quite dangerous to define art as something you would only find in a store, site or catalog, which is how your argument seems to boil down.



    There's clearly a value to shipping, but the value I see is commercial, not a validation of some definition of artistic merit.



    Well, if there is actual patent suit involving pinch-to-zoom, everyone will be asking if the prior art being presented is in fact prior art.



    So Apple is taking an idea for zooming and implementing it on a handheld device. How are the Xerox PARC Digital Desk and the Starfire video prior art to the iPhone's pinch-to-zoom? The Digital Desk did not implement zooming in the same way as iPhone OS X does. It's a desk-based system requiring the use of multiple hands. The Starfire video, not even an actual product, was also a desk-based concept. There appear to be no handheld devices that shipped with multi-touch before the iPhone.



    So is the idea of pinch-to-zoom patentable? No. I don't think anyone agrees that it should be. Even so, there is a lot of grey area. Is the rubber-banding and 2D scrolling in the iPhone patentable? Apple was just awarded that patent. Is the pinch-to-zoom implementation patentable? It's done differently than the prior "art" presented here. Obviously, what they did was take an idea, put a whole lot of work and money into it, and implement it on a handheld piece of hardware in a unique way.



    It's more of an extension of the multi-touch functionality in Apple PowerBook and MacBook trackpads in my mind. So, I think there is a lot of Apple/Fingerworks history in the iPhone implementation. But Digital Desk? Starfire? They were presenting ideas, and if that is prior art, then prior art is too broad.
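The two-point implementation being debated here reduces, at its core, to tracking the ratio of inter-finger distances between frames. A minimal Python sketch of that arithmetic (an illustration of the general pinch-zoom idea, not Apple's actual code):

```python
import math

def pinch_scale(p1_start, p2_start, p1_now, p2_now):
    """Return the zoom factor implied by two touch points (x, y tuples)
    moving from their starting positions to their current ones:
    the ratio of current to initial finger separation."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(p1_now, p2_now) / dist(p1_start, p2_start)
```

Spreading two fingers from 100 px apart to 200 px apart, e.g. `pinch_scale((100, 100), (200, 100), (50, 100), (250, 100))`, yields a zoom factor of 2.0; pinching them together yields a factor below 1. Note this works at any location on the surface, since only the relative distance matters.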
  • Reply 111 of 113
    tht Posts: 6,018
    Quote:
    Originally Posted by vinea View Post


    Right. Saying that they aren't "real artists" is exactly the same as saying that they aren't artists. That's not a "fine line in semantics"...that's a direct quote.

    ...

    Yes, there is a vast gulf but one virtually impassable without going through the research lab demo part. Name one product that sprouted fully formed like Athena from Zeus' brow.

    ...

    No, not all people are smart. As far as the brilliance of Xerox management, I'd say that the lackluster performance of the corporation during that period indicates less than brilliance on the part of the management of the time.

    ...

    Or perhaps you're simply into denigrating researchers again?



    Riiight. Researchers have no constraints whatsoever and never have deadlines.



    If you think that's what I'm saying, then fine. That's life I suppose.



    Quote:

    I'm an engineer (loosely speaking...I am not a PE) and I have supported research and I can tell you from personal experience that the engineering discipline is easier.



    I've been an engineer too, a project engineer today, and I deal with a lot of researchers. There's a vast difference in what the three do. Researchers provide the fundamental tools we use. No argument. But, I know for a fact that when tools are put into practice, the tools have to be effectively rewritten, using the same basic fundamentals yes, but rewritten to cover for things the original tool was never meant to do. So going from idea, to product demo, to actual product, what's been implemented changes a whole lot.



    I can't see how this is any different for the iPhone, going from idea, to demo, to product. In this case, none of the demos are even close to what the iPhone is doing other than the idea of multi-touch gestures. Basically everything is different.



    It's the same old argument in all patent discussions. Who gets the credit and what is really original or what is an invention.



    Quote:

    It fits because the fundamental techniques are developed during research. You can't build a usable multi-touch device without the fundamental research any more than you can build a linear accelerator without the fundamental research.



    People have patents on implementing fundamentals developed by the physics giants all the time. In patent cases, I don't think the work of Galileo, Newton, Bohr, Einstein, etc. is used as prior art.



    Intel probably has an untold number of patents involving CMOS chip manufacturing. I don't think the fact that Bohr and the other "fathers" of quantum mechanics developed the fundamentals Intel uses can be used as prior art to invalidate its patents.



    Quote:

    I'm saying that the MR-UI is based on real UI concepts tweaked for hollywood. I'm saying that many of the gestures used are also used in real MT interfaces today. In any case, Han's interface is in use, it is vertical and uses some of the gestures in the MR-UI.



    But this is what I was trying to convey. I said the MR-UI is crap. How is what you are saying negating what I'm saying? Are you saying the UI conventions being tweaked for Hollywood in the MR-UI produces a good UI?



    Han's device is a touch device, unless I'm confusing something. The MR-UI is a gesture "device" with a person gesturing in front of a camera. That's a pretty big gulf in ergonomics. And I didn't say Han's UI conventions are crap, just that his market is limited.



    Quote:

    Not as reliably or as easily.



    iPhoto 09 has facial recognition that can work on pets. Laptop manufacturers are already trying to use built-in webcams to recognize bare-hand gestures. I think it'll be reliable, especially when Tom Cruise is waving like he's a deck hand from Top Gun.



    Quote:

    Good UI conventions for what he was doing on a largish surface. On a smaller work surface the gestures can be more subtle.



    But my argument has been the opposite. Apple's multi-touch trackpads are finding some nice uses. They're also convenient and ergonomic. Large surfaces like a desktop monitor and touching it (like on the HP all-in-one), I'm thinking not. A mouse or a trackpad will feel better.



    Quote:

    It's very simple. On a Cintiq you're literally drawing on the monitor because the digitizer is built into it. It's as close as pen/pencil/brush on paper as you can get in the digital medium. You look directly at where your hands are drawing on the surface.



    Perhaps for specific drawing applications. That the LCD has to be as close as possible to the glass/plastic surface, virtually right on it, I buy into. Today's tech isn't there yet.



    Quote:

    Likewise, direct manipulation of objects under your fingers on a desktop is more direct than manipulation of those objects via mouse. This does require a reasonably sized work surface.



    My argument would be a mouse is ergonomically less taxing, and hence better and easier to use. Maybe it's not as natural in the way we deal with physical objects, but a mouse would be preferred because it is less taxing.



    Quote:

    OSX does not combine both which is why it is a superior unix and the most popular desktop unix ever. You CAN open a shell in OSX. Few folks ever do and the UI sure isn't designed around it unlike in other unix/linux platforms.



    Basically everyone that I work with uses both. This includes researchers and engineers. Obviously not everyone uses it, but for the ones that do, it is a big, big plus to have both environments.



    Gnome and KDE basically fail as GUI environments because they are engineered by hobbyists in their spare time, without direction on what the product should be. OS X on the other hand is pretty good, except that it uses BSD/Mach/NeXTSTEP CLI conventions instead of the Linux ones most people use.
  • Reply 112 of 113
    vinea Posts: 5,585
    Quote:
    Originally Posted by THT View Post


    If you think that's what I'm saying, then fine. That's life I suppose.



    What you write is what you write. That's life I suppose.



    Quote:

    I've been an engineer too, a project engineer today, and I deal with a lot of researchers. There's a vast difference in what the the three do. Researchers provide the fundamental tools we use. No argument. But, I know for a fact that when tools are put into practice, the tools have to be effectively rewritten, using the same basic fundamentals yes, but rewritten to cover for things the original tool never meant to do. So going from idea, to product demo, to actual product, what's been implemented changes a whole lot.



    Of course it changes. However, without doing the research you never get to the polish phase unless your products are all derivative of someone else's work.



    Quote:

    I can't see how this is any different from iPhone, to demo, to idea. In this case, all of the demos aren't even close to what the iPhone is doing other than the idea of multi-touch gestures. Basically everything is different.



    The point is that without a decade of prior research there would have been no fingerworks and no iPhone.



    Quote:

    But this is what I was trying to convey. I said the MR-UI is crap. How is what you are saying negating what I'm saying? Are you saying the UI conventions being tweaked for Hollywood in the MR-UI produces a good UI?



    I'm saying (for the 3rd time) that the MR-UI was based on UI research and served as inspiration for some UI developers working on real products, since that may have been their first exposure to gestures. Not everyone is familiar with the body of work that preceded Minority Report.



    So, as crappy as you think the MR UI is, it has had a large impact on UI designers and gestural interfaces and it is based on work from MIT, MS and other researchers.



    Quote:

    Han's device is a touch device, unless I'm confusing something. MR-UI is gesture "device" with a person gesturing in a front of camera. That's a pretty big gulf in ergonomics. And I didn't say Han's UI-conventions are crap, just that his market is limited.



    The FTIR-based multi-touch interface detects the gestures made on the surface of the screen (they appear as glowing blobs) just as the gestures are detected from the glowing LEDs...



    You seem to have difficulty in generalizing from one concept to closely related concepts.



    Quote:

    iPhoto 09 has facial recognition that can work on pets.



    Face detection is largely pattern detection of the two eyes, which are usually high-contrast objects in comparison to the face. Facial recognition is based largely on the specific patterns of detected faces, pattern-matched across multiple photos.



    Tilting your head often causes iPhoto's facial detection to fail. Faces with similar facial features (beards, baldness, etc) require quite a bit of learning for iPhoto to have a shot at correct identification. Recognizing pets probably isn't as hard given that folks tend to take pictures of fewer animals than people.



    In any case, it all depends on whether the fingers can be determined from the background. Yes, there are camera based gesture interfaces but it IS a lot harder than detecting IR LEDs and more error prone.
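The "glowing blobs" detection mentioned above is, at its simplest, brightness thresholding followed by connected-component labeling, which is exactly why IR LEDs or FTIR hotspots are easier targets than bare hands against a cluttered background. A toy sketch, assuming the camera frame is a 2D brightness grid (not any shipping tracker's actual pipeline):

```python
def find_blobs(image, threshold):
    """Label connected bright regions (4-connectivity) in a 2D
    brightness grid (list of lists of numbers). Returns a list of
    blobs, each a set of (row, col) pixel coordinates."""
    rows, cols = len(image), len(image[0])
    seen = set()
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and (r, c) not in seen:
                stack, blob = [(r, c)], set()
                while stack:  # iterative flood fill from this seed pixel
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < rows and 0 <= x < cols):
                        continue
                    if image[y][x] < threshold:
                        continue
                    seen.add((y, x))
                    blob.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
                blobs.append(blob)
    return blobs
```

Two bright fingertips in a frame come back as two separate blobs whose centroids can then be fed to gesture logic such as pinch tracking. With low-contrast bare hands, choosing `threshold` reliably is exactly the hard part.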



    Quote:

    Laptops manufacturers are already trying to use built-in webcams to recognize bare hand gestures. I think it'll be reliable, especially when Tom Cruise is waving like he's a deck-hand from Topgun.



    Trying yes. Reliable not quite yet...



    There is one technique (shown by an MS researcher) of detecting certain finger shapes (a circle) and simple hand gestures (opening and closing the circle) via webcam; it relies on the trick of forming a circle with your fingers, which provides a higher-contrast target for tracking against a static background (your keyboard or whatever).



    I should be able to replicate that behavior with the Touchless SDK but I haven't gotten around to it yet.



    Quote:

    Perhaps for specific drawing applications. The LCD has to be as close as possible to the glass/plastic surface, virtually right on the surface, I buy into. Today's tech not yet.



    It is right on the surface. You've never used a Cintiq (or probably seen one) but somehow feel qualified to say it doesn't work.



    Quote:

    Basically everyone that I work with use both. This includes researchers and engineers. Obviously not everyone uses it, but for the ones that do, this is a big big plus to have both environments.



    So what? Some folks I work with also use a shell but they're geeks. Researchers and engineers are hardly mainstream users. Heck, even many developers no longer use shells except in rare cases.



    Quote:

    Gnome and KDE basically fail as GUI environments because they are engineered by people using their spare time, hobbyists, and are without direction on what the product should be. OS X on the other hand is pretty good, except that it uses BSD/Mach/NeXTSTEP CLI conventions instead of the Linux most people use.



    Given that the BSD conventions predate the Linux ones, I dunno that that's some great failing. Not that "CLI conventions" are all that different, given it's largely dependent on which shell you choose to use.
  • Reply 113 of 113
    tht Posts: 6,018
    Quote:
    Originally Posted by vinea View Post


    I'm saying (for the 3rd time) that the MR-UI was based on UI research and served as inspiration for some UI developers working on real products, since that may have been their first exposure to gestures. Not everyone is familiar with the body of work that preceded Minority Report.



    So, as crappy as you think the MR UI is, it has had a large impact on UI designers and gestural interfaces and it is based on work from MIT, MS and other researchers.



    Ok. So, I'm perfectly content to continue to believe the MR-UI is crappy. That it is based on gestural UI research doesn't mean that it is a good implementation after all, one that you say was altered to look cool for a movie. You'd think the alterations could well have turned the gestural UI research concepts into crappy ones. And we're back to square one.



    What do you think of the MR-UI, then? Its efficacy? Its usefulness in the real world?



    Quote:

    The FTIR-based multi-touch interface detects the gestures made on the surface of the screen (they appear as glowing blobs) just as the gestures are detected from the glowing LEDs...



    I fail to be impressed. In MR, they had camera systems that could identify you from afar through a retinal scan, let alone facial recognition. In the MR world, you'd think they would be able to detect bare-hand gestures. In today's world they are trying with bare hands, yes. I still have my qualms about its usefulness over existing input mechanisms (mouse, keyboard, trackpad).



    Actually, I've done the gloved hand thing for 3D virtual worlds navigation. Almost 2 decades ago. Haven't really seen anything in the commercial market with it though.



    Quote:

    You seem to have difficulty in generalizing from one concept to closely related concepts.



    This is a patent, prior art discussion. If we spoke in generalities, there'd basically be no point in having a patent discussion. The MR-UI, the Tog Starfire video, the Digital Desk: all are interesting demos. Not much similarity to a cellphone, and not that much similarity to each other. The MR-UI used a non-touch 4-point system for zooming. The Digital Desk used a 2-point touch system at the corners with a black square, and as I said before, I didn't see it in the Starfire video to see how they did it. The iPhone uses a 2-point touch system at any location within the window.



    Not to mention that if the Digital Desk and the Starfire concept were made into shipping products, they'd pretty much be totally different from their concept forms. One wonders if even the same multi-touch algorithms would be used.



    Quote:

    Face detection is largely pattern detection of two eyes which are usually high contrast objects in comparison to the face. Facial recognition is based largely on the specific patterns of detected faces and pattern matched across multiple photos.



    Tilting your head often causes iPhoto's facial detection to fail. Faces with similar facial features (beards, baldness, etc) require quite a bit of learning for iPhoto to have a shot at correct identification. Recognizing pets probably isn't as hard given that folks tend to take pictures of fewer animals than people.



    In any case, it all depends on whether the fingers can be determined from the background. Yes, there are camera based gesture interfaces but it IS a lot harder than detecting IR LEDs and more error prone.



    Trying yes. Reliable not quite yet...



    I will await gestural recognition using gloves with LED lights, then. It'll be like a stylus, with the gloves hung on the back of the monitor. Maybe some vendor will notice that gloves with LEDs on the fingers make their webcam-based gestural input system really work, and decide to ship it.



    Quote:

    There is one technique (shown by a MS researcher) of detecting certain finger shapes (circle) and simple hand gestures (open and closing the circle) via web cam that relies on the trick of forming a circle with your fingers which provides a higher contrast target for tracking against a static background (your keyboard or whatever).



    Interesting, but why not just simplify? I.e., don't try to read my hand as if I'm using sign language; just detect vertical, horizontal, diagonal, and in-out motions plus open or closed hands. That's 18 inputs.
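That reduced input set could be sketched as a simple classifier over frame-to-frame hand motion. In the Python sketch below, the direction names, the `scale_change` cue for in/out motion, and the `eps` threshold are all invented for illustration, and it assumes a math-style coordinate convention where positive `dy` points up:

```python
import math

def classify_motion(dx, dy, scale_change=0.0, eps=0.15):
    """Map a hand displacement (dx, dy) between frames, plus the
    relative change in apparent hand size, into one of eight compass
    directions or 'in'/'out' for depth motion. All names and the
    eps threshold are hypothetical."""
    if scale_change > eps:
        return "in"      # hand growing in the frame: moving toward camera
    if scale_change < -eps:
        return "out"     # hand shrinking: moving away
    angle = math.degrees(math.atan2(dy, dx)) % 360
    dirs = ["right", "up-right", "up", "up-left",
            "left", "down-left", "down", "down-right"]
    return dirs[int((angle + 22.5) // 45) % 8]
```

Eight planar directions plus in/out, each paired with an open or closed hand, gives the kind of small, robust vocabulary the post describes, without needing full hand-pose recognition.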



    Quote:

    It is right on the surface. You've never used a Cintiq (or probably seen one) but somehow feel qualified to say it doesn't work.



    There's still a millimeter of glass or plastic shielding the LCD. I'd like it to be zero, like it is with pencil and paper.



    Quote:

    So what? Some folks I work with also use a shell but they're geeks. Researchers and engineers are hardly mainstream users. Heck, even many developers no longer use shells except in rare cases.



    "OSX does not combine both which is why it is a superior unix and the most popular desktop unix ever. You CAN open a shell in OSX. Few folks ever do and the UI sure isn't designed around it unlike in other unix/linux platforms."



    Ok, I'm obviously interpreting "OSX does not combine both" differently.



    Quote:

    Given that the BSD conventions predate the Linux ones I dunno that is some great failing. Not that "CLI conventions" are all that different given it's largely dependent on which shell you choose to use.



    No failing, no. Just a statement that most people are more familiar with Linux conventions, and moving to Mach/BSD can be a chore.