Comments
If, in the real world, we could use our finger to create a precise line of any pattern, shape, or level of precision, we would have no need for different tools.
Well you obviously don't know how to use it.
Disagree. For some people - including myself - the multi-touch on the Magic Mouse is a one-way trip to the RSI clinic. While I like the ability to scroll without having to hit a wheel or ball, using two fingers in a sideways motion while keeping the mouse from moving with the thumb or other fingers induces instant wrist pain.
If you have any kind of wrist pain already when mousing, I'd strongly suggest avoiding the Magic Mouse - or at least avoiding the multi-touch features.
I don't think that is a good rationale though because we can't change the shape or physical attributes of our fingers for different tasks in the real world. In the digital world we can though.
If, in the real world, we could use our finger to create a precise line of any pattern, shape, or level of precision, we would have no need for different tools.
Only by zooming in to an insane amount of precision, which robs you of the "big picture" of whatever you're working on. While you could get away with that on a Microsoft Surface-style "big screen" interface, on something that's 10in or less it will be unworkably bad.
Imagine using the tablet to draw on and your primary PC monitor to view the overall picture. I think this could work very well. I also think you would get used to drawing with your finger if you chose to do that. Though it would probably be easier for most people to just use a pen stylus, as we are used to doing. We are creatures of habit.
Well in a machine that's primarily portable, having to have a monitor attached would be pretty fatal
It's interesting, too, that people are talking about it as if it's a choice between pen OR touch. It doesn't have to be. The current generation of Wacom tablets, for example, let you use either by sensing when a pen is close and "switching off" touch sensitivity at that point. No pen, and your fingers will work.
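As a rough illustration only - this is not Wacom's actual driver code, and every name below is made up - the arbitration boils down to "pen in range means ignore fingers":

```python
# Toy sketch of pen-priority input arbitration (invented names, not any
# vendor's real API). While the pen is within hover range, finger/palm
# contacts are simply dropped, so leaning on the surface can't draw.

class InputArbiter:
    def __init__(self):
        self.pen_in_range = False

    def on_pen_proximity(self, in_range):
        # The digitiser reports when the pen enters or leaves hover range.
        self.pen_in_range = in_range

    def on_touch(self, x, y):
        if self.pen_in_range:
            return None              # palm/finger rejected while pen is near
        return ("touch", x, y)       # no pen around: fingers work normally

    def on_pen(self, x, y, pressure):
        return ("pen", x, y, pressure)

# Example: pen comes close, then a palm touch arrives and is ignored.
arbiter = InputArbiter()
arbiter.on_pen_proximity(True)
assert arbiter.on_touch(100, 200) is None
print(arbiter.on_pen(100, 200, 0.7))   # -> ('pen', 100, 200, 0.7)
```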
That pen-priority approach solved the problem that was a major issue with the stylus on the Newton: the "lean-on" problem. Newton's touch screen was passive: that is, it responded to anything that touched it, including your hand rather than the pen. Most people who write tend to lean on the surface they're writing on to a greater or lesser degree. So, with Newton, you had to train yourself not to lean on it, which disrupted your normal handwriting - and so made it more "scrawly" and less easy to recognise.
Microsoft's TabletPC solved this initially by simply dictating that all tablets had to have what's called "active" pen digitisers. That is, they didn't respond to touch, but to the proximity of electronics in the pen. This was great, because it meant you could lean on the screen - and as screens get bigger, you find people doing this more. And it massively improved the legibility of what you were writing. Coupled with a seriously good handwriting recognition engine by the time of XP SP2, and you had a very good system.
Two big problems, though. First, active digitisers were expensive, and accounted for a good proportion of the $200-500 more you'd pay for a TabletPC compared to an equivalent conventional laptop. Second, no touch with the fingers at all locked you out of using gestural stuff in the interface, and meant that if you lost your pen you were screwed.
(Microsoft later relented, and started allowing passive, resistive touch in TabletPC. This meant you could touch the screen - but in classic PC-world fashion, manufacturers all used cheap resistive touch screens to drive down the cost, which gave a really shitty user experience.)
Now, though, we have the technology to do touch which is capacitive, giving a good slick experience with fingers, AND active, which gives a good experience with active pens. In graphics tablet form it's not expensive - but I don't know how expensive it would be to integrate it into a screen.
Will Apple use it? I don't know. They could deliver something which allows you to use an active pen for great quality drawing, handwriting or diagrams but also your fingers for a virtual keyboard and that slick iPhone-style user interface.
My gut feeling is that it comes down to cost. If they can make something that combines "the world's greatest touch interface" with "the world's greatest pen interface" they'll do it - but if it's a choice between the two for cost reasons, touch will win out.
We'll see come Wednesday!
As the de facto input method, absolutely not. As an included accessory, I doubt that, too.
But there is a need for a stylus if this tablet is going to be marketed across the board, like I think it is. A stylus for signatures, for drawing in various situations, and even for replicating the annotations we do in textbooks with a simple stylus that can change from a highlighter to an underliner to a strikethrough-er(?), etc.
They do have a fairly recent patent for a capacitance stylus.
stylus is a must .. the germs alone will make all the UPS guys cringe
Well in a machine that's primarily portable, having to have a monitor attached would be pretty fatal
What I mean by that is to use the tablet as a peripheral input device when you aren't using it as a portable. So it has a dual use. When you are using it as a portable it works as a more rudimentary input device for different applications, to jot down notes, cut and paste things from documents, and do small art projects. Then you use it for more detailed work in conjunction with your primary PC.
A couple weeks ago AppleInsider reported that some who had seen the tablet were claiming that we would be surprised by the input method. Others have suggested that the tablet will make extraordinary use of multi-touch. A week ago an AI post suggested that iPhones may come with touch-sensitive panels on the back cover. This article quotes Steve Jobs emphasizing that one of Apple's keys to success is introducing intuitive new interfaces.
Speculating and crossing fingers, I suggest that the key innovation in the iPad will be multi-touch back navigation, in particular "back-typing." The key problem with tablet computers up to this point has been the input method. You have to hold the tablet and input data at the same time. A stylus (while great for some purposes) is lousy for text input. Thumb typing is slow and awkward at best. Back-typing could solve the input problem, and completely change the character and usefulness of a tablet. If Apple can make it feel intuitive and easy, they will have a monumental success.
Pick up a smallish, thin hardback book (whatever size you think the iPad will be). Hold it between your palms. Amazingly all of your fingers are free to tap on the back of the book, while your thumbs are free to tap on the front. This does not work with an iPhone because it's too small. All we need now is a typing method that makes use of these freely tapping fingers. It will need to be a new method, probably one that does not require tapping in 26 different spots to choose letters, but there is no principled reason this can't be done. The thumbs could operate space and shift and choose between sets of letters, the fingers would merely tap. I'm convinced this could be at least as fast, and as natural, as standard typing.
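Just to make the idea concrete - the banks below are completely invented, nothing like an actual Apple design - here is one way "thumbs pick a letter set, fingers just tap" could cover the alphabet:

```python
# Invented illustration of the "back-typing" idea: a thumb selects one of
# four letter banks, and each of the eight rear fingers maps to one letter
# in that bank (8 fingers x 4 banks = 32 slots, enough for 26 letters).

LETTER_BANKS = [
    "etaoinsh",   # bank 0: the most frequent letters
    "rdlcumwf",   # bank 1
    "gypbvkjx",   # bank 2
    "qz......",   # bank 3: leftovers; dots are unused finger positions
]

def back_type(thumb_bank, finger_index):
    """Letter produced by a rear-finger tap, given the active thumb bank."""
    letter = LETTER_BANKS[thumb_bank][finger_index]
    return "" if letter == "." else letter

# Thumb holds bank 0, fourth finger position taps -> 'o'
print(back_type(0, 3))
```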
I don't believe that Apple would introduce a device that didn't provide some truly unique hardware related property. This is the one I'm voting for.
Nice ideas.
I can only imagine that there would be a link between Minority Report UI technology and the tablet.
Unless of course Spielberg is somehow "joining forces" with Apple.
stylus is a must .. the germs alone will make all the UPS guys cringe
I don't think this is really aimed at that market
Yup. Credit where it's due.
Now about that Magic Mouse . . .
Err, not really - Ireland rehashed tired old data. We've looked for a tablet since 1996 or earlier; the dream of a Newton never died among millions of Mac faithful, and many also felt the tablet was supposed to come before the iPhone. So posting a topic 9 years later means zippo. Sorry.
Back in 1999 the Mac rumour sites did a series of articles about a tablet very much like the one that may come, except it was for children and the finger interface was for painting. A children's tablet product would still be nice, though. Apple sued them, by the way.
Evilmole, the big things I wonder about are parallax and the ability of the capacitive screen to capture detail when drawing. From what I have gathered, there are limitations with capacitive touch screens in terms of size, detail, and accuracy.
Also, is parallax a problem on the iPhone?
I love how the iPhone has no ridges or edges on the surface, so the drawing/touching surface is completely flat, unlike most tablets out there. But does this make the screen thicker, increasing the parallax issue on the iPhone compared to other tablets?
I don't know the answer to that (and if I did, I'd probably be working for Apple).
The trade-off that Apple makes may well be simply "this is not a product designed for detailed work". And I'd rather they did that than push the limits of what can be done at a reasonable price with current technology. I'm excited either way.
What difference does it make whether a stylus is included or not? Capacitive styli exist, and the iPhone touch screen doesn't even know the difference. Haven't any of you bought something from Apple recently with their new iPod Touch checkout system? You sign on the iPod!
No, it's not the actual stylus that matters. It's whether Apple has taken the Newton handwriting recognition tech from the 90s and developed it even further, such that it's really useful today. Plus, you'll want third-party developers to be able to tap into that technology.
It's always important to think about the big picture for these kinds of things. A capacitive stylus is no different than your finger.
Now, if only someone could come up with a nail-based touch system! It's still tactile enough and yet more precise than a fingertip....
Is that the best you can do?
I am hoping it can at least do work comparable to the Modbook, with a slightly smaller screen. Though the president of Axiotron seems to be very confident Apple isn't going that direction. I really hope he is wrong!
Don't count on it. This is going to be based off of the iPhone OS. We may not even get the simple ability to download a file through Safari. The Modbook, otoh, is an actual computer though it's a monstrosity.
Imagine feeling the sensation of pressing keys on a virtual keyboard. Or feeling the different textures in a photo.
It would be the perfect evolution of touch screens. And a killer feature people would rave about.
Don't count on it. This is going to be based off of the iPhone OS. We may not even get the simple ability to download a file through Safari. The Modbook, otoh, is an actual computer though it's a monstrosity.
What if the Tablet is used as a peripheral that works in conjunction with your primary PC though? This way it wouldn't matter if you use the iPhone OS because you are just using it as an input device. So it may not be an all in one solution like the Modbook but you get the same results with your primary PC. IMO it would be a waste of a tablet if you can't use it for dual duty like this -- media player and drawing pad -- especially for a 1000 bucks.
I don't understand why Apple hasn't yet introduced a multi-touch keyboard/mouse surface as FingerWorks did. For a company that prides itself on innovation, they're being surprisingly conservative. I was surprised that they came out with the Magic Mouse at all, instead of an external trackpad, which would have been more useful considering how good the MacBook (Pro) trackpad is.
As a geek, I find that the movement from keyboard to mouse/trackpad and back again wastes precious seconds from a day of heavy coding. That's why I've memorized as many keyboard shortcuts as possible, and why IDEs such as Eclipse have introduced so many of them, even for such mouse-intensive tasks as switching from the editor to the project view. In my close to 10-year career as a developer, I've probably spent over a solid month switching from keyboard to mouse.
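Back of the envelope, with numbers I'm guessing at rather than measuring, that "solid month" figure is plausible:

```python
# Rough sanity check of the "solid month lost to keyboard<->mouse switching"
# figure. Every number here is a guess for illustration, not a measurement.

seconds_per_switch = 2      # hand travels to the mouse and back
switches_per_day = 200      # a heavy coding day
workdays_per_year = 230
years = 10

total_hours = seconds_per_switch * switches_per_day * workdays_per_year * years / 3600
working_days = total_hours / 8
print(f"~{total_hours:.0f} hours, about {working_days:.0f} eight-hour days")
# -> ~256 hours, about 32 eight-hour days: a bit over a working month
```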
I'm hoping Apple is working on this, and is just doing its due diligence in resolving the design, ergonomic, and QA issues inherent in such a device.