My iPad Pro gets here Monday -- the Pencil in the first 2 weeks of December.
Can the iPad Pro / Pencil combination detect specific locations on the screen by hovering, à la menu hints?
I don't think that there is a developer API for the pen -- yet.
But I would like to see a CAD app where the Pencil can be used to hover over and detect/highlight an edge or surface of a drawn component, say a box -- and then when pressed, allow the user to drag the edge or surface to resize the box.
Do any of the Pencil-aware apps suggest that this can be done?
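For what it's worth, the edge-picking half of that idea doesn't actually depend on hover -- here's a minimal, platform-neutral sketch in Python (all names hypothetical, nothing from Apple's API) of how an app could decide which edge of a box a touch landed near:

```python
# Sketch of the edge-picking idea: given a pointer position, find which
# edge of an axis-aligned box is within a small tolerance, so the app
# could highlight it (on hover, if hardware allowed) or grab it on press.

def pick_edge(box, x, y, tol=8.0):
    """box = (left, top, right, bottom); returns 'left', 'right',
    'top', 'bottom', or None if no edge is within tol points."""
    left, top, right, bottom = box
    # Vertical edges: pointer must be alongside the box vertically.
    if top - tol <= y <= bottom + tol:
        if abs(x - left) <= tol:
            return "left"
        if abs(x - right) <= tol:
            return "right"
    # Horizontal edges: pointer must be alongside the box horizontally.
    if left - tol <= x <= right + tol:
        if abs(y - top) <= tol:
            return "top"
        if abs(y - bottom) <= tol:
            return "bottom"
    return None
```

Same logic whether the trigger is a hover event or a physical press; only the event source would differ.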
You worthy eraserhead commenters are missing a very big, maybe a little-too-subtle point about digital ink or graphite.
That is, digital erasing is a magical, non-physical process, just like laying down a mark is when it's made of electrons or photons under glass. So you touch a software switch and you undo your electronically recorded line by touching with the same fine signaling tool the input sensors that first recorded the line. You remove no material.
Your wish to trigger this line or smudge removal process with a fat, circular "eraser" on the other end of the electronic pencil would make a hideous mockery of the fundamental breakthrough that digital drawing is.
It would be like insisting that the maker of your horseless carriage supply you with a whip to urge the motor to go faster on hills.
Disagree. You have to flip the thing over and position it EXACTLY opposite of the way it's positioned for tip-down input: that's about as different as it gets without actually dropping the thing. That additional shift in the position of the device interrupts workflow; granted, for leisurely use it's perhaps not enough to be a significant issue.
Quote:
Can the iPad Pro / Pencil combination detect specific locations on the screen by hovering à la menu hints? ... Do any of the Pencil-aware apps suggest that this can be done?
Do you mean hovering the Pencil over the screen without touching it? I see no support for that anywhere in any apps. I don't think it's possible with the current hardware. This is meant for interacting with the screen physically.
Take this with a grain of salt, of course (I do realize he's disliked in the comments), but Ming-Chi Kuo got a lot of stuff right about the Pencil before anyone else. So if you believe what he says, he seems to think a more advanced Pencil with additional sensors (which could potentially allow writing on surfaces other than the iPad, or hover gestures) is coming in a few years.
http://appleinsider.com/articles/15/01/18/kgi-apple-likely-to-launch-stylus-to-enhance-upcoming-129-inch-ipad-user-experience
Does the iPen work well with screen protectors such as InvisibleShield?
I do not have a screen protector to test. However, considering the Pencil works fine with a sheet of printer paper on top of the iPad screen, I feel pretty confident that any screen protector will also work just fine.
Quote:
Do you mean hovering the Pencil over the screen without touching it? I see no support for that anywhere in any apps. I don't think it's possible with the current hardware.
OK, no hovering.
Can you press and move the Pencil without writing anything -- kind of like drawing with invisible ink? If so, I infer that the Pencil (when pressed) is causing the iPad Pro to register exact x,y screen locations (down to a pixel level?) to the APU. Thus it would be possible to press/select an edge or surface then drag it ... It may be more than the iPad Pro, alone, can handle to keep track of the underlying drawing on the screen -- but an iPad Pro connected to a Mac could do the job.
This would work with CAD, FCPX, Photoshop, etc, apps running on the Mac and using the iPad Pro / Pencil as a graphics tablet ala Wacom.
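The press-then-drag part of that inference can be sketched the same way -- a minimal, platform-neutral Python sketch (hypothetical names, not Apple API) of applying each drag position to whichever edge was selected at press time:

```python
# Sketch of press-then-drag resizing: once an edge of the box has been
# selected (by whatever hit-test the app uses), each subsequent move
# event shifts just that edge, clamped so the box keeps a minimum size.

def resize_box(box, edge, x, y, min_size=1.0):
    """box = (left, top, right, bottom); edge names the selected edge;
    (x, y) is the current drag position. Returns the resized box."""
    left, top, right, bottom = box
    if edge == "left":
        left = min(x, right - min_size)
    elif edge == "right":
        right = max(x, left + min_size)
    elif edge == "top":
        top = min(y, bottom - min_size)
    elif edge == "bottom":
        bottom = max(y, top + min_size)
    return (left, top, right, bottom)
```

The app would call this on every move event between touch-start and touch-end, redrawing the box each time.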
Quote:
So if you believe what he says, he seems to think a more advanced Pencil with additional sensors (which could potentially allow writing on surfaces other than the iPad, or hover gestures) is coming in a few years.
Interesting -- he gets some things right and a lot of things in the ballpark.
Quote:
Can you press and move the Pencil without writing anything
I'm not sure what you're asking here. If you use an app that supports drawing and it's in the proper mode, pressing and moving the Pencil draws a line. Tilt the pencil and it can offer shading or thicker strokes, if the app supports it (Apple's Notes, for example).
If you're not in a drawing mode or drawing app, then the pencil basically just replicates your fingertip. It can scroll, rotate, and interact with objects onscreen just how you'd expect.
Quote:
Originally Posted by Dick Applebaum
Interesting -- he gets some things right and a lot of things in the ballpark.
Back in January, he undersold the final capabilities of the Pencil (it does have gyroscopes for that tilt sensing, for example), but overall he was pretty accurate. He was first to reveal the stylus itself, first to reveal Lightning connector for charging, and he knew that it would be an entirely optional accessory. Oh, and he nailed the exact final size of the iPad Pro itself, as well.
I think it's cool when AI authors engage the readership. Answering good questions and adding additional information, like @nhughes is doing here is very helpful.
Quote:
If you're not in a drawing mode or drawing app, then the pencil basically just replicates your fingertip.
I checked the Developer docs and they added Pencil APIs. So, it appears that when you touch the display with the Pencil you get the exact x,y pixel locations from touch-start to touch-end. You can draw a line, or not.
The Apple Pencil movie:
[VIDEO]
At about 0:50 it shows the info available; at about 1:23 it shows combining finger input and Pencil input.
This means that something like a CAD app (or other Pro app) could be developed:
- totally within the iPad Pro (with RAM being the limiting factor)
- using a Mac with the iPad Pro as a graphics tablet
Guess I'll have to break out my developer jacket and see what I can accomplish by the time my Pencil arrives ...
This is really something!
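What those touch-start to touch-end events amount to, logic-wise: here's a minimal, platform-neutral Python sketch (hypothetical names and event shape, not the actual Pencil API) of collecting pencil strokes from a mixed stream while finger touches are left for other gestures:

```python
# Sketch of consuming the touch stream described above: each event
# carries a phase ('began'/'moved'/'ended'), a source ('pencil' or
# 'finger'), and an (x, y) position. Pencil strokes are accumulated
# from touch-start to touch-end; finger events are skipped here
# (in a real app they'd drive pan/zoom instead).

def collect_pencil_strokes(events):
    """events: iterable of (phase, source, x, y) tuples.
    Returns a list of strokes, each a list of (x, y) points."""
    strokes, current = [], None
    for phase, source, x, y in events:
        if source != "pencil":
            continue  # finger input handled elsewhere
        if phase == "began":
            current = [(x, y)]
        elif phase == "moved" and current is not None:
            current.append((x, y))
        elif phase == "ended" and current is not None:
            current.append((x, y))
            strokes.append(current)
            current = None
    return strokes
```

Whether the app then renders the stroke as ink or uses it as an invisible select/drag gesture is entirely up to the app -- which is exactly the "draw a line, or not" point.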
Quote:
I think it's cool when AI authors engage the readership.
Hear! Hear! ... and doing it without an alias!
If you can't right-click while holding the Pencil a few millimeters above the screen then it's a failure. /s
I like my Wacom stylus (for the eraser, for the OS X support... I guess for the buttons, though I don’t really use them), so I’m excited to try out the Apple Pencil.
Anyone know if Best Buy has the iPad Pro on display and if they have any demo Apple Pencils? I assume Apple Stores themselves have both, right?
Of course the eraser at the end is something that has been thought about. It has existed for a really long time in real-life pencils, quite a while in dedicated drawing pads, and more recently in the Surface line-up. Could be an engineering issue, I suppose, but if it's already been done successfully, then I am sure the folks working at Apple could have figured it out as well.
You are correct that attaching magnetically or inside makes little sense given the thickness of the Pencil and iPad Pro. There are no doubt going to be plenty of cases with a Pencil holder if they don't already exist. I can imagine there will be at least one company making a keyboard cover with an integrated Pencil holder.
I bet next year's Apple Pencil and/or third-party solutions will offer an eraser at the end. It does seem like a teeny bit of an oversight, but nowhere near a deal breaker if you are in the market for the iPad Pro and Pencil.
But that's my whole point. The eraser has already been done. There's no way that Apple didn't think about it. So either they had engineering challenges or they decided against doing it for whatever reason.
Quote:
I checked the Developer docs and they added Pencil APIs. ... This is really something!
Does this address some of your questions? Links to a video for the Umake sketching app.
Someone emailed and asked about pressure sensitivity, said they couldn't get it to work. Just showing here that it works fine when pencil shading or using the marker in Apple's Notes.
I PRAY that Apple will make the next gen iPad and iPhone compatible with the Apple Pencil.
I was a long-time Palm user and appreciate pen input.
Hope Apple also provides handwriting input.
I doubt it will be compatible with the iPhone. Maybe the Plus model, but somehow I doubt it. The iPad mini, possibly; the iPad Air, definitely. There's additional technology Apple has to build into the hardware to enable it. That would unnecessarily drive up the cost of the phone, which is meant to be portable -- and thus you wouldn't be carrying around a pencil anyway -- and it would depart from Apple's current mantra that this is not a stylus but a drawing tool. Perhaps Apple will eventually give in to pressure from users and create a Pencil mini that can be used as a stylus with the phone, but right now I think they'd have to eat a lot more crow than they'd like to justify it, and it would seriously dilute their marketing message.
For example if I have a Word document can I scribble on it?
If I have an email or photo can I do the same?
What about a PDF file?
Not sure what "universal markup ability" would be good for if the app does not have a means to save the annotations, etc.
MS Word has been updated with Pencil support (and it works great), but you need the paid subscription version. PDF Expert works fine, too. Not clear how scribbles on an email could be saved back to the email server.
Perhaps OT, but I just got to see the iPad Pro in person. It's badass. It isn't for me, because I'm not an artist or creator of content, but I can appreciate the cool factor for a specific consumer. I am sure they will sell many.