On-device processing key to iPadOS Scribble's success, hints Apple SVP Craig Federighi

Posted in iPad, edited September 2020
Apple's handwriting recognition for the Apple Pencil relies on analyzing strokes, an interview with Craig Federighi reveals, while new features such as iPadOS' Scribble depend on substantial on-device machine learning.




Introduced as part of iPadOS 14, Scribble lets users fill out text fields and forms with the Apple Pencil, without needing to type anything. It accomplishes this by processing handwriting on-device rather than relying on cloud services, and by using machine learning to improve accuracy.

Speaking to Popular Mechanics, Apple SVP of software engineering Craig Federighi explains how the Apple Pencil's handwriting recognition was developed. It all started with data-gathering: asking people around the world to write things down.





"We give them a Pencil, and we have them write fast, we have them write slow, write at a tilt. All of this variation," said Federighi. "If you understand the strokes and how the strokes went down, that can be used to disambiguate what was being written."

Combining stroke-based recognition with character and word prediction means a lot of processing has to take place. Because speed is of the essence, cloud-based processing of handwriting was ruled out, pushing Apple toward a system built on on-device processing.

"It's gotta be happening in real time, right now, on the device you're holding," insists Federighi., "which means that the computational power of the device has to be such that it can do that level of processing locally."

Apple's expertise in chip design has led to the new iPad Air 4 shipping with the A14 Bionic, Apple's fastest self-designed SoC. It packs 11.8 billion transistors, a six-core CPU, a new four-core graphics architecture, and a 16-core Neural Engine capable of up to 11 trillion operations per second. Apple has also added CPU-based machine learning accelerators, which make machine learning tasks run up to ten times faster.
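Taking the article's 11-trillion-operations figure at face value, a quick back-of-envelope calculation (illustrative arithmetic only, not an Apple benchmark) shows how much Neural Engine headroom is available within a single 120Hz display frame:

```python
# Specs quoted in the article; the per-frame math is just arithmetic.
neural_engine_ops_per_sec = 11e12     # 16-core Neural Engine: 11 trillion ops/s
frame_ms = 1000 / 120                 # one 120Hz frame: ~8.3 ms

ops_per_frame = neural_engine_ops_per_sec * frame_ms / 1000
print(f"{ops_per_frame:.2e} ops available per frame")  # ~9.17e+10
```

Roughly 90 billion operations per frame is ample room for a handwriting model to run on every pen sample without the user ever noticing.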

Comments

  • Reply 1 of 8
jas99 Posts: 150 member
    Only Apple. Well done, again. 
  • Reply 2 of 8
    How else would something like that be done?  Server side processing would be too slow.  It’s got to be near instantaneous.
  • Reply 3 of 8
    How else would something like that be done?  Server side processing would be too slow.  It’s got to be near instantaneous.
    The new Siri translate app hits a server somewhere. I'd bet that Alexa uses a server.

    I'm glad that Apple is improving the technology. If I'm driving in the middle of nowhere, and give a voice command I'd like it to work. Same for turning off the lights at home. This is why the A14 has a neural engine twice as big as the A12. An interesting question about Apple Silicon is how much of a neural engine will be included.

  • Reply 4 of 8

    ph382 said:
    The new Siri translate app hits a server somewhere.

    Actually it doesn't.  The whole translation happens on-device, offline.
  • Reply 5 of 8
    How is the performance on an original iPad Pro?
  • Reply 6 of 8
mjtomlin Posts: 2,673 member
    dysamoria said:
    How is the performance on an original iPad Pro?
    Works pretty good! I wrote this with the pencil on my original iPad Pro! And I have crappy handwriting
  • Reply 7 of 8
elijahg Posts: 2,759 member
    Now if only Siri could do the same, I might actually begin recommending it to people.
  • Reply 8 of 8
Have they moved calendar appointments, reminders, and calls to on-device processing? I use a VPN that can sometimes break Siri's connection, and if the processing happened on-device, the call itself would still go through.