iOS 15 Live Text feature can copy and paste text from any image

in General Discussion edited June 2021
Live Text is a new feature in iOS 15, iPadOS 15, and macOS Monterey that allows users to select, translate, and search for text found within any image.

One of the new features announced at WWDC this year was Live Text, which is coming to iOS 15, iPadOS 15, and macOS Monterey and allows users to select, copy, look up, and even translate text found in any image.

The demo used during the keynote was a meeting whiteboard with handwritten text. When you open the Camera app on an iPhone and point it at the whiteboard, a small indicator appears in the bottom-right corner showing that text has been recognized in the viewfinder.

Tapping the indicator then allows you to copy or share the text wherever you'd like. During the demo, Apple VP Craig Federighi pasted the text into a Mail message with its bullet formatting intact.

Not only does this work when capturing a photo, but Live Text can also be used with any photo in your Photos app to select and share text. Any time the indicator appears in the lower-right corner of the screen, text can be selected from the image.

Combined with system-wide translation in the new versions of iOS, iPadOS, and macOS, users can select text in a photo and tap the Translate command to read it in another language without ever leaving the image. For those traveling internationally, this feature could assist with reading street signage, menus, and many other use cases bound only by imagination.

If an image includes a phone number or street address, users can tap on the text in the image to make a call or search for a location. Even a restaurant sign can prompt a search in maps for that place of business, and getting directions is just a few taps away.

Live Text will also make words captured in a photo searchable with Spotlight. Searching for a word or phrase will reveal photo results which users can tap to jump directly to that photo in their library.

In addition to text features, Apple has incorporated a look-up feature for objects and scenes. When pointing the camera or viewing a photo of art, books, nature, landmarks, and pets, the device will display relevant information with links to learn more.

Live Text will recognize seven languages at launch: English, Chinese, French, Italian, German, Spanish, and Portuguese. The feature also requires the A12 Bionic chip or newer, which debuted in the iPhone XS.



  • Reply 1 of 9
KBuffett Posts: 95 member
    It would be great if one could search text from photos in one’s library.
  • Reply 2 of 9
jkichline Posts: 1,369 member
    KBuffett said:
    It would be great if one could search text from photos in one’s library.
I believe they demonstrated that you can, as text in images is indexed by Spotlight.
  • Reply 3 of 9
amar99 Posts: 181 member
    KBuffett said:
    It would be great if one could search text from photos in one’s library.

Yes, pretty sure they covered that feature in the keynote.
  • Reply 4 of 9
mistergsf Posts: 241 member
I've been using a third-party app on my Mac called TextSniper to capture text from images. It's great to have Live Text since it will work on all my Apple devices. Looking forward to it!
edited June 2021
  • Reply 5 of 9
Japhey Posts: 1,767 member
    KBuffett said:
    It would be great if one could search text from photos in one’s library.
    It specifically says in the article that you can. It was also in the keynote. 
  • Reply 6 of 9
Japhey Posts: 1,767 member
This was one of my top 3 favorite features announced at WWDC. I know Google has something similar, but I don't use Google, so this is a big deal. It's an obvious precursor to whatever AR ambitions Apple may have, and I can't wait to see how they bring it to their HUD next year (hopefully). By then it will have had a year in the wild to mature and should be quite spectacular.
  • Reply 7 of 9
CheeseFreeze Posts: 1,249 member
I'm hoping for another simple feature that currently requires an app: "Paste unformatted," which removes any formatting and pastes plain text.
  • Reply 8 of 9
You don't have to snap a pic; you can just tap the icon in the lower-right corner. Also, when texting, you can tap the text box and you'll get a pop-up that says "Text from Camera." Nice. It'll copy text from an image right into your text message.
  • Reply 9 of 9
Geo_tm Posts: 1 member
Alternative title: iPhone users have never heard of Google Lens, and rely on their (fake) god to add an already-existing feature to their phones.