You will often encounter situations where extracting text from an image by hand is time consuming. Whether it's a quick online search, sharing specific information, or digitizing phone numbers and email addresses, an OCR app such as Google Lens can get the job done. Third-party OCR apps work fine, but we recommend a native alternative, and that's exactly why Live Text has become a favorite of iOS 15 fans. If you want to see this new feature in action, read on to learn what Live Text is and how to use it on iOS 15 on iPhone and iPad.
- 1 Use Live Text on iOS 15 on iPhone and iPad (2021)
- 1.1 What is iOS 15 Live Text? How does it work?
- 1.2 Live Text compatible iPhones and iPads
- 1.3 Languages supported by Live Text
- 1.4 How to use Live Text on iOS 15
- 1.5 Use Live Text within apps such as Apple Messages and WhatsApp on iOS 15
- 1.6 Use Live Text in Spotlight Search on iOS 15
- 1.7 Search in Safari on iOS 15 using Live Text
- 1.8 Check the file size of selected Live Text before sharing
- 1.9 Translate selected text offline using Live Text on iOS 15
- 1.10 Apple Live Text or Google Lens: Which is better?
- 2 Tips for using Live Text on iPhone and iPad like a pro
Use Live Text on iOS 15 on iPhone and iPad (2021)
This section not only describes how the Live Text feature works in iOS 15, but also shows how to use it in multiple ways, including in the Camera and Photos apps. So let's get started!
What is iOS 15 Live Text? How does it work?
Live Text is a classic example of how Apple can take inspiration from third-party products and create something more efficient and deeply integrated into the Apple ecosystem. Think of Night mode on the iPhone 11 series, which arrived after the competition but quickly won people over, or AirPods, which didn't take long to dominate the true wireless earbud segment. The Cupertino giant's knack for coming from behind and delivering more compelling alternatives to existing apps and features is well known, and Live Text follows that same playbook.
Google Lens has long been the best app for recognizing objects and animals and extracting text from images. However, Google Lens isn't as deeply integrated into the Android experience as Live Text is into iOS 15, iPadOS 15, and even macOS Monterey. Live Text works much like Google Lens: extracting text from an image (or almost anything around you) is pretty intuitive. Simply point your camera at an object to extract phone numbers, email addresses, directions for Apple Maps, and more.
Live Text can be used not only in the native Photos and Camera apps, but also in other apps such as Safari, Messages, and WhatsApp. You can even launch it from Spotlight and quickly look up something that catches your eye. More precisely, you can trigger Live Text from any text input field on your iPhone or iPad. So it's safe to say that iOS 15 Live Text is one of the best OCR (Optical Character Recognition) features around.
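For readers curious about what this kind of OCR looks like under the hood: on Apple platforms, developers can run comparable on-device text recognition through the Vision framework. Here is a minimal, hedged sketch (Apple platforms only; "photo.jpg" is a placeholder path, and this is an illustration of Vision's public API, not Apple's actual Live Text implementation):

```swift
import Vision
import CoreImage

// Minimal sketch of on-device text recognition with Apple's Vision framework.
// "photo.jpg" is a placeholder image path for illustration.
let url = URL(fileURLWithPath: "photo.jpg")
guard let image = CIImage(contentsOf: url) else { fatalError("image not found") }

let request = VNRecognizeTextRequest { request, _ in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Each observation offers ranked candidate strings; take the best one.
        if let candidate = observation.topCandidates(1).first {
            print("\(candidate.string) (confidence: \(candidate.confidence))")
        }
    }
}
request.recognitionLevel = .accurate  // slower but more precise than .fast

let handler = VNImageRequestHandler(ciImage: image, options: [:])
try? handler.perform([request])
```

Recognized strings come back with bounding boxes and confidence scores, which is how an app can make text in a photo tappable and selectable the way Live Text does.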
Live Text compatible iPhones and iPads
Unfortunately, not all iOS 15 compatible iPhones and iPads support Live Text; the feature requires the A12 Bionic chip or later. So make sure you have a compatible device before trying to use it.
Below is a list of Live Text compatible iPhone models:
- iPhone XS
- iPhone XS Max
- iPhone XR
- iPhone 11
- iPhone 11 Pro
- iPhone 11 Pro Max
- iPhone SE (2nd generation)
- iPhone 12
- iPhone 12 mini
- iPhone 12 Pro
- iPhone 12 Pro Max
The following is a list of iPad models supported by Live Text:
- iPad Pro 2018
- iPad Pro 2020
- iPad Pro 2021
- iPad 8th generation
- iPad Air 3
- iPad Air 4
Note: Live Text is also supported on the M1 MacBook Air, M1 MacBook Pro, M1 Mac mini, and the new 24-inch iMac (2021).
Languages supported by Live Text
Currently, Live Text on iOS 15 recognizes seven languages: English, French, Spanish, German, Portuguese, Italian, and Chinese. Apple plans to add support for more languages in the coming months.
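For developers, the Vision framework can report which recognition languages the current OS supports. A small sketch, assuming an Apple platform (the instance method below is available on iOS 15 and macOS 12 or later, and the exact list depends on the OS version):

```swift
import Vision

// Ask Vision's text recognizer which languages it supports on this OS.
let request = VNRecognizeTextRequest()
request.recognitionLevel = .accurate
if let languages = try? request.supportedRecognitionLanguages() {
    // Prints locale identifiers such as "en-US", "fr-FR", "zh-Hans".
    print(languages)
}
```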
How to use Live Text on iOS 15
Method 1: Use Live Text with the Camera app on iPhone and iPad
Live Text is neatly integrated into the stock Camera app on iOS 15. So whenever you come across something you want to search, copy, share, or translate, it's very easy to do.
1. Launch the Camera app on your device and point it at the object or text. Then look for the Live Text icon and tap it. Note that the icon appears in the lower-right corner in portrait orientation and in the lower-left corner in landscape orientation.
2. Live Text will now recognize the text in the image. You can then select the text and use any of the following options as needed:
- Copy: Tap it to copy the selected text.
- Select All: Tap it to select all of the extracted text.
- Look Up: Tap it to find the meaning of a selected word or search the web.
- Translate: Tap it to translate the selected text.
- Share: Tap it to share the extracted text via Mail, iMessage, or any other app. You can also save the text to the Files app.
It's worth noting that tapping a phone number brings up a context menu with multiple options such as FaceTime, sending a message, adding to Contacts, and copying. Similarly, tapping an email address gives you the option to compose an email right away.
Method 2: Use Live Text in the Photos app on your iPhone and iPad
Live Text works seamlessly in the Photos app as well. So if you come across an image (including screenshots) and want to extract text from it, it's easy to do. Follow the steps below.
1. Open the Photos app on your iPhone or iPad. Then go to the image you want to extract text from.
2. Tap the small Live Text icon (it looks like a square viewfinder with three lines) in the lower-right corner of the screen. Live Text instantly recognizes all the text in the image.
3. Then select the text you need and copy, share, look up, or translate it.
Method 3: Recognize handwriting on iOS 15 using Live Text
Live Text is just as efficient at extracting text from handwritten notes. Its OCR is very useful if you want to digitize your notes for easy access across your devices.
Just launch the Camera app on your iOS 15 device. Next, make sure the viewfinder frames the handwritten note. Then tap the Live Text button to convert the handwriting to text. Finally, select the text and do whatever you want with it.
Use Live Text within apps such as Apple Messages and WhatsApp on iOS 15
Imagine you're in a conversation with a colleague or friend and suddenly need to extract and share the text from an image. Want to get it all done without leaving the conversation thread? You can. That's the flexibility iOS 15 offers by letting you invoke Live Text from within your messaging app.
Open a messaging app such as Apple Messages. Then press and hold the text entry field and tap the "Text from Camera" option. Now use your iPhone or iPad camera to extract the text from the image and share it without leaving the messaging app. It's pretty cool.
Use Live Text in Spotlight Search on iOS 15
Spotlight has many interesting tricks up its sleeve, including support for Live Text on iOS 15. To use this feature, swipe down from the middle of the home screen to bring up Spotlight Search. Then tap the text entry field and choose the "Text from Camera" option. Now go ahead and extract text as described in the steps above.
Search in Safari on iOS 15 using Live Text
Having a smart OCR tool accessible in your browser at any time means you can recognize text and start a search without wasting time. So the next time you come across something intriguing, don't forget to try the Live Text OCR feature in Safari on iOS 15.
To use it, press and hold the text box in the search bar and tap the "Text from Camera" option. Then follow the steps above to extract text from an image or a real-world object and start a quick search.
Check the file size of selected Live Text before sharing
Interestingly, iOS 15 also lets you check the file size of selected text before sharing it. This is useful if you want to keep the selected text's file size small for smooth sharing.
1. After using the Live Text feature to recognize text on your iPhone or iPad, select the text you want to share. Next, tap the "Share" button in the pop-up context menu.
2. Then tap the file size option in the iOS 15 share sheet. That's it! The file size of the selected text is instantly displayed at the top of the screen.
Translate selected text offline using Live Text on iOS 15
Notably, you can also translate selected text offline. So even if your iOS 15 device isn't connected to the internet, Live Text translation will keep working without any problems. Just note that offline translations may not be as accurate as online ones.
To get started, make sure on-device translation is enabled on your device, and that you have downloaded the languages you want to translate.
1. Open the Settings app on your iOS 15 device. Then scroll down and tap "Translate".
2. Next, turn on the "On-Device Mode" option.
3. Now download the languages you want to translate; the supported languages are listed at the beginning of this article. Tap "Downloaded Languages" and download your preferred languages to make them available offline.
4. That's it! From now on, you can translate text even when your iPhone or iPad is not connected to the internet.
Apple Live Text or Google Lens: Which is better?
Google Lens has been around for a few years now, while Apple's Live Text feature is just getting started. So in terms of maturity, Google takes the cake. Not only that, Google Lens also has the edge in universal compatibility.
Live Text, unfortunately, does not, since it is tied to iOS 15, iPadOS 15, and macOS Monterey. Another area where Lens is well ahead is translation: it supports a whopping 103 languages, compared with the seven languages Live Text currently handles.
What favors Live Text, though, is its unparalleled system-wide integration into the Apple ecosystem, and it can only get better over time. That leaves the last frontier: efficiency. Rather than jumping to a definitive conclusion, check out Akshay's detailed comparison of Apple Live Text and Google Lens.
Tips for using Live Text on iPhone and iPad like a pro
That's it for Live Text on iOS 15. iOS 15 is still a work in progress, but Live Text already works very reliably, and by the time Apple releases the final iOS build this fall, it should be even better. Google Lens may have years of expertise behind it, but Apple's Live Text seems to have a clear advantage in ease of use and intuitiveness, at least for now. By the way, what do you think of Live Text on iOS 15? Share your thoughts, along with your favorite iOS 15 features such as Focus mode, drag and drop between iOS 15 apps, FaceTime calls with Android users, and Apple's Digital Legacy.