Ever wanted to copy text from an image or a poster, or to look up its meaning and translation? Apple's new image recognition feature, Live Text, makes that a lot easier. It can also recognize objects, plants, animals, and monuments to help you understand the world around you.
But how well does it actually work? I decided to compare Apple's Live Text with Google Lens, which offers similar functionality and has been around much longer. Let's learn more about both and see how each performs.
Contents
1. Live Text and Google Lens: Compatibility
2. Live Text and Google Lens: Features
3. Live Text and Google Lens: Text Recognition
4. Live Text and Google Lens: Translation
5. Live Text and Google Lens: Visual Lookup
6. Live Text and Google Lens: Ease of Use
7. Live Text and Google Lens: Accuracy
8. Live Text and Google Lens: Privacy
9. Verdict: Which is better, iOS 15 Live Text or Google Lens?
Live Text and Google Lens: Compatibility
First, Live Text is exclusive to the Apple ecosystem and is compatible with:
- iPhone with an A12 Bionic chip or later, running iOS 15
- iPad mini (5th generation) or later, iPad Air (3rd generation, 2019) or later, iPad (8th generation, 2020) or later, or iPad Pro (2020) or later, running iPadOS 15
- Mac with an M1 chip, running macOS Monterey
On the other hand, Google Lens is available on both iOS and Android devices. In this article, we compare Live Text on iOS 15 with Google Lens, both running on the same iPhone 11.
Live Text and Google Lens: Features
Live Text is essentially Apple's answer to Google Lens, so the two offer many similar features. Let's take a closer look at both.
What is iOS 15 Live Text?
Live Text on iOS 15 adds smart text and image recognition capabilities to your iPhone camera. You can use it to extract text from an image, either by pointing the camera at the target material or by recognizing the text in an image in your photo library. You can then look up the recognized text online, copy it, translate it into a supported language, or share it.
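Apple hasn't documented exactly what powers Live Text, but its Vision framework has offered on-device text recognition to developers since iOS 13, which gives a feel for the technique. Here's a minimal Swift sketch using VNRecognizeTextRequest; treat it as an illustration, not Live Text's actual implementation:

```swift
import UIKit
import Vision

// Minimal sketch: on-device text recognition with Apple's Vision framework.
// This illustrates the kind of OCR behind Live Text; Apple doesn't document
// Live Text's internals, so this is an analogy, not its source.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }
        for observation in observations {
            // Each observation offers ranked candidate strings with confidences.
            if let best = observation.topCandidates(1).first {
                print(best.string, "(confidence: \(best.confidence))")
            }
        }
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.recognitionLanguages = ["en-US"]

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```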
Live Text also includes Visual Lookup, which lets you find details about objects, animals, plants, and places by pointing the camera at them or analyzing their photos.
What is Google Lens?
Google Lens launched in 2017 as an app preview preinstalled on the Google Pixel 2. It then rolled out as a standalone app for Android smartphones and is now integrated into the camera app on many high-end Android devices.
On iOS, Google Lens lives inside the Google app. It provides text recognition, translation, object and place recognition, and barcode and QR code scanning, and it can even help students with homework questions.
Since it's been around for a few years, Google Lens is currently more mature than Apple's Live Text and works much better. We tested and compared the key features of both; the results are described below.
Live Text and Google Lens: Text Recognition
After multiple tests with different types of text, we can conclude that Live Text is currently hit or miss. Sometimes it works, and sometimes it simply doesn't recognize the text.
This is true whether you try to use it with an image of text or point the camera directly at the text.
Also, handwriting recognition didn't work at all. This is probably because iOS 15 is still in beta, and we look forward to testing text recognition again when the official release arrives.
To use Live Text, tap the Live Text icon in the lower right of the image. This icon only appears if the system detects text in the image.
To recognize text in the camera viewfinder, tap the general area of the text, then tap the yellow icon at the bottom right of the screen.
As mentioned earlier, iOS 15 is currently not very good at recognizing the presence of text in images, so it works intermittently.
On the occasions when text recognition worked well, I was able to select the text and see which contextual actions I could perform. For example, I could choose Copy, Select All, Look Up, Translate, or Share.

Similarly, when Live Text recognized a time, I was offered the option to add a reminder to my calendar.

However, despite multiple attempts, Live Text did not recognize phone numbers or addresses.
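These misses are a little surprising, because iOS has long shipped a built-in data detector that developers can run over any recognized text to find exactly these kinds of items. A minimal Foundation sketch (the sample string is invented for illustration):

```swift
import Foundation

// Minimal sketch: scanning recognized text for actionable items (dates,
// phone numbers, addresses) with Foundation's NSDataDetector.
// The sample string below is made up for illustration.
let text = "Dinner at 7:30 PM. Call 555-0123 or visit 1 Infinite Loop, Cupertino, CA."
let types: NSTextCheckingResult.CheckingType = [.date, .phoneNumber, .address]
let detector = try! NSDataDetector(types: types.rawValue)

detector.enumerateMatches(in: text,
                          options: [],
                          range: NSRange(text.startIndex..., in: text)) { match, _, _ in
    guard let match = match else { return }
    switch match.resultType {
    case .date:        print("Date:", match.date ?? "?")
    case .phoneNumber: print("Phone:", match.phoneNumber ?? "?")
    case .address:     print("Address:", match.addressComponents ?? [:])
    default:           break
    }
}
```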
In comparison, Google Lens does an extraordinary job of recognizing all kinds of text, including handwriting. It also offered contextual actions for phone numbers, web addresses, street addresses, and more. So at the moment, it is the clear winner and genuinely useful.

Live Text and Google Lens: Translation
Apple Live Text currently supports only seven languages, while Google Lens works with all 108 languages supported by Google Translate.
Google Lens overlays the translation directly on the text in the image, while Live Text displays the translation below the image.

Both looked pretty accurate, but Google Lens has the added benefit of translating handwriting. This is pretty cool.
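For the curious, the Google Translate engine behind those 108 languages is also available to developers through Google's Cloud Translation v2 REST API. A minimal Swift sketch, assuming you have a valid Cloud API key ("YOUR_API_KEY" is a placeholder):

```swift
import Foundation

// Minimal sketch: translating a string with Google's Cloud Translation v2
// REST API. "YOUR_API_KEY" is a placeholder, not a real credential.
let apiKey = "YOUR_API_KEY"
var components = URLComponents(string: "https://translation.googleapis.com/language/translate/v2")!
components.queryItems = [
    URLQueryItem(name: "key", value: apiKey),
    URLQueryItem(name: "q", value: "Xin chào"),   // text to translate
    URLQueryItem(name: "target", value: "en"),    // target language code
]

var request = URLRequest(url: components.url!)
request.httpMethod = "POST"

URLSession.shared.dataTask(with: request) { data, _, error in
    guard error == nil, let data = data,
          let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
          let payload = json["data"] as? [String: Any],
          let translations = payload["translations"] as? [[String: Any]]
    else { return }
    // Expect something like "Hello" for the Vietnamese greeting above.
    print(translations.first?["translatedText"] ?? "")
}.resume()
```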

Live Text and Google Lens: Visual Lookup
Visual Lookup is a feature that can recognize objects, monuments, plants, and animals in a photo. To use it, tap the small "i" icon below the image.
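Apple doesn't expose a public Visual Lookup API in iOS 15, but the Vision framework ships a general-purpose image classifier that hints at the underlying technique. A minimal Swift sketch (an analogy, not Visual Lookup's actual implementation):

```swift
import Vision

// Minimal sketch: general image classification against Vision's built-in
// taxonomy. Visual Lookup has no public API in iOS 15; this only
// illustrates the idea of labeling what appears in a photo.
func classify(_ cgImage: CGImage) throws {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    let observations = request.results as? [VNClassificationObservation] ?? []
    // Print the most confident labels from Vision's taxonomy.
    for observation in observations.prefix(5) where observation.confidence > 0.1 {
        print(observation.identifier, "(confidence: \(observation.confidence))")
    }
}
```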
Identifying objects
Unfortunately, this feature doesn't seem to work in the iOS 15 beta. I tried various easily identifiable objects, such as iPhones and books, but didn't get any results.

As shown in the screenshot below, Google Lens had no problem identifying things.

Identifying landmarks
Apple Live Text did not return any results when I tried to identify a place in a photo.
Google Lens, meanwhile, had no trouble identifying landmarks, such as those in my photo from Da Nang, Vietnam.

Identifying plants and flowers
Again, this didn’t work with Apple Live Text, but Google Lens was able to accurately identify flowers and plants.

Identifying animals
Like all the other Visual Lookup features in Live Text, this doesn't seem to work in the iOS 15 beta. As for Google Lens, it was able to identify my cat, though it was less sure about the breed.

Live Text and Google Lens: Ease of Use
Most Live Text features aren't working properly right now, but one area where it wins is ease of use. It's integrated into the iPhone camera app, so it's very intuitive and convenient. I'm sure it will be a hit once the official version of iOS 15 is released and Live Text is at its best.
In comparison, Google Lens is a bit clunky to access: you have to open the Google app and tap the Google Lens icon. Then you need to swipe to select the specific feature you want to use.

Live Text and Google Lens: Accuracy
At the moment, Google Lens is much more accurate than Apple Live Text at recognizing text, handwriting, addresses, phone numbers, objects, locations, and more. This is because Google has more data and has had years to make Google Lens incredibly intelligent.
Still, I'm confident that we'll see some significant improvements over the next year, once Live Text is officially released.
Live Text and Google Lens: Privacy
Google is known for tracking and using data to personalize the experience, improve its services, and drive its AI development further. So it goes without saying that data about what you search for using Google Lens is stored.
We didn't find much information about Live Text's privacy specifically, but Apple prioritizes privacy across its products, and that approach will probably extend to Live Text. So you most likely won't have to worry about your data being stored on a server or shared with third parties. Watch this space for updates after the official public release of iOS 15.
Verdict: Which is better, iOS 15 Live Text or Google Lens?
This concludes our comparison of Google's and Apple's image recognition tools. Since Google Lens has been around for several years, it's more sophisticated and more capable than Live Text when it comes to recognizing and translating text, searching for things and places online, and more.
But when it comes to ease of use, integration with the Apple ecosystem, and of course privacy, nothing beats Apple. The convenience that Google Lens offers comes at the expense of your data.
So while Apple has some catching up to do in image recognition technology, Live Text is a great built-in feature for Apple users, poised to enhance how they use their iPhones and other Apple devices. What do you think? Let us know in the comments below.