Live Text in iOS 15 vs Google Lens in Android 12 Comparison


iOS 15 brings a number of features to the table. One of the central improvements in the operating system is the ability to extract text out of pictures. Called Live Text, iOS 15 makes it easier for you to dial a phone number or visit a website directly from your camera. Android devices have had this feature for quite a while, called Google Lens. How does Live Text in iOS 15 compare against Google Lens on Android 12? Let's dig in.

After posting a detailed Siri in iOS 15 vs Google Assistant in Android 12 comparison last week, YouTuber In Depth Reviews posted a video comparing Live Text in iOS 15 vs Google Lens on Android 12. Before we take a look at the test results, we should keep in mind that iOS 15 is in beta right now. Apple will fine-tune the text extraction feature before the stable rollout in the fall.

Live Text vs Google Lens: Text Recognition

The YouTuber divided the test into six parts. The first one, and the most important one, is how well text recognition works on both. iOS 15 has a better text recognition system since you can search for text in images directly from Spotlight — no need to go to the image and select the text; you can simply type the text you're looking for into Spotlight.

When extracting text from images of printed material, such as contact cards and pamphlets, both Google Lens and Live Text worked just fine. However, in extracting text from images of handwriting, Google Lens took the lead. Overall, both text extraction services worked well.

Translating Text

When it came to translating text in images taken of a computer screen, both Google Lens and Live Text worked fine. However, once again, Google Lens translated the text better when it came to images of handwriting.

Visual Lookup

Visual Lookup is a feature in iOS 15 that provides you with information about a landmark directly from the Photos app. Google Lens can already do this, with the added ability to search for objects as well. As expected, Lens came out on top, since Live Text doesn't recognize objects as of now. Even in identifying landmarks in an image, Live Text made a mistake, identifying Dubai's Tolerance Bridge as Newcastle's Millennium Bridge.

It's clear from this test that Live Text is still in beta right now. Apple will further fine-tune the software in the run-up to launch, and we'll report more on this as the launch comes closer.

Eager to give iOS 15's new Live Text feature a try? You can download and install iOS 15 Beta 1 on your iPhone right now.

On a personal note, I'm still a fan of Live Text on iOS 15. The fact that you can search for any text from any image directly from Spotlight excites me. Have you tried Live Text yet? Do you prefer Google Lens or Apple's Live Text? Why do you prefer one over the other? Let us know in the comments section below!
