I’ve been using Google Lens for as long as I can remember. Whether I’m trying to figure out which version of a Nike running shoe I’m looking at or if the plant I just brushed up against is poison ivy, I love to whip out Google Lens for a quick answer. Then, when Apple announced Visual Intelligence for the iPhone 16 as part of its Apple Intelligence rollout, I was intrigued. It looked almost exactly like the experience I got from Google but with additional answers powered by ChatGPT. I waited patiently for the public beta of iOS 18.2 to arrive and set out to explore the latest chapter in the battle of Apple vs Google. Here’s how Apple Visual Intelligence compares to Google Lens.
Oh hey, a use for the Camera Control
I’ve had an interesting journey with the iPhone 16 Pro’s Camera Control so far. After about a week, I went from having high hopes for it to largely ignoring it, deciding that Apple had some work to do before I was ready to love a manual control. Well, someone at Apple must have listened to my complaints, because the easiest (and, in fact, only) way to activate Apple’s Visual Intelligence is a long press of that very same Camera Control.
Yes, two versions of Apple’s Google Lens alternative launched with iOS 18.2 — one for iPhone 16 users and another for everyone else (or at least those with an iPhone 15 Pro). If you have an iPhone 16, you’ll get a version of Visual Intelligence that works partly on-device, identifying whatever your camera sees and searching Google for related images. Otherwise, you’ll get an off-device version powered by ChatGPT instead — more on that in a minute.
So far, I’m loving Apple’s hardware-based activation of Visual Intelligence. It requires a long enough press that you probably won’t trigger it by accident, unlike Type to Siri, which I set off accidentally most of the time. While using Visual Intelligence, the Camera Control still works as a manual camera control, letting you zoom in or out and press the shutter to search for an image, though it seems to need a slightly firmer press than in the camera app itself. Then again, that might be my imagination, since I’m used to a half-press letting me switch between Photographic Styles, depth of field, and exposure rather than just triggering the shutter.
Google Lens | Apple Visual Intelligence
I might even prefer Apple’s Camera Control trigger to Google’s current Lens setup. I keep jumping into the Pixel Camera app on my Pixel 9 Pro, expecting to activate Google Lens with a simple tap, only to remember that it’s not there. Instead, I’ve had to train myself to reach for the Google Assistant widget that lives across the bottom of my home screen. Technically, it’s no harder to reach than the Pixel Camera app itself, but I just keep reaching for the wrong one.
Questions? Ask ChatGPT
If you’ve ever used Google Lens, diving into Apple Visual Intelligence will probably feel familiar. The overall interface is almost identical, and the process of pointing your camera and pressing the shutter button is as familiar as can be. In fact, you might even get some of the same results when you search, because both platforms use Google as their default search engine. However, it’s how the two visual search tools behave when you need more than a basic image search that sets them apart.
With Google Lens, you get three options by default — the Lens button, Google Translate, and a homework helper for math equations. Each has its purpose, and they’re all lined up along the bottom edge for easy access. So far, I’ve used two of the three — I don’t have kids, nor am I a student — but I’ve found them about as straightforward as can be. Lens itself works exactly as you’d expect (point your camera and search), but the Translate feature is a lifesaver. I’ve used it for everything from translating foreign Pokemon cards to getting an English version of a Dutch menu, and it works smoothly every time. You can also fall back on Google to look for more information about whatever Lens is seeing.
ChatGPT in Visual Intelligence is like Google Translate, homework help, and more all rolled into one.
On the other hand, Apple Visual Intelligence offers just two buttons — one for Search and another for Ask. For the most part, Search feels a lot like using Google Lens, but it’s slightly more automated. If you point it at a restaurant or shop, Visual Intelligence automatically brings up a pop-up with hours, reviews, and options like calling the business or opening its website. To get the same results from Google Lens, you have to press the shutter button and search for what you’re looking at.
However, Apple’s Ask feature is what makes Visual Intelligence interesting. It’s powered by ChatGPT and works like the homework and translate buttons (plus a Google search button) all rolled into one. Whether I needed help caring for sunflowers, as seen above, or making up a math problem — which is harder than you’d expect after being out of school for several years — ChatGPT usually had a quick answer for me. Perhaps the only time I stumped it was when I used the Search half of Visual Intelligence to find the name of a local restaurant and then switched to the Ask half to find out what was on the menu. It cleverly worked out that I was looking at a bistro but couldn’t find the restaurant’s actual menu, instead suggesting dishes a bistro might serve.
Apple Visual Intelligence vs Google Lens: Which one is better?
So, if Google Lens and Visual Intelligence are kind of the same but kind of different, how are you supposed to choose between them? If we ignore the fact that Apple’s Visual Intelligence only works on a few iPhones, the best way to decide is to test them against each other — which is exactly what I did. I took my iPhone 16 Pro and Pixel 9 Pro for a little walk around my sliver of Baltimore to see how they’d handle a few basic tasks and simple questions.
First, I stopped by my local independent bookstore to check its hours. When I pulled up Visual Intelligence, it immediately recognized my location and automatically brought up information about the shop, as seen below. Apple stacked the current Yelp score and a few photos at the top, with convenient buttons for calling the shop and checking its website within easy reach. Google Lens, on the other hand, required that I press the shutter button to search for the shop window before giving me any information. When I did, I got a quick summary of Google’s reviews and whether or not the shop was open, but the buttons that would provide more information were a little harder to reach.
Apple Visual Intelligence | Google Lens
From there, I headed down the block to a local garden in the back of a historic home. We’re rapidly approaching the end of fall here in the Mid-Atlantic, so there wasn’t much left to explore, but I found a few sunflower plants clinging to the last bits of life. I’m the furthest thing from a botanist, though, so I needed Visual Intelligence and Google Lens to tell me exactly what kind of sunflowers they were. Apple quickly pinned the plant as the thin-leaved sunflower (which is how I would have described it) and offered several links to related websites from the image search.
Lens, on the other hand, had a different idea. It came back convinced that I was looking at a Jerusalem artichoke, which is slightly more invasive but also good for feeding butterflies. The Jerusalem artichoke’s leaves are slightly different, though, and they didn’t seem to match the ones I was looking at. After some further research when I returned home from my walk, I’m still not sure which visual search was correct, so I suppose it’s good I was asking about flowers rather than potentially poisonous mushrooms. Still, I’ll give Apple a few points for offering some reasonable advice on how to care for sunflowers should you decide to grow them yourself. It was certainly easier to ask ChatGPT for advice and get a quick answer than to head to Google and do my own research.
ChatGPT is nice, but Google Lens is far, far more accessible for most people.
Ultimately, that’s how my head-to-head comparison went — every time Google Lens or Visual Intelligence slipped up, I went, “Oh, maybe the other one is better,” only to have the results flip on the next test. As much as I preferred Apple’s automatic shop recognition, I preferred Google’s use of its own reviews over those from Yelp. And as nice as it is that Google has a dedicated homework button, a good chunk of Lens users will probably never need it. It wouldn’t be fair to call this competition a tie — I don’t think it is one — but I have to imagine it would be tough to sway iPhone users to Lens or Pixel users to Visual Intelligence.
Perhaps the best way to decide between the two is to look at which one you can access. Google Lens is far more accessible, with support for just about any device that can download Google Assistant. That means it’ll work on your iPhone, your budget Samsung phone, or most tablets — just not a Google TV or Nest speaker. Apple’s Visual Intelligence, on the other hand, only works on the iPhone 16 series and the two iPhone 15 Pro models, and even then, some of the features are reserved for those with the Camera Control.
It’s a pretty strict limitation as far as I’m concerned, and one that would lead me to push most people towards the universality of Google Lens. However, Apple’s results are catching up quickly, and with slightly wider support, Visual Intelligence could probably challenge Google’s visual search superiority.