Google Search’s Video AI Lets Us Be Stupid

You can now get answers to all the dumb questions you’re too embarrassed to ask another person or struggle to phrase in traditional Google search. 

The Google I/O keynote this week was a two-hour advertisement for all the ways AI will augment and infiltrate many of the company’s biggest software and apps. There were demonstrations showing how existing AI features will get supercharged by Gemini, Google’s flagship generative AI-powered chatbot. But one of the more impressive examples was how it can empower Search to answer your questions asked while taking a video.

This is the AI future my shame-fearing self wants when I don’t know a seemingly obvious car part or whether I should get a rash checked out by a doctor.

On the other hand, I can’t ignore that the helpfulness is amplified by how much Google Search’s quality has nosedived over the last few years. The company has effectively invented a band-aid for a problem that it has continued to make worse. 

On the Google I/O stage, Rose Yao, VP of Product for Google Search, walked viewers through how this works. She used Google Lens to troubleshoot a malfunctioning record player, recording a video while asking aloud, “Why will this not stay in place?” 

Without naming the offending part (the tonearm, which carries the needle over the record), Yao forced Lens to use context clues and suggest answers. Search gave an AI summary of what it estimated the issue to be (balancing the tonearm), offered suggestions for a fix, identified the record player’s make and model, and spotlighted the source of the information so she could look for further answers.

[Image: Google’s video search demo. Credit: Google/Screenshot by CNET]

Yao explained that this process was made possible by a series of AI queries strung together into a seamless procedure. Natural language processing parsed her spoken request, then the video was broken down frame by frame by Gemini’s context window to identify the record player and track the motion of the offending part. Search then looked through online forums, articles and videos to find the best match for Yao’s video query (in this case, an article from audiophile manufacturer Audio-Technica).
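Yao’s description suggests a three-stage pipeline: parse the spoken question, analyze the video frames to identify the object and its moving part, then combine both into a web search. Here is a minimal sketch of that flow in Python. Every function name is invented for illustration; this is not Google’s actual API, and the vision stage is stubbed out where a real system would run Gemini over the frames.

```python
# Hypothetical sketch of the multi-stage query pipeline Yao described.
# All names here are stand-ins, not real Google APIs.

from dataclasses import dataclass


@dataclass
class VideoQuery:
    spoken_question: str
    frames: list  # placeholder for decoded video frames


def parse_question(audio_text: str) -> str:
    """Stage 1: natural language processing of the spoken request."""
    return audio_text.strip().rstrip("?").lower()


def identify_object(frames: list) -> dict:
    """Stage 2: frame-by-frame analysis (stand-in for Gemini's context window)."""
    # A real system would run a vision model over every frame to spot the
    # object and track its motion; here the result is hard-coded.
    return {"object": "record player", "moving_part": "tonearm"}


def search_web(question: str, context: dict) -> str:
    """Stage 3: match the combined query against indexed articles and forums."""
    query = f"{context['object']} {context['moving_part']} {question}"
    return f"Top result for: {query}"


def answer(video_query: VideoQuery) -> str:
    """Chain the three stages into one seamless procedure."""
    question = parse_question(video_query.spoken_question)
    context = identify_object(video_query.frames)
    return search_web(question, context)


print(answer(VideoQuery("Why will this not stay in place?", frames=[])))
```

The point of the sketch is the chaining: the vision stage enriches the vague spoken question with context (“record player,” “tonearm”) before the search stage ever runs, which is why the user never has to name the part.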

Currently, you can do all these things separately and arrive, more or less, at the same answer… eventually. You could point Google Lens at something and get it to identify an object. You could also carefully phrase your problem and hope someone else asked about something similar on Quora or Reddit or elsewhere. You could even try searching your record player’s brand, trial-and-erroring your way to figuring out its exact model so you can refine your search. 

But assuming the Gemini-powered Google Lens works as demonstrated, you can’t get your questions answered by the internet as fast as what we saw on the Google I/O stage. Perhaps more importantly, you’ll get a lot of help while asking sensitive — and possibly embarrassing — questions.

Think of the possibilities. “What part of the car is this?” you might ask. “How often should I change these?” you might say, pointing to bed sheets. “What’s the best way to clean this?” you could say from your car as you point toward the food stain on your shirt. “How do I turn this into a pitcher of margaritas?” you may overconfidently ask as you point toward a counter covered with ingredients. Or perhaps when pointing to a part of your body in worrisome shape, “Should I get this checked out?”

[Image: Rose Yao gets results on her phone screen from her Google Lens-recorded video and spoken question. Credit: Screenshot by James Martin/CNET]

Google Lens, Search and their AI tools are no substitute for expertise or medical perspectives, so don’t think the company has replaced professional opinions. But they can help you get over that agonizing first hurdle of figuring out what to search. In the record player example above, I needed to describe the troublesome part in text, so while writing this article I searched “anatomy of a record player” to identify it visually. 

Seasoned internet searchers can take it from there. But Google Lens could speed through the friction of refining searches when troubleshooting specific issues, which can be made all the harder if it’s a rare issue with sparse results. If it’s difficult to pinpoint the issue in a search term and your frustration compounds with shame, you might abandon your search. 

Thus, the Google Lens process, assuming it works reliably enough that people use it to look things up in real life, seems like a great enabler for a lot of the simple questions you might have given up on getting answered decades ago. Heck, for those with severe anxiety, asking the faceless Google Lens for help instead of a human being could be a lifesaver. 

And if Google Lens lets me ask which part of my engine is the oil cap without having to suffer the judgment of the mechanic I’ve been going to for years, so much the better.

Of course, these answers are only helpful if they’re correct. A Google I/O promo video shared with the audience had another example of using Google Lens to get answers, in this case about a malfunctioning film camera. As The Verge noticed, Search’s AI-provided answers included opening the back plate, which would’ve exposed the film to daylight and ruined the undeveloped roll. 

If the company’s AI can’t avoid making harmful suggestions, it shouldn’t be thoughtlessly parsing online sources of information. Then again, maybe the reason I’m so intrigued by AI surfacing search results is that it’s gotten harder to find useful intel online. 

AI, the Google Search Band-Aid

Google Lens’ new and useful capabilities are a reminder that information is harder to find on the internet these days, full stop. Search results are front-loaded with ads that look just like legitimate links, and after multiple algorithm tweaks over the years that shuffle which results surface first, the overall quality of highlighted sites seems far worse than in the past. 

Amid those algorithm tweaks upending how sites get traffic through search, the search ecosystem suffers as sites turn to SEO tricks to rank pages higher than competitors (full disclosure: CNET uses some SEO tricks). I’ve heard multiple friends ruefully say they append “Reddit” to every Google search to have a chance of getting their query answered.

In this reality, with manual searches producing less helpful results every year, using an AI to automatically parse through the drivel seems like the better choice. But for the search ecosystem, this seems like a temporary fix that’s harmful in the long run. If enough people rely on AI to do their searching for them, sites depending on that traffic will starve — and there will be no online answers for Google to send its AI to fetch.

Editors’ note: CNET used an AI engine to help create several dozen stories, which are labeled accordingly. The note you’re reading is attached to articles that deal substantively with the topic of AI but are created entirely by our expert editors and writers. For more, see our AI policy.
