Google has presented its vision of the future: it will release AI agents onto the web that search the internet and answer questions based on a person's real-world surroundings, tastes and preferences. It will also introduce simultaneous translation during meetings in Google Meet.
Is Google hearing one of its death knells? Artificial intelligence is a bigger competitive threat than ever: chatbots such as ChatGPT and AI-based search engines such as Perplexity are emerging as alternative ways to find information and solutions, which hits the company's core business. So Google showed off its latest work, meant to prove to users and investors alike that the nearly 30-year-old search engine will still count in its next decade.
Google is expanding its AI Mode to all users (for now, only those in the US). The first difference is the way queries are processed: instead of looking at the entire question at once, the AI breaks it down into subtopics and generates additional searches based on them. It will also use Google search history to personalize answers further; whether to connect it to Gmail and other Google apps will be left up to users.
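To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of such a fan-out loop. None of it is Google's code: decompose_question, web_search and answer are hypothetical stand-ins, and a real system would call a language model and a live search index instead of these placeholders.

from dataclasses import dataclass

@dataclass
class SearchResult:
    title: str
    snippet: str

def decompose_question(question: str) -> list[str]:
    # Stand-in for an LLM call that splits a broad question into narrower subtopics.
    return [f"{question} prices", f"{question} reviews", f"{question} availability"]

def web_search(query: str) -> list[SearchResult]:
    # Stand-in for running one ordinary web search per subtopic.
    return [SearchResult(title=f"Top result for: {query}", snippet="...")]

def answer(question: str, history: list[str]) -> str:
    # Fan the question out, gather all results, then compose a single reply.
    sub_queries = decompose_question(question)
    results = [hit for q in sub_queries for hit in web_search(q)]
    # Search history would be used here to re-rank results and personalize the wording.
    return f"Answer to '{question}' built from {len(results)} results and {len(history)} history items."

print(answer("weekend city break from Warsaw", history=["budget airlines", "boutique hotels"]))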
Search itself will also change. It will rely not only on queries typed in by users, but also on a camera that scans their surroundings.
Google will also fill out the relevant forms on our behalf when buying, say, tickets. All you need to do is ask the search engine, for example, for the cheapest tickets to a Bryan Adams concert. The AI will review upcoming performances across locations and prices, fill in the data required for the purchase, and present a ready-to-pay order; all that is left for us is to pay. Reservations at restaurants or cafés are to work on the same principle.
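As a rough, hypothetical illustration of that agent-style flow (browse offers, compare prices, pre-fill the form, leave payment to the user), the Python sketch below uses invented helpers such as find_events and prefill_checkout; the real feature runs inside Google Search and exposes no such API.

from dataclasses import dataclass

@dataclass
class Event:
    city: str
    date: str
    price: float

def find_events(artist: str) -> list[Event]:
    # Stand-in for the agent browsing ticket sites for upcoming shows.
    return [Event("Krakow", "2025-07-12", 89.0), Event("Berlin", "2025-07-15", 74.5)]

def prefill_checkout(event: Event, buyer: dict) -> dict:
    # Stand-in for filling out the purchase form; payment stays with the user.
    return {"event": event, "buyer": buyer, "status": "awaiting payment"}

def buy_cheapest_tickets(artist: str, buyer: dict) -> dict:
    events = find_events(artist)
    cheapest = min(events, key=lambda e: e.price)  # compare locations and prices
    return prefill_checkout(cheapest, buyer)       # hand back a ready-to-pay order

order = buy_cheapest_tickets("Bryan Adams", {"name": "Jan Kowalski", "email": "jan@example.com"})
print(order["status"])  # "awaiting payment" - the user only confirms the payment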
The Lens service will go a step further. Currently it lets you ask questions about photos you have taken; now it will also answer questions about what the phone's camera is pointed at live, such as whether the screw shown in a photo is suitable for repairing the bicycle frame we are framing with the camera.
Google has also introduced real-time speech translation in Google Meet. For now, the feature is available only to AI Premium subscribers, but what is being tested today may soon become widely available. When a user turns the feature on in a Google Meet video call, an AI audio model translates their speech live into another language. Google is starting with English and Spanish, with more languages to be added in the coming weeks. The effect is like watching a foreign movie on TV: the original sound is slightly muted, and the translated words stand out.
“Think of a simultaneous interpreter, or someone who listens to a speaker while simultaneously speaking in another language,” says Yulie Kwon Kim, vice president of product at Google Workspace. “We’ve taken that to the next level. The interpreter is not another person’s voice, but the speaker’s own voice.”
Live speech translation in Google Meet is here. Speak naturally—your words are translated in near real time while preserving your tone, voice, and expression. Available in English and Spanish, with more languages coming soon. → https://t.co/c3Do5qhPNu #GoogleIO pic.twitter.com/Yd793BYKtQ
— Google Workspace (@GoogleWorkspace) May 20, 2025
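The "slightly muted original plus translated voice on top" effect is, in audio terms, ducking and mixing. The NumPy sketch below shows only that generic idea; it is not Google's pipeline, and in reality the translated track would come from a speech-to-speech model rather than the synthetic tones used here.

import numpy as np

def mix_translated(original: np.ndarray, translated: np.ndarray, duck_gain: float = 0.25) -> np.ndarray:
    # Attenuate the original speech and overlay the translated speech on top.
    length = max(len(original), len(translated))
    mix = np.zeros(length, dtype=np.float32)
    mix[:len(original)] += duck_gain * original   # original voice, slightly muted
    mix[:len(translated)] += translated           # translated voice stands out
    return np.clip(mix, -1.0, 1.0)                # keep samples in the valid range

# Example with synthetic one-second signals sampled at 16 kHz.
sample_rate = 16_000
t = np.linspace(0, 1, sample_rate, dtype=np.float32)
original = 0.8 * np.sin(2 * np.pi * 220 * t)    # stand-in for the speaker's voice
translated = 0.8 * np.sin(2 * np.pi * 440 * t)  # stand-in for the translated voice
mixed = mix_translated(original, translated)
print(mixed.shape, round(float(abs(mixed).max()), 3))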
Google has also decided to compete in the smart glasses category, among others with Meta. The new device, which has not yet received an official name, has a built-in display, speaker and camera. Unlike Google's previous eyewear, at first glance it looks like a pair of regular glasses, and it can also be fitted with prescription lenses.
Android XR has been combined with the Gemini AI assistant, which opens up a wide range of possibilities. For example, while listening to someone speak a language we do not know, we can turn on captions that translate their speech, much like subtitles in a cinema. The built-in display can show not only text but also images and applications, so when visiting an unfamiliar city you can bring up Google Maps on the glasses instead of walking around with your nose in your smartphone. The glasses can also be used to take pictures.
“When I look at the world, I see glimpses of a proactive world in which answers will be provided by AI agents, among other things. Everything will keep getting better,” said Sundar Pichai, CEO of Google and its parent company Alphabet, quoted by CNN.
The keynote from the developer conference:
Or on X here: https://t.co/FjpBuW9x8X
— Sundar Pichai (@sundarpichai) May 20, 2025
ed. aw