Google Lens will help you find places serving that muffin you photographed
Google’s much-anticipated Pixel 7 and 7 Pro are scheduled for the October 6 Pixel launch event, but that event was preceded by the annual Search On conference, which focuses on Google’s core business: Search. The company discussed several interesting improvements, including changes that make Google Lens more intuitive to use and Search upgrades that help when you’re craving a specific food.
Google knows all too well that translation is one of the primary use cases for Lens. The company now uses Generative Adversarial Networks (GANs) to overlay translated text realistically onto the same background as the original foreign-language text. This makes the translation feel seamless and immersive, so a poster in a foreign language won’t lose context in translation.

Google says Lens answers 8 billion questions every month, and the company believes users are ready for the next big step — combining image searches in Lens with text inputs using a feature called multisearch. Google introduced the feature at Google I/O, then beta tested it in the US. At Search On, the company announced that multisearch is now available in 70 new languages.
Multisearch is also getting better at delivering localized results with what Google is calling multisearch near me. Starting this fall in the US, you’ll be able to point Lens at an object and find similar items at retailers near you. The company says this will work with food and plants too: have Lens identify the item in question, then use multisearch to find a restaurant that serves the same dish or a gardening store that sells saplings of that plant.

Speaking of food, Search can help you satiate your desire for a specific dish more easily than before. Instead of searching for Chinese restaurants near you, you can search directly for “noodles near me,” and Search will point you to nearby restaurants that serve noodles. Google says you can narrow the search further, down to factors like how spicy you want the dish, giving “soup dumplings” as an example. To help you find the best place for a dish, Google says that in the coming months, Search will use machine learning to identify each restaurant’s specialty dishes, complete with filters for vegan and vegetarian preparations. Google could add more filters in due course.
Besides these improvements to Lens and Search, Google also introduced changes in Maps at the conference, like aerial views for landmarks and a new “vibe check” feature for neighborhoods you plan to visit. The Search results display is also changing, so you can find all the pertinent information in an easier-to-consume layout.
