The much-awaited Pixel 7 and Pixel 7 Pro are expected to be unveiled at a launch event on October 6, but that event was preceded by Search On, Google's annual conference dedicated to its core business: search. The company announced a number of intriguing updates, including changes that make Google Lens easier to use and Search features that help when you're craving a particular food.

Translation is one of the main applications for Lens, as Google is well aware. To overlay translated text realistically against the same background as the original foreign-language text, the company now uses Generative Adversarial Networks (GANs): the model reconstructs the background pixels behind the original text so the translation can be blended into its place. This keeps the context of a foreign-language poster intact, making the translation feel authentic and immersive.
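Google hasn't published implementation details, but the core GAN idea behind this kind of text inpainting can be sketched: a generator learns to fill in the masked region where the original text sat, while a discriminator judges whether the filled-in patch looks like real background. The minimal PyTorch sketch below is purely illustrative; the architectures, losses, and names are placeholders, not Google's code.

```python
# Minimal sketch of GAN-based background inpainting for text regions.
# Everything here is an illustrative placeholder, not Google's implementation.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Fills masked (text-covered) pixels with plausible background."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),     # RGB + 1 mask channel
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),  # reconstructed RGB
        )

    def forward(self, image, mask):
        x = torch.cat([image * (1 - mask), mask], dim=1)   # hide the text pixels
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether an image looks like real, text-free background."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.LazyLinear(1),
        )

    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(clean, mask):
    """clean: (B,3,H,W) text-free background; mask: (B,1,H,W) text region."""
    fake = G(clean, mask)
    # Discriminator learns to tell real backgrounds from generator fills.
    d_loss = bce(D(clean), torch.ones(clean.size(0), 1)) + \
             bce(D(fake.detach()), torch.zeros(clean.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator tries to fool the discriminator while matching the background.
    g_loss = bce(D(fake), torch.ones(clean.size(0), 1)) + \
             nn.functional.l1_loss(fake * mask, clean * mask)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

Once trained, the generator alone is run at inference time: the detected text region becomes the mask, the fill replaces it, and the translated text is rendered on top.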
Google says Lens handles 8 billion queries each month, and that users are ready for the next major development: multi-search, a feature that combines Lens image searches with text input. The feature was first shown at Google I/O and then beta-tested in the US. At Search On, Google announced that multi-search is now rolling out in 70 additional languages.
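Google hasn't said how multi-search works under the hood, but a common way to combine an image query with a text refinement is to embed both in a shared vector space and fuse the vectors before searching an index. The sketch below uses the open CLIP model from Hugging Face as a stand-in; the fusion-by-averaging step and the file name are assumptions for illustration, not Google's method.

```python
# Illustrative sketch of fusing an image query with a text refinement,
# in the spirit of multi-search. Uses open CLIP as a stand-in model.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def combined_query(image_path: str, refinement: str) -> torch.Tensor:
    """Embed an image and a text refinement, then average the two vectors."""
    image = Image.open(image_path)
    inputs = processor(text=[refinement], images=image,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        img_vec = model.get_image_features(pixel_values=inputs["pixel_values"])
        txt_vec = model.get_text_features(input_ids=inputs["input_ids"],
                                          attention_mask=inputs["attention_mask"])
    img_vec = img_vec / img_vec.norm(dim=-1, keepdim=True)
    txt_vec = txt_vec / txt_vec.norm(dim=-1, keepdim=True)
    return (img_vec + txt_vec) / 2  # simple fusion of the two modalities

# e.g. a photo of a floral shirt plus the word "green" yields a query
# vector that can be matched against a product index by cosine similarity.
query = combined_query("floral_shirt.jpg", "green")
```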
Multi-search is also getting better at delivering localized results through what Google calls "multi-search near me." Starting this fall in the US, the feature should let you point Lens at an object and find comparable products at stores close to you. According to the company, this will also work with food and plants: ask Lens to identify a dish or a plant, then use multi-search to find a nearby restaurant that serves it or a shop that sells the plant's seedlings.
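Conceptually, the "near me" step just restricts the ranked results to candidates within some distance of the user. Here is a toy sketch of that filtering stage, with an invented data model and radius; none of it reflects how Google actually ranks local results.

```python
# Toy sketch of a "near me" filter: keep only candidates within a radius
# of the user, then rank the survivors by query similarity.
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt

@dataclass
class Candidate:
    name: str
    score: float  # similarity to the multi-search query
    lat: float
    lon: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def near_me(candidates, user_lat, user_lon, radius_km=10.0):
    """Drop far-away candidates, then sort the rest by similarity score."""
    nearby = [c for c in candidates
              if haversine_km(user_lat, user_lon, c.lat, c.lon) <= radius_km]
    return sorted(nearby, key=lambda c: c.score, reverse=True)

stores = [Candidate("Corner Nursery", 0.91, 37.78, -122.41),
          Candidate("Far-Away Garden Center", 0.95, 40.71, -74.00)]
print(near_me(stores, 37.77, -122.42))  # only the nearby nursery remains
```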