AR comes to Google Search, new features for Google Lens

Image recognition and AR applications have a long tradition at Google. Even before Google Lens, the Google app analyzed captured photos and provided information about their content. At this year's Google I/O, Google introduced new features for its successor, Google Lens, and announced that Google Search on smartphones will enrich results with AR elements in the future.

AR search results are useful, for example, for estimating real-world sizes or showing sequences of movement. AR and 3D rendering often explain things more clearly and vividly than printed information can, explained Aparna Chennapragada (VP, Google Lens & AR) in a conversation with journalists.

AR shows its strengths most clearly when it comes to finding your way in an unfamiliar environment. Google demonstrated this on the I/O event site: as soon as the camera is pointed at an information sign, virtual signposts appear in the live camera image showing where to find things. In everyday life, you would use Google Maps for such tasks; the Maps app will soon get the AR directional arrows that were already presented at last year's Google I/O.

Better image recognition
In addition to AR, the image recognition in Google Lens is set to improve: at Google I/O it was demonstrated, among other things, how the smartphone camera captures a cooking recipe and automatically launches matching videos showing how to prepare the dish.

Lens also aims to be better prepared for everyday tasks, such as a visit to a restaurant: if you point Lens at the menu, the app visually highlights the names of the dishes. A tap on one brings up pictures of that dish. If a group wants to split the bill after the meal, Lens can be pointed at the receipt, whereupon the app calculates the tip and each person's share on request. The new features are expected to roll out this summer.
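
The arithmetic behind such a tip-and-split feature is straightforward. The sketch below illustrates one way it might be computed in Python; the function name, the 18% default tip rate, and the example figures are illustrative assumptions, not details of how Lens actually implements it.

```python
def split_bill(total: float, people: int, tip_rate: float = 0.18) -> dict:
    """Compute a tip and equal per-person shares for a restaurant bill.

    total:    amount printed on the receipt
    people:   number of diners splitting the bill
    tip_rate: tip as a fraction of the total (0.18 = 18%, an assumed default)
    """
    if people < 1:
        raise ValueError("need at least one person")
    tip = round(total * tip_rate, 2)
    grand_total = round(total + tip, 2)
    per_person = round(grand_total / people, 2)
    return {"tip": tip, "grand_total": grand_total, "per_person": per_person}

# Example: a 64.50 bill split four ways with an 18% tip
print(split_bill(64.50, 4))
# {'tip': 11.61, 'grand_total': 76.11, 'per_person': 19.03}
```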