Google Lens is adding a number of interesting new features to its smartphone functionality.
Google made the announcement at Google I/O, its annual developer conference held in Mountain View, California. The rollout of the new functionality has come uncharacteristically quickly after the conference and is now available to users accessing Google Lens on Apple devices and on ARCore-compatible Android devices.
Making dinner easier to understand – and to choose
The new “dining” features in Google Lens hold benefits for patrons and dining establishments alike. Users can now point their smartphone at a menu, and the Lens app will highlight its most popular dishes (handy if you prefer playing it safe when eating out) and can also surface food information and photos drawn from the restaurant’s Google Maps profile.
Paying is also simpler: Google Lens now lets users take a picture of the bill and split it directly.
Furthermore, Google Lens’s existing live translation feature, which lets users point their camera at signs in foreign languages, has been streamlined into a lighter version in the latest update to the app.
Google Lens makes extensive use of artificial intelligence (AI) and augmented reality (AR), two technologies that are changing the way we use our smartphones and significantly expanding what they can offer us in everyday life.
To make use of the new features in Google Lens, open the Google Assistant, Google Photos or Google Search apps on your device, or access Google Lens directly in the camera app on Google Pixel phones.