Google today announced new ways to explore information on Search that make results about language, images and things in the real world more reliable and easier to understand, and that offer useful suggestions on where to find what you are looking for. Through its optimised machine learning models, Search uses visual search and text translation in over 100 languages to complete tasks in just 100 milliseconds, shorter than the blink of an eye.
The announcement, made during the SearchON virtual event, reveals that people now use Google Lens to seek answers to more than 8 billion questions every month. This entails using a camera or a screenshot to search for information about images or text around them.
“Earlier this year, we made visual search even more natural with the introduction of multisearch. You can take a picture or use a screenshot and then add text to it similar to the way you might naturally point at something and ask a question about it. Multisearch is available in English globally and will be coming to over 70 languages in the next few months,” said Cathy Edwards, Google Vice President and General Manager of Search.
Google has also optimised its machine learning models and can now blend translated text into complex images, making the result look and feel much more natural. With the new Lens translation update, you can point a phone camera at a poster in another language, for example, and see translated text realistically overlaid onto the pictures underneath. This process uses generative adversarial networks (also known as GAN models), the same technology that helps power the Magic Eraser on Pixel. The improved experience is launching later this year.
Starting today, Google has added shortcuts right under the search bar in its Google app for iOS, enabling users to shop from their screenshots, translate text with their camera, hum to search and identify songs by listening, and solve homework problems with the camera.
In the coming months, Google will be rolling out an even faster way to find what you need. When you begin to type a question, Google will suggest relevant content, keywords or topics to help you craft it. Google has also made it easier to explore a subject by highlighting the most relevant and helpful information, including content from creators on the open web. For topics like cities, Search will display visual stories and short videos from people who have visited, tips on exploring the city, things to do, how to get there and other details a traveller might want to know when planning a trip.
Google is also reimagining the way it displays results to better reflect how people explore topics. Users will see the most relevant content from a variety of sources, whatever format the information comes in, whether text, images or video. And as you continue scrolling, you will see a new way to get inspired by topics related to your search.