Google will launch several new updates for Maps, including Immersive View, and five other Search features to help people gather and explore information in new ways. The company announced this at its Search On event today, teasing upcoming Maps features called Immersive View, Neighborhood Vibe and Search with Live View.
Taiwo Kola-Ogunlade, Communications and PR Manager for West Africa, said this in a statement on Thursday in Lagos.
Kola-Ogunlade said the new features, which leverage machine learning, will let people gather information in new ways.
He highlighted new features including multisearch, multisearch near me, translation in the blink of an eye, updates to Google for iOS, and even faster ways to find what you are looking for.
Kola-Ogunlade explained that people use Lens to answer more than eight billion questions monthly.
He noted that earlier in the year, Google introduced multisearch, a major milestone, to help people search for information.
“With multisearch, you can take a picture or use a screenshot and then add text to it — similar to the way you might naturally point at something and ask a question about it.
“Multisearch is available in English globally, and will now be rolling out in 70 languages in the next few months.
“Multisearch near me allows you to take a screenshot or a photo of an item, and then find it nearby.
“So, if you have a hankering for your favourite local dish, all you need to do is screenshot it, and Google will connect you with nearby restaurants serving it,” he said.
According to him, translation in the blink of an eye showcases one of the most powerful aspects of visual understanding: its ability to break down language barriers.
“Google has gone beyond translating text to translating pictures – with Google translating text in images over one billion times monthly in more than 100 languages,” Kola-Ogunlade said.
Google is now able to blend translated text into complex images using major advancements in machine learning.
“Google has also optimised their machine learning models to do all this in just 100 milliseconds — shorter than the blink of an eye,” Kola-Ogunlade added.
He said this uses generative adversarial networks (GAN models), the technology that also powers the Magic Eraser on Pixel.
“Google is working to make it possible to ask questions with fewer words, or even none at all, and still help you find what you are looking for.
“For those who do not know exactly what they are looking for until they see it, Google will help you to specify your question.
“Google is reinventing the way it displays search results to better reflect the ways people explore topics to see the most relevant content,” Kola-Ogunlade said.