San Francisco: Google has announced a major update to its Search algorithms to better understand people’s queries and return more relevant results.
By applying new neural network techniques, Google said it can deliver more relevant results for about one in 10 English-language searches in the US; support for other languages and countries will come later.
The neural network technique behind the update is called “Bidirectional Encoder Representations from Transformers” (BERT), which Google first introduced last year.
“By applying BERT models to both ranking and featured snippets in Search, we’re able to do a much better job helping you find useful information,” Pandu Nayak, Google Fellow and Vice President, Search, said in a blog post Friday.
Google sees billions of searches every day, and 15 percent of those queries are ones it has not seen before.
“When people like you or I come to Search, we aren’t always quite sure about the best way to formulate a query. We might not know the right words to use, or how to spell something, because oftentimes, we come to Search looking to learn — we don’t necessarily have the knowledge to begin with,” Nayak explained.
BERT models can consider the full context of a word by looking at the words that come before and after it — particularly useful for understanding the intent behind search queries.
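To make that idea concrete, here is a minimal, hypothetical sketch using the open-source Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint (not Google’s production Search models): the same masked position is filled in differently depending on the words that come before and after it.

```python
# A minimal sketch, assuming the Hugging Face "transformers" library and the
# public "bert-base-uncased" checkpoint; Google's production models differ.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The same masked slot gets different predictions depending on the words
# on BOTH sides of it -- the bidirectional context BERT reads.
for sentence in [
    "He deposited the check at the [MASK].",
    "He sat down on the [MASK] of the river.",
]:
    top = fill(sentence)[0]  # highest-scoring prediction for the mask
    print(f"{sentence} -> {top['token_str']!r} (score {top['score']:.2f})")
```

A one-directional model reading left to right would see the same prefix up to the mask; BERT’s two-sided view is what lets the words after the blank change the prediction.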
“Some of the models we can build with BERT are so complex that they push the limits of what we can do using traditional hardware, so for the first time we’re using the latest Cloud TPUs (Tensor Processing Units) to serve search results and get you more relevant information quickly,” said Nayak.
For longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query.
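One way to see why this matters is that, unlike a static word vector, the representation BERT produces for a small word such as “to” shifts with its neighbors. The hypothetical sketch below (same assumptions as above: Hugging Face `transformers` and `bert-base-uncased`) compares the contextual vector for “to” across two queries.

```python
# A hypothetical sketch, again using the open-source bert-base-uncased
# checkpoint rather than Google's production Search models.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(query: str, word: str) -> torch.Tensor:
    """Return the contextual vector BERT assigns to `word` inside `query`."""
    enc = tok(query, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (num_tokens, 768)
    idx = enc.input_ids[0].tolist().index(tok.convert_tokens_to_ids(word))
    return hidden[idx]

a = embedding_of("flights to new york", "to")
b = embedding_of("how to bake bread", "to")
# Cosine similarity is well below 1.0: the same word, different meaning.
print(torch.cosine_similarity(a, b, dim=0).item())
```

Because the vector for “to” depends on the full query, the model can distinguish a destination (“flights to new york”) from an infinitive (“how to bake bread”), which is the kind of distinction keyword matching misses.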
“We’re also applying BERT to make Search better for people across the world,” the post added.