Google Search can be a frustrating mess when it fixates on a few keywords in a query rather than the context around them, a shortcoming that often surfaces results of little to no use. Google is now working to make its eponymous search engine better understand user queries, thanks to machine learning and natural language processing. As a result of these advancements, Google Search can now better grasp linguistic nuances and query elements such as prepositions and conjunctions.
Pandu Nayak, Vice-President of Search at Google, wrote in a blog post that the company is employing a technology called Bidirectional Encoder Representations from Transformers (BERT) to improve search. BERT is essentially a neural network-based technique for natural language processing (NLP) pre-training that helps in building question answering systems. Its key advantage is that it analyses a search query by weighing the context of the whole sentence, much as humans naturally communicate and understand language, rather than processing it word by word.
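The "bidirectional" part of BERT refers to self-attention, in which every word's representation is mixed with context from the words both before and after it. The sketch below is not Google's implementation, just a minimal NumPy illustration of a single self-attention step over toy embeddings (the dimensions and values are made up for the example):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Single-head self-attention over token embeddings X (tokens x dims).

    Every token attends to every other token, to its left and to its
    right, so context flows in both directions at once.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)       # token-to-token similarity
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ X                  # context-mixed representations

# A toy 4-token "sentence" with hypothetical 3-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
out = self_attention(X)
print(out.shape)  # (4, 3): one context-aware vector per token
```

In a real BERT model the scores come from learned query/key/value projections and many attention heads stacked in layers, but the core idea is the same: the vector for a word like "to" or "for" ends up shaped by its entire sentence, which is what lets the model distinguish queries that keyword matching would treat as identical.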
“With the latest advancements from our research team in the science of language understanding–made possible by machine learning–we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search,” Nayak wrote. He added that the improvements to Google Search are most pronounced for English, but results for languages such as Hindi, Korean, and Portuguese have been encouraging too. The gist is that BERT will allow the search engine to better understand linguistic elements such as “to”, “from”, and “for”, and to surface search results accordingly.
The focus here is on letting users phrase a query on Google Search the way they would in a conversation with another person, rather than typing keyword-heavy gibberish into the search field that would make no sense spoken aloud. Google is currently testing BERT-backed search models in two dozen countries, with the aim of returning relevant results for conversational queries instead of limiting users to hit-and-miss strings of keywords.
The change comes as Google has made a push to update its search engine, which is more than 20 years old. The company last month announced a feature for people to swipe left or right on TV and movie recommendations, to better train Google’s search algorithms. Google in August also added playable podcasts to search results.
Google’s iconic search engine is still its cash cow, as well as the cornerstone of the company’s digital advertising business. Google, which generates more than $115 billion a year in sales, makes more than 85 percent of its revenue from ads.
The company has also revamped some of its other core search services, including Google Images and Shopping. In August, the company redesigned its image search tool to include more information alongside photos. So if you click on a product image, it’ll show you information on brand, price, reviews, and availability. And in May, the tech giant unveiled a major update to Google Shopping, letting users see options to buy products from physical stores or directly on the Google site itself.
In September, the company confirmed that third-party workers who analyze language data from the Assistant leaked private conversations of users in the Netherlands. Belgian public broadcaster VRT NWS said more than 1,000 files had been leaked, including recordings from instances where users accidentally triggered Google’s software. As a result, Google paused language reviews of the audio data and later made it an opt-in program.