Google strives to enhance the relevance of the listings presented on its search engine results pages (SERPs) with every algorithm update. One recent update is Bidirectional Encoder Representations from Transformers (BERT), which applies natural language processing (NLP) to search. BERT is considered one of the most significant changes Google has introduced in the last five years, directly affecting one in 10 search queries.
It aims to display more relevant results by correctly interpreting complex, long-tail search queries. In this post, we will discuss what this means and how it changes the way you search.
BERT is a neural network-based technique for NLP pre-training that enables Google to identify the context of words in a given search query more accurately.
For instance, consider the phrases “six to 10” and “a quarter to six.” The same preposition “to” carries a different meaning in each phrase, which may not be obvious to search engines. This is where BERT becomes useful: it can distinguish how the preposition is used in the first phrase from how it is used in the second and, by understanding that context, return more relevant results.
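To make the idea concrete, here is a minimal sketch using the open-source bert-base-uncased checkpoint through the Hugging Face transformers library (an illustration only, not Google's production system). It shows that BERT produces a different vector for the same word “to” depending on the surrounding words, whereas a static word embedding would assign it a single fixed vector:

```python
# pip install torch transformers
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# The same preposition "to" in two different contexts...
a = embedding_of("the shop is open from six to ten", "to")
b = embedding_of("it is a quarter to six", "to")

# ...comes out as two different vectors (cosine similarity below 1.0).
print(torch.cosine_similarity(a, b, dim=0).item())
```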
Neural networks are algorithms designed for pattern recognition: trained on data sets, they learn to identify patterns in tasks such as categorizing image content, predicting financial market trends and even recognizing handwriting. Natural language processing, meanwhile, is the branch of artificial intelligence (AI) that deals with linguistics.
NLP-powered advancements that internet users and online businesses rely on every day include social listening tools, word suggestions and chatbots.
BERT is an NLP algorithm that uses neural networks to produce pre-trained models, which are trained on vast amounts of data available on the web. Pre-trained models are generic NLP models that are then refined to perform specific NLP tasks. In November 2018, Google open-sourced BERT, reporting state-of-the-art results on 11 NLP tasks, including the Stanford Question Answering Dataset (SQuAD).
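As a rough illustration of that refinement step, the sketch below loads a publicly shared BERT checkpoint from the Hugging Face hub that has already been fine-tuned on SQuAD (the checkpoint name is an assumption about what is publicly available, not a reference to Google's internal models) and uses it for extractive question answering:

```python
# pip install torch transformers
from transformers import pipeline

# A BERT model fine-tuned on the Stanford Question Answering Dataset.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a neural network-based technique for NLP pre-training. "
    "Google open-sourced it in November 2018."
)

result = qa(question="When was BERT open-sourced?", context=context)
print(result["answer"])  # the span of `context` the model selects
```

The pre-trained weights supply the general language understanding; fine-tuning on SQuAD teaches the model only the task-specific skill of pointing at an answer span.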
BERT’s bidirectionality sets it apart from other algorithms: it gives context to a word by considering not just the parts of the sentence that come before it but also the parts that follow it. Bidirectionality allows search engines to understand that a word such as “film” means one thing in “window film” and something different when used alongside “blockbuster.”
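BERT's masked-word training objective is a quick way to see that bidirectionality at work: the model predicts a hidden word from the context on both sides of it. A small, illustrative sketch with the fill-mask pipeline (the example sentences are made up):

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The words on both sides of [MASK] steer the prediction, so the same
# slot gets very different candidates in the two sentences.
for text in [
    "he put tinted window [MASK] on the car to block the sun.",
    "the director's new blockbuster [MASK] opens in cinemas on friday.",
]:
    top = fill(text)[0]
    print(f"{text} -> {top['token_str']} ({top['score']:.2f})")
```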
In search, BERT helps Google understand the key details of a query, especially complex, conversational queries or those containing prepositions. For instance, in the query “2021 Indian traveler to Bali needs a visa,” the preposition “to” indicates that the traveler is going from India to Bali. Changing the preposition changes the meaning entirely: “2021 Indian traveler from Bali needs a visa” suggests the traveler is from Bali and needs a visa for India. BERT captures the contextual difference between the two queries.
RankBrain was Google’s first AI method applied to search. It runs in parallel with the organic ranking algorithms and adjusts the results they calculate based on historic queries.
RankBrain also helps Google interpret search queries so that it can display results that may not contain the exact words of the query. For instance, a search for “the height of the landmark in Dubai” will automatically surface information about the Burj Khalifa.
The bidirectional component of BERT, on the other hand, makes it operate in a very different manner. Where traditional algorithms look at the content on a page to gauge relevancy, NLP algorithms go a step further by looking at the content before and after a word for additional context. Since human communication is usually complex and layered, this advancement in natural language processing is essential.
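Under the hood, this before-and-after context comes from the transformer's self-attention, in which every token can attend to every other token in the sentence. A rough sketch of how one might inspect that with the open-source model (the sentence is invented for illustration):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("window film blocks harmful rays", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, output_attentions=True)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

# Attention paid by "film" to every token, averaged over the heads of
# the last layer: the weights are spread over words on both sides.
attn = out.attentions[-1][0].mean(dim=0)  # (num_tokens, num_tokens)
row = attn[tokens.index("film")]
for tok, weight in zip(tokens, row):
    print(f"{tok:>10s}  {weight:.3f}")
```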
Together, BERT and RankBrain are utilized by Google to process and understand queries. BERT is not a substitute for RankBrain but can be applied alongside other Google algorithms or in combination with RankBrain, depending on the search term.
With the ability to take what it has learned from one language and apply it to another, BERT is used to make search results more relevant for internet users across the world. For instance, patterns learned from the most widely used languages on the web, such as English, are applied to other languages, improving results for people searching in those languages too. The BERT model also enhances the relevance of featured snippets across countries and languages.
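One way this shows up in the open-source world is the multilingual BERT checkpoint, which covers roughly 100 languages with a single shared model. The sketch below is only a crude illustration (mean-pooled hidden states are a rough sentence embedding, and exact scores will vary), but a query and its translation will typically score closer together than two unrelated sentences:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# One checkpoint, many languages, one shared vocabulary.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

def sentence_vector(text: str) -> torch.Tensor:
    """Mean-pool BERT's hidden states into a crude sentence embedding."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden.mean(dim=0)

sim = lambda a, b: torch.cosine_similarity(a, b, dim=0).item()

en = sentence_vector("How tall is the landmark in Dubai?")
es = sentence_vector("¿Qué altura tiene el monumento de Dubái?")  # translation
de = sentence_vector("Ich esse am Wochenende gern Pizza.")        # unrelated

print("en vs es:", sim(en, es))  # typically higher than...
print("en vs de:", sim(en, de))  # ...the unrelated pair
```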
BERT also affects Google Assistant, prompting it to offer featured snippets or web results influenced by the update. NLP technology like BERT enhances machine comprehension, an innovation that is undoubtedly beneficial for many online users and businesses. With regard to SEO, however, the principles remain the same: if SEO best practices are ingrained in your marketing strategy, you can be confident about your web success. Websites that consistently produce high-quality, relevant and fresh content will benefit the most from this algorithm update.
Writing superior content informed by keyword research will remain a prioritized ranking factor across search engines. Website owners who focus on giving users the informative, accurate content they expect end up ranking well on the SERPs. Monitoring page performance while creating great content will help websites stay relevant.
With BERT, the chances of Google getting the results right have become higher irrespective of the language or words used in the query, but they are still not 100 percent. For instance, even with BERT, anyone searching for “what state is south of Nebraska” is likely to get results for “South Nebraska” instead of Kansas, which is probably the answer the user is seeking.
Helping machines understand language remains an ongoing endeavor, and deriving definite meaning from any given query is a complex process. Even when Google applies NLP to a query’s keywords, the top results displayed may contain only a few of the required keywords, or even just one, making those results irrelevant. With BERT, Google has upped its game with a sophisticated update to its algorithm, but search remains an unsolved problem because of the complex nature of human language.