Meet BERT, Google’s Newest Major Search Algorithm Update

Google updates its algorithms nearly every day without much discussion or note. As recently as the end of October, Google released its biggest update yet: BERT. A breakthrough in NLP (natural language processing), BERT is a monumental leap forward in how search engines will understand queries in the near future, with big implications for SEO (search engine optimization). The field of NLP is full of individual models that each perform one function, and one function only, very well: classification, entity recognition, and question answering, for example. BERT bundles these specific functions into one algorithm, placing the emphasis on the intent of a search based on the words surrounding keywords rather than reading a query strictly left to right.
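To make the "one model per function" idea concrete, here is a minimal sketch using the open-source Hugging Face transformers library, an assumption chosen purely for illustration and unrelated to Google's internal systems. It spins up a separate pipeline for each of the tasks named above; the one-task-per-model pattern is what the article is describing.

```python
# A hedged illustration of single-task NLP models, assuming the open-source
# Hugging Face "transformers" library (pip install transformers torch).
from transformers import pipeline

# Each pipeline below handles exactly one NLP function.
classify = pipeline("sentiment-analysis")   # classification
recognize = pipeline("ner")                 # entity recognition
answer = pipeline("question-answering")     # question answering

print(classify("This search update makes results far more useful."))
print(recognize("Google released BERT at the end of October."))
print(answer(question="What did Google release?",
             context="Google released BERT at the end of October."))
```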

Meet BERT

BERT stands for Bidirectional Encoder Representations from Transformers. It is an open-sourced, neural network-based technique for pre-training language models, one that enables anyone to train their own search system. Simply put, BERT is a "pre-trained unsupervised natural language processing model." Built on transformers (models that process each word in relation to all the other words in a sentence), BERT is trained in the art of understanding context within a search query. This is particularly useful for longer, more conversational searches.
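As a concrete illustration, the sketch below loads a pre-trained BERT checkpoint and encodes a query so that every word receives a context-aware representation. It assumes the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint, neither of which is part of Google Search itself; this shows the technique, not Google's production code.

```python
# A minimal sketch, assuming the open-source Hugging Face "transformers"
# library and PyTorch are installed (pip install transformers torch).
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Encode a conversational query; BERT reads the whole thing at once,
# so each token's vector reflects every other word around it.
inputs = tokenizer("do bank tellers stand a lot at work", return_tensors="pt")
outputs = model(**inputs)

# One 768-dimensional contextual vector per token (plus [CLS] and [SEP]).
print(outputs.last_hidden_state.shape)  # torch.Size([1, number_of_tokens, 768])
```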

How, though, does a computer get so smart? The simple explanation is a much bigger computer with a lot more power and a lot more text. First, the computer must translate "English" into "machine." BERT learns by taking text of arbitrary length and encoding it into what is called a vector: a fixed-length string of numbers. The training text was taken from Wikipedia, and once encoded, the model was trained through a technique called "masking." Masking hides a random word within a sentence and asks the computer to identify what that word could be. BERT, being bidirectional, looks at the words both before and after the masked word to predict what it is. This process is repeated over and over until the model becomes excellent at predicting masked words. Then, it is fine-tuned to perform eleven of the most common NLP tasks. The result is a natural language processing model that captures the intent of a search by analyzing prepositional words like "to" or "for" rather than relying solely on base words like "traveling," "abroad," or "tourism."
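The masking step itself is easy to demonstrate. The sketch below assumes the open-source Hugging Face transformers library and its fill-mask pipeline (an illustration of the technique, not Google's actual training setup): it hides one word and lets a pre-trained BERT guess it from the context on both sides.

```python
# A rough demonstration of masked-word prediction, assuming the open-source
# Hugging Face "transformers" library (pip install transformers torch).
from transformers import pipeline

# Load a pre-trained BERT checkpoint with a fill-mask head.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Hide one word; the model uses the words on BOTH sides of [MASK] to guess it.
for prediction in fill_mask("He deposited the check at the [MASK] on Friday."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

During pre-training, billions of guesses like this, each scored against the hidden word, are what teach the model its sense of context.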

With all eleven natural language processing functions and BERT's bidirectional reading applied, searchers are provided with the most relevant and useful information at the top of a Search Engine Results Page (SERP). BERT further narrows the results shown in SERP features (special results on SERPs intended to offer users helpful information with minimal clicks) and featured snippets. (Featured snippets provide direct answers to a question on a SERP without the user having to click through to a specific result.) For example, take a query such as "international wedding criteria from US to Mexico 2019." Prior to BERT, the word "to" would have been overlooked, yielding many results about traveling from Mexico to the US for marriage rather than answering the intended question. With BERT applied, however, the first answers on the results page match the context of the request: "How to get married in Mexico – list of requirements for US citizens." The information is far more relevant.

Let's look at another search example: "do bank tellers stand a lot at work." Without BERT, the search engine would grasp at straws trying to match keywords, yielding results that may not apply. BERT, however, understands the context of each keyword, so "stand" carries only the meaning its context implies: the physical demands of the job.
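That last point, the same word meaning different things in different sentences, is exactly what contextual vectors capture. The sketch below compares BERT's vectors for "stand" across three invented sentences; it assumes the Hugging Face transformers library and PyTorch, so it illustrates the technique rather than anything in Google's ranking stack.

```python
# A sketch of context-dependent word vectors, assuming the open-source
# Hugging Face "transformers" library and PyTorch (pip install transformers torch).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# "stand" as a physical act versus "stand" as tolerating something.
physical = vector_for("bank tellers stand behind the counter all day", "stand")
tolerate = vector_for("i cannot stand waiting in long lines", "stand")
endure = vector_for("she could not stand the constant noise", "stand")

cos = torch.nn.functional.cosine_similarity
# The two "tolerate" senses should be more similar to each other
# than either is to the physical, on-your-feet sense.
print("tolerate vs endure: ", cos(tolerate, endure, dim=0).item())
print("physical vs tolerate:", cos(physical, tolerate, dim=0).item())
```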

How to Optimize for BERT

Technicalities aside, what does this mean for you? The experts suggest "not much." There really isn't a way to game the system to enhance your SEO. Writing valuable, up-to-date content is the best thing you can do for your customers. In fact, all the hype around BERT seems to be just that: hype. Leaders in the NLP industry, while very excited about the possibilities BERT presents, are still working to improve the contextual understanding of such models. When asked where the industry goes from here, experts predict they will be working along that same trajectory for quite some time.