Serpstat updates · 8 min read

What You Need to Know About Google BERT Update

Alex Danilin
Joolie Varghese, Digital Marketing Expert at Web Destiny Solutions
The search engine uses a combination of algorithms and ranking signals to position web pages in the SERP. Unlike in previous years, Google now makes a large number of changes to its algorithms every year. The most recent significant update was BERT, which rolled out in October 2019. Google reports that it affects about 10% of all queries, making it the biggest update since RankBrain.

What does BERT mean?

BERT stands for Bidirectional Encoder Representations from Transformers. Unlike earlier models, BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. In other words, BERT enables the machine to comprehend what the words in a sentence mean while taking the full context into account. As a result, the pre-trained model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications.
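The idea of "jointly conditioning on both left and right context" can be illustrated with a deliberately tiny sketch. This is not BERT itself — just a frequency count over a made-up three-sentence corpus that, like BERT's masked-language-model pre-training objective, guesses a hidden word from the neighbors on both sides:

```python
from collections import Counter

# Toy masked-word prediction: guess a hidden word from BOTH of its
# neighbors, using counts from a tiny invented corpus. BERT learns a far
# richer version of this with a neural network; the sketch only shows why
# conditioning on both sides is informative.

corpus = [
    "what is your date of birth",
    "the date of the meeting changed",
    "we went on a date last night",
]

# Count each word by its (left neighbor, right neighbor) pair.
pair_counts = {}
for sentence in corpus:
    toks = sentence.split()
    for i in range(1, len(toks) - 1):
        key = (toks[i - 1], toks[i + 1])
        pair_counts.setdefault(key, Counter())[toks[i]] += 1

def fill_mask(left, right):
    """Most frequent word seen between `left` and `right` in the corpus."""
    counts = pair_counts.get((left, right))
    return counts.most_common(1)[0][0] if counts else None

print(fill_mask("your", "of"))  # prints 'date'
```

A left-to-right model would only see "what is your ___" at prediction time; knowing that "of" follows the blank is what makes "date" the obvious completion.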

For example, consider the following two sentences:

Sentence 1: What is your date of birth?

Sentence 2: I went out for a date with John.

The meaning of the word date is different in these two sentences.

Unidirectional contextual models generate a representation of each word based only on the words that precede it, i.e., they would represent 'date' using 'What is your' but not 'of birth.' BERT, however, represents 'date' using both its previous and next context: 'What is your ... of birth?'
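A minimal sketch makes the contrast concrete. Real contextual models use learned vectors, but even the raw token windows (invented helper functions below, for illustration only) show that it is the right-hand context that separates the two senses of "date":

```python
# Illustrative only: contrast a left-only context window with a two-sided
# one. The right-hand words ('of', 'birth') vs. ('with', 'john') are what
# distinguish the calendar sense of "date" from the social sense.

def left_context(tokens, i, window=3):
    """Unidirectional: only the words before position i."""
    return tuple(tokens[max(0, i - window):i])

def both_contexts(tokens, i, window=3):
    """Bidirectional: the words before AND after position i."""
    return (tuple(tokens[max(0, i - window):i]),
            tuple(tokens[i + 1:i + 1 + window]))

s1 = "what is your date of birth".split()
s2 = "i went out for a date with john".split()
i1, i2 = s1.index("date"), s2.index("date")

print(both_contexts(s1, i1))  # (('what', 'is', 'your'), ('of', 'birth'))
print(both_contexts(s2, i2))  # (('out', 'for', 'a'), ('with', 'john'))
```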

Google has displayed several examples regarding this search update. One such example is the query: "do estheticians stand a lot at work."
"Previously, our systems were taking an approach of matching keywords, matching the term 'stand-alone' in the result with the word 'stand' in the query. But that isn't the right use of the word 'stand' in context. Our BERT models, on the other hand, understand that 'stand' is related to the concept of the physical demands of a job, and display a more useful response," Google stated.

How will BERT affect SEO?

Digital marketing services are meant to make websites better for search engines, so any algorithm update influences the entire process. But unlike the Penguin or Panda updates, BERT does not judge web pages either positively or negatively. It is entirely about improving Google Search's understanding of human language.

BERT has the following effects:
Coreference resolution: the process of determining whether two expressions in natural language refer to the same real-world entity. BERT helps Google keep track of entities when pronouns and noun phrases refer to them. This is particularly important for longer passages where multiple entities are referenced in the text.
Polysemy resolution: when a word or phrase has many related meanings, that's called polysemy. For example, the verb "to get" can mean "procure" (I'll get the drinks), "become" (she got scared), or "understand" (I get it). BERT helps Google Search understand text cohesion and disambiguate words in phrases and sentences.
Homonym resolution: homonyms are words that sound alike or are spelled alike but have different meanings. Similar to polysemy resolution, words with multiple meanings, like "to," "two," and "too," or the different senses of "stand," illustrate the nuance that had previously been missed or misinterpreted in search.
Named entity determination: a subtask of information extraction that seeks to locate and classify named entity mentions in unstructured text into predefined categories. BERT helps when a recognized named entity could be one of several entities sharing the same name.
Textual entailment: in natural language processing, textual entailment is a directional relation between text fragments. BERT can recognize several formulations of the same meaning, aligning queries phrased in one way with answers that amount to the same thing. This improves the ability to predict "what comes next" in a query exchange.
Questions and answers: questions will be answered more accurately in SERPs, which may reduce the CTR of websites. As language understanding improves, the better paraphrase understanding brought by BERT may also affect related queries in People Also Ask.
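As a point of contrast with neural models, the polysemy and homonym cases above can be sketched with a classic non-neural baseline: a simplified Lesk-style overlap between the sentence and hand-written sense glosses. The senses and gloss words below are invented purely for illustration:

```python
# Simplified Lesk-style word-sense disambiguation: pick the sense whose
# hand-written gloss shares the most words with the sentence context.
# The sense inventory and gloss words are made up for this example.

SENSES = {
    "date": {
        "calendar": {"day", "month", "year", "birth", "calendar"},
        "outing":   {"romantic", "dinner", "went", "john", "evening"},
    },
}

def disambiguate(word, sentence_tokens):
    """Return the sense of `word` with the largest gloss/context overlap."""
    context = set(sentence_tokens)
    return max(SENSES[word], key=lambda s: len(SENSES[word][s] & context))

print(disambiguate("date", "what is your date of birth".split()))       # calendar
print(disambiguate("date", "i went out for a date with john".split()))  # outing
```

BERT-style models learn this kind of disambiguation from data rather than from hand-written glosses, which is why they scale to the open-ended vocabulary of real search queries.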

How to handle the BERT update?

According to Google's Search Liaison Danny Sullivan, "There's nothing to optimize for with BERT, nor anything for anyone to be rethinking." BERT is about providing better search results to the user, so a page simply needs good, relevant content to be ranked as relevant.

Here are some ways to handle the BERT update:
Create compelling and relevant content
Users are more likely to visit websites where they receive exact answers to their queries.
For example, a user searching for "home remedies for dandruff" expects a web page that shares tips and home remedies for dandruff, not a website that sells anti-dandruff shampoo or medicine.
A product can still be advertised, but the accuracy of the search results should not be compromised: format and arrange the content so that the main focus stays on the home remedies.
Focus on Context Rather Than Keyword Density
Keyword density carries almost no weight now, so stuffing keywords everywhere in the content will not make a significant impact. The main focus should be on context, which is determined by relating a word to the other words in the sentence, including prepositions and the preceding and succeeding words. Therefore, it is best to structure your content around how you would address a user's query: try to solve the user's problem with in-depth, direct answers that match their intent.
Long-tail and short-tail phrases
There is a lot of confusion around choosing between long-tail and short-tail keywords. Relevant information is often found in longer content, and BERT evaluates content at the sentence or phrase level, which has its limitations. People tend to search in natural language, and the exact length of a search query cannot be predicted. If the content is good and answers specific questions, the length of the phrase does not matter.
Website owners are now looking for ways to optimize their sites without being affected by the BERT update. But this is not the right way to deal with the algorithm, as Google itself has stated that there is no real way to optimize for BERT. The update was designed to help Google better understand a user's search intent when they search in natural language. Websites should provide relevant, accurate information written for real people rather than machines. If these principles are followed, the site is already optimized for the BERT algorithm.