Google BERT: Everything you need to know about Google’s BIGGEST Update Since RankBrain

Google Updates

The world of SEO is an ever-changing landscape. And experienced agencies know that just as soon as they think they’ve found their footing, they can bet that the ground will lurch beneath their feet. In their quest to bring high-quality, relevant content to users, search engines (oh, who are we kidding? We’re talking about Google) frequently roll out and refine new algorithms. Just as soon as we’ve all gotten used to RankBrain (has it really been five years already?) there’s already a game-changing new kid on the block. He began to roll out towards the end of last week and will be fully live shortly.

Let’s take a few minutes to get to know this new kid a little better and try to understand his implications for SEO practices for agencies and businesses alike.

Meet BERT

So, who is BERT? Essentially, BERT represents a shift towards trying to better understand us, and our queries. BERT stands for Bidirectional Encoder Representations from Transformers. Despite the pop sci-fi acronym, and Google’s promises that this is one of the biggest algorithms to impact the world of search for years, its function is actually very similar to RankBrain. At least, insofar as both are machine learning algorithms designed to help the search giant better understand the content on the page and the search queries that lead users there.

The principal difference is BERT’s emphasis on the kind of conversational and natural language that many Google users use in their searches. Rather than picking a few keywords out of a conversational question or proposition and searching accordingly, BERT has a much better understanding of context and nuance. This “neural matching” is what marks the most fundamental step up from RankBrain. It’s also what makes BERT such a potential game-changer.

How BERT works, an illustrated example

BERT’s big advantage is its ability to leverage context to make use of semantic groupings of keywords. Even if a keyword isn’t present in the search query, BERT’s understanding of context may still draw on it to bring a higher-quality, more relevant answer to the SERP.

Writing for Search Engine Roundtable, expert Barry Schwartz uses a common problem people have with new TVs as an illustrative example of how BERT works.

Much to the chagrin of many filmmakers, a lot of newer TVs have a mode colloquially known as the “soap opera effect”. This mode adds extra frames to make movement on screen look more fluid, but in reality the effect makes even big-budget blockbusters look cheap and uncanny, much like a soap opera.

Recognising that this is the common denominator in many queries pertaining to TV picture quality, BERT is able to take a generic query like “why does my TV look strange?” and extrapolate that the “soap opera effect” may be culpable, even though the keyword “soap opera effect” isn’t part of the query.

This example rather neatly illustrates the algorithm’s understanding of context as well as its increased understanding of the conversational nature of a growing number of search queries.
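The intuition behind that example can be sketched in a few lines of code. The following is a toy illustration only, assuming a hand-made concept map — it is emphatically not how Google or BERT actually works — but it shows the difference between matching literal keywords and matching the *concepts* a query evokes:

```python
# Toy illustration (NOT Google's implementation): contrasting literal
# keyword overlap with matching on the concepts a phrase evokes.
# The CONCEPTS map below is entirely hypothetical, for demonstration only.
CONCEPTS = {
    "tv": {"television", "picture"},
    "strange": {"unnatural", "smooth-motion", "picture"},
    "soap": {"smooth-motion"},
    "opera": {"smooth-motion"},
    "effect": {"picture"},
}

def expand(text):
    """Return the set of concepts evoked by the words in `text`."""
    concepts = set()
    for word in text.lower().split():
        concepts |= CONCEPTS.get(word, set())
    return concepts

def keyword_score(query, doc):
    """Old-style matching: count literal words shared by query and page."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def semantic_score(query, doc):
    """Concept matching: overlap can be non-zero with zero shared words."""
    return len(expand(query) & expand(doc))

query = "why does my tv look strange"
page = "the soap opera effect explained"

print(keyword_score(query, page))   # 0 — no literal keywords in common
print(semantic_score(query, page))  # 2 — "smooth-motion" and "picture" overlap
```

A pure keyword matcher scores the “soap opera effect” page at zero for that query; a matcher that understands what the words are *about* connects the two. BERT achieves this with learned contextual representations rather than a lookup table, but the effect on search results is the same in spirit.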

BERT is big!

Google have stated in no uncertain terms that BERT’s presence represents a giant leap forward in the realm of search. Indeed, the algorithm’s announcement promised the “biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search”. The company has disclosed that it has already been testing BERT for some time, and sees significant benefits to using it. 

BERT is currently helping out with a massive 10% of all US English language queries and will soon be rolled out in a number of languages all over the world. 

BERT and featured snippets

Another place in which BERT’s presence will be felt is featured snippets. These are already a fantastic resource for getting users the answers to their queries without them even needing to navigate away from the SERP.

These snippets will potentially be higher quality and richer in relevant detail. But will click through rates be negatively affected as a result? Only time will tell.

So, is RankBrain dead?

No, not at all. BERT’s function is to supplement RankBrain and other algorithms rather than supplant them. 

Can you / should you optimise for BERT?

It’s about time we addressed the question that all agencies will be asking. Can you (or should you) optimise for BERT? While this is always a well-intentioned question, it’s one with tricky implications. Writing to optimise for any algorithm can lead agencies and copywriters down a slippery slope.

Try too hard to write for the search engine and the quality of your content inevitably suffers. It’s a road that leads to keyword stuffing and other dodgy practices that reduce the value of your content for the people who really matter… your readers.

While BERT is without doubt a big deal in the world of search, its advent marks an effort to make the search engine better understand page content rather than the other way around.

Early reactions to BERT

BERT is still in its infancy, so its impact on the wider SEO community has yet to be quantified. However, tweets in one of the few threads pertaining to the new algorithm make the point eloquently: when agencies and copywriters stick to known best practice, the rest pretty much sorts itself out.

In a nutshell

BERT is an important step forward in helping Google to better understand users’ queries and the language that they use when interrogating search engines. However, early chatter within the SEO community has largely centred on wondering why its effects haven’t been felt. The principal reason, of course, is that agencies don’t really need to “do” anything outside of the realm of generally accepted best practice.

This is most certainly one of those “keep calm and carry on” scenarios. If you’re writing high-quality, relevant content, you don’t need to do anything special to flag it for Google.

BERT will help it find you!

 
