Google BERT: Google calls this one of the biggest search algorithm updates in recent years. Using new neural network techniques to better understand query intent, Google says it can now return more relevant results for roughly one in ten English-language searches in the US (with support for other languages and regions to follow).
For featured snippets, the update is already live globally.
Updates to the search algorithm are usually subtle, so one that affects 10% of searches is a big deal (and is certainly keeping SEO experts around the world on alert).
Google says the update works best for longer, more conversational queries, so searching should feel more natural: you can write a complete sentence rather than a string of keywords.
The technique behind this neural network is called Bidirectional Encoder Representations from Transformers, which we will simply refer to as BERT.
Google first talked about BERT last year and open-sourced the code along with pre-trained models.
Transformers are one of the more recent developments in machine learning. They work particularly well on data where the ordering of elements matters, which makes them a natural fit for natural language and, therefore, for search queries.
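The order-sensitivity of Transformers comes from self-attention: every position in a sequence looks at every other position, in both directions at once, and blends in what it finds. The sketch below is a minimal, illustrative NumPy version of scaled dot-product attention; the variable names and toy embeddings are ours, not anything from Google's or BERT's actual implementation.

```python
# Minimal sketch of scaled dot-product self-attention, the core
# operation inside a Transformer. Illustrative only: a real model
# uses learned query/key/value projections and many stacked layers.
import numpy as np

def self_attention(x):
    """x: (seq_len, d) matrix of token embeddings.
    Returns context-aware embeddings of the same shape."""
    d = x.shape[-1]
    # Score how much each token should attend to every other token.
    scores = x @ x.T / np.sqrt(d)
    # Softmax over each row so attention weights sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the whole sequence,
    # drawing on context to the left and right simultaneously.
    return weights @ x

# Three toy "token" embeddings in a 2-dimensional space.
tokens = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = self_attention(tokens)
print(out.shape)  # (3, 2): same shape, but each row now mixes all positions
```

The "bidirectional" in BERT's name refers to exactly this property: unlike left-to-right language models, every token's representation is informed by the words on both sides of it.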
The BERT update also marks the first time Google is using its latest Tensor Processing Unit (TPU) chips to serve search results.
Ideally, this means that Google Search will now better understand exactly what you are looking for and return more relevant results and featured snippets.
The Google BERT update launched this week, so you may already be seeing some of its effects in your search results.