Google BERT: Google has called this one of the biggest updates to its search algorithm in recent years. By using new neural network techniques to better understand query intent, Google says Search can now surface more relevant results for roughly one in ten English-language queries in the U.S., with support for other languages and locales as well.
For featured snippets, the update is already live globally.
Updates to the search algorithm are often very subtle, so an update that affects 10% of searches is a big deal (and one that certainly keeps SEO experts around the world on alert).
Google says the update works best for longer, more conversational queries, so searching should feel more natural: you can type a complete question instead of stringing keywords together.
The technology behind this new neural network is called “Bidirectional Encoder Representations from Transformers,” which we will simply refer to as BERT.
Google first presented BERT last year and open-sourced the code along with pre-trained models.
Transformers are one of the most recent developments in machine learning. They work particularly well for data where the sequence of elements is important, which obviously makes them a very useful tool for working with natural language and therefore search queries.
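To make the idea concrete: the core operation inside a Transformer is self-attention, in which every token in a sequence attends to every other token, both to its left and to its right; this bidirectionality is the “B” in BERT. The sketch below is a toy illustration in Python with NumPy, using random made-up vectors, not Google's implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence X.

    Every row (token) mixes information from every other row, left AND
    right -- the bidirectional context that BERT relies on.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) attention logits
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 4 "tokens" with 8-dimensional embeddings (random, illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)             # (4, 8): one context-aware vector per token
print(weights.sum(axis=-1))  # each token's attention weights sum to 1
```

The key property for search is that each output vector depends on the whole query, so the representation of a word like “to” or “for” changes with its surroundings instead of being treated as a throwaway keyword.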
This BERT update also marks the first time Google has used its latest Tensor Processing Unit (TPU) chips to serve search results.
Ideally, this means that Google Search will now better understand exactly what you are looking for and provide more relevant search results and featured snippets.
The Google BERT update rolled out this week, so you may already notice its effects in your search results.