Nicolas Nguyen, SEO expert and co-founder of Semji, presented the Google BERT update and its impact on web content writing in his latest webinar. We’ve decided to transcribe it so you can discover how to write for SEO in the age of Google BERT.
What is BERT?
On December 9, Google announced the launch of BERT in French, a few weeks after the English announcement. What’s behind the name BERT? The acronym stands for “Bidirectional Encoder Representations from Transformers.” It’s the name of a language model from the Natural Language Processing (NLP) community.
There are several key points to remember about BERT:
The first is that it is a language model, and therefore an NLP model (see definition below), created by Google itself. The model is open source and has been in use since 2018, i.e. a year before it became an update. Google announced this update as its biggest in five years, the most significant since RankBrain.
Google also announced that BERT would impact 10% of queries. A closer look at these impacts shows that long-tail keywords are affected the most.
The takeaway: Google BERT is both a language model and a Google update.
From a purely language model perspective, BERT has made headlines within the natural language processing community. Why? Because it performed better than all other known algorithms.
Today, BERT will:
- better understand written content
- know how to answer questions
- be better at sentiment analysis
- detect whether two sentences have the same or similar meaning
- evaluate the logical sequence of sentences (for example: “I write a first sentence, then a second, but does the second sentence really follow from the first?”)
- generate content
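To make the first capability concrete, here is a minimal sketch of BERT filling in a masked word from its bidirectional context. It assumes the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint, neither of which is mentioned in the webinar; this is purely an illustration, not part of Google Search itself:

```python
# Minimal sketch: BERT's masked-language-model training objective lets it
# predict a hidden word using the context on BOTH sides of the blank
# (the "bidirectional" in BERT).
# Assumes the Hugging Face `transformers` library is installed
# (pip install transformers) and downloads `bert-base-uncased` on first run.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT must combine the left context ("The capital of France") and the
# right context ("is a large city") to rank candidate tokens for [MASK].
predictions = unmasker("The capital of France, [MASK], is a large city.")

for p in predictions[:3]:
    # Each prediction carries a candidate token and a confidence score.
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

Because the model reads the whole sentence at once rather than left-to-right, it can use words that appear after the blank, which is exactly the kind of contextual understanding the update brings to query interpretation.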
What’s important to consider today is that BERT is part of a long line of language model evolution. Indeed, in 2013, Google had already created WORD2VEC. In recent years, we can see that there have been many developments, and it seems that this isn’t going to stop.
Why all this development around language models? Large companies like Google, Microsoft, Facebook, Apple, etc., are racing to develop algorithms and language models with a single goal: to understand content like humans do.
The other part, which concerns us much more, is the Google update itself. What’s important is that BERT is the latest in a long series of Google updates around content, always with the same challenge: surfacing content that is relevant to a user’s query. This remains Google’s mantra and number-one value proposition, and improving this understanding of content in order to highlight relevant results is in everyone’s interest.
During its communication, Google showed us several quite interesting examples. In one, an internet user typed a query like “Brazilian traveler 2019 to the USA needs a visa,” and we can see the before/after. Previously, Google surfaced content aimed not at Brazilian travelers but at Americans, which was not relevant. Today, Google highlights content from the US Embassy in Brazil.
Another example uses the query “Can I pick up medication at the pharmacy for someone else?”. Before, the results covered how to fill a prescription in general; today, the content specifically answers the question. Note that both of these examples involve long-tail keywords, and in particular question-type queries.