Dawn Anderson runs an agency in the UK named Bertey, and it had that name before Google's BERT came out. The agency is actually named after her dog, Bertey, so it really has nothing to do with Google's BERT natural language model.
Dawn started in SEO in 2006 and first became interested in how crawling works a few years ago, before she moved on to information retrieval. From there she became very interested in the science behind search and started following everything she could find on the topic. She began reading papers on Google Scholar and elsewhere, and then attending IR conferences. She said it isn't necessarily hard, it just takes time; that is her being modest.
Google BERT is a hot topic, so I had her explain it. BERT is an open source natural language training model. Google's BERT update is an algorithmic application of BERT that helps Google contextually understand your queries better. She explained how BERT is trained by feeding it sentences, and how it differs from earlier models through bidirectional learning: when predicting a masked word, it can look at the words on both sides of it, not just the words that come before.
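To make the bidirectional idea concrete, here is a toy sketch, not actual BERT code: the function name and the example sentence are invented for illustration. It only contrasts which words a left-to-right model can see when filling in a masked word versus what a bidirectional model like BERT can see.

```python
# Toy illustration (not real BERT): compare the context available to a
# left-to-right model vs. a bidirectional model when predicting a
# masked word. Function name and sentence are made up for this example.
def visible_context(tokens, mask_index, bidirectional):
    """Return the tokens a model can see when predicting tokens[mask_index]."""
    left = tokens[:mask_index]
    right = tokens[mask_index + 1:]
    return left + right if bidirectional else left

sentence = ["the", "bank", "of", "the", "[MASK]", "was", "muddy"]
mask = sentence.index("[MASK]")

# A left-to-right model only sees the words before the mask.
print(visible_context(sentence, mask, bidirectional=False))
# A bidirectional model also sees "was muddy", which helps it guess
# the masked word is something like "river" rather than a financial term.
print(visible_context(sentence, mask, bidirectional=True))
```

The point of the sketch is that the words after the mask ("was muddy") disambiguate "bank" toward a riverbank reading, which is exactly the kind of context a left-to-right model misses.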
Bing also uses BERT, and technically has been using it since before Google. But Bing says it uses BERT for 100% of queries, while Google says it uses it for 10% of queries. I asked her why she thinks Bing applies it to 100% and Google only 10%, and she thinks it comes down to cost. New versions of BERT are coming out, such as ALBERT and others, that might make it less expensive to run.
Dawn said you…