Pear Analytics Resources

Marketing Study, 16 March 2022

Google Explains How AI Protects Search Using MUM and BERT
Using machine learning to enhance its comprehension of language allows Google to recognize when search results should contain phone numbers for appropriate crisis lines, for example.

“MUM can better comprehend the purpose behind people’s inquiries to recognize when a person is in need,” said Pandu Nayak, Google’s vice president of Search, adding that this enables Google to “present trustworthy and actionable information at the correct moment.”

Google intends to implement these changes in the coming weeks.
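To illustrate the general idea, and only the idea, since Google’s production system is not public, a crisis-intent check of this kind can be sketched with an off-the-shelf zero-shot classifier. The model checkpoint and label set below are assumptions chosen for the example, not anything Google has disclosed.

```python
# Illustrative sketch only -- not Google's production system.
# Assumes the Hugging Face transformers library and the public
# facebook/bart-large-mnli checkpoint (both assumptions for this example).
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

query = "I can't cope anymore and don't know who to talk to"
labels = ["personal crisis", "medical information", "general question"]  # hypothetical label set

result = classifier(query, candidate_labels=labels)
print(result["labels"][0], round(result["scores"][0], 3))
# If the top label is "personal crisis", a search system could attach
# crisis-line phone numbers to the results page.
```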

Google has cut shocking search results by 30%

Unexpected search results are rarely pleasant — and in certain cases, they may be damaging and distressing.

That’s why it’s critical that Google can better understand each searcher’s intent so that the results they see match their expectations.

SafeSearch mode allows users to exclude explicit results. However, there are times when explicit content is precisely what a person is searching for.

“BERT has enhanced our knowledge of whether queries are searching for explicit content, allowing us to significantly lower your odds of getting unexpected search results,” Nayak said.
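As a rough sketch of what such a query classifier looks like in code (Google’s actual models and training data are not public), the snippet below attaches a two-way classification head to a public BERT checkpoint. The head here is randomly initialized and would need fine-tuning on labeled queries before its outputs meant anything.

```python
# Minimal sketch of a BERT-based query classifier. The classification
# head is newly initialized and would need fine-tuning on labeled
# query data before use -- this only shows the model's shape.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. 0 = non-explicit intent, 1 = explicit intent
)
model.eval()

inputs = tokenizer("example search query", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
print(probs)  # shape (1, 2): probability per class
```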

Google will use MUM to tackle spam in several languages

Google employs artificial intelligence (AI) to eliminate spam and unhelpful results in a variety of areas.

In the coming months, it will put MUM to work scaling these safety measures even when there is minimal training data.

As Nayak stated, “when we train one MUM model to execute a task — like categorizing the type of a question — it learns to do it in all the languages it knows.”
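MUM itself is not publicly available, but the cross-lingual transfer Nayak describes can be demonstrated with an open multilingual model. The checkpoint named below (a community XLM-RoBERTa model fine-tuned on the XNLI dataset) is an assumption for illustration, not Google’s model.

```python
# Illustration of cross-lingual transfer: one multilingual model
# classifies the same kind of query in several languages, even though
# its task-specific training data was mostly English.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="joeddav/xlm-roberta-large-xnli",  # assumed public checkpoint, not MUM
)

labels = ["navigational", "informational", "transactional"]  # hypothetical query types
queries = [
    "how to fix a flat bicycle tire",    # English
    "cómo arreglar una rueda pinchada",  # Spanish
    "comment réparer un pneu crevé",     # French
]
for q in queries:
    print(q, "->", classifier(q, candidate_labels=labels)["labels"][0])
```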

Google told users that the recent improvements have been, and will continue to be, carefully vetted, including review by human search quality raters.

MUM and BERT

Multitask Unified Model (MUM) is a technology Google introduced in May 2021 to answer complex questions that do not have a direct answer. Google presented MUM as combining several innovations, including the ability to work across many languages and across modalities such as text and images.

Bidirectional Encoder Representations from Transformers (BERT) was created and published in 2018 by Jacob Devlin and his colleagues at Google. It is a transformer-based machine learning technique for natural language processing (NLP).
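“Bidirectional” means BERT conditions on context to both the left and the right of each token, which is easiest to see with the masked-language-modeling task it was pretrained on. The snippet below uses the public bert-base-uncased checkpoint via the Hugging Face transformers library as a stand-in.

```python
# BERT's pretraining objective: predict a masked token from the full
# sentence context on both sides of the mask.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

for pred in fill("The doctor prescribed a [MASK] for the infection."):
    print(f"{pred['token_str']}: {pred['score']:.3f}")
# Top predictions typically include words like "medication" or
# "treatment", driven by context words on both sides of the mask.
```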