MUM, or Multitask Unified Model: Taking Your Google Search to the Next Level

Published May 17, 2022


Suppose you are a NASCAR driver, and now you want to compete in a drag race. What should you do differently? Today, Google can help you with this, but it would take a bunch of thoughtfully constructed searches: you'd have to look up the difference in the tracks, the number of cars competing in the race, the difficulty of drag racing compared to stock car racing, the right cars and gears to use, and more. After a number of searches, you'd probably get the answer you need. But if you were talking to a racing expert, you would only have to ask one question: "What do I need to do differently to prepare?" You would get an answer that considers the nuances and context of the two situations and gives you guidance on the things you need to take into account for the task at hand.

These types of situations are not unique. In fact, Google says that people issue eight queries on average for complex tasks like this. Search engines available today aren't quite sophisticated enough to answer the way an expert would. But with a new technology called Multitask Unified Model, or MUM, Google says that they're getting closer to helping you with these types of complex needs.

Using a Transformer architecture to help when there isn't a simple answer

Google says that MUM, like BERT, is built on a Transformer architecture, but is 1,000 times more powerful. MUM not only understands language but also generates it. According to Google, MUM is "trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models. And MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio."

Transferring knowledge across languages

Suppose there's really helpful information about drag racing written in Japanese; today, you won't find it unless you search in Japanese. But MUM can transfer knowledge from sources across languages and use those insights to surface the most relevant results in your preferred language. Essentially, it uses AI to learn from sources that aren't written in the language of your search and can bring that information to you.
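Google hasn't published MUM's internals, but the general idea behind cross-language retrieval can be illustrated with a toy sketch: a multilingual model maps queries and documents from different languages into one shared vector space, where relevant results land close to the query regardless of language. The document names and embedding values below are made-up numbers for illustration, not output from any real model:

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity: how close two vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings in a shared multilingual vector space.
# A real multilingual Transformer would produce these; the numbers
# here are toy values chosen only to illustrate the ranking step.
query = [0.9, 0.1, 0.2]  # English query: "how to prepare for drag racing"

documents = {
    "en_gardening_tips":     [0.1, 0.95, 0.1],
    "ja_drag_racing_guide":  [0.85, 0.15, 0.25],  # Japanese article, near the query
    "en_stock_car_history":  [0.4, 0.5, 0.6],
}

# Rank documents by similarity to the query, ignoring which language they're in.
ranked = sorted(documents, key=lambda d: cosine(query, documents[d]), reverse=True)
print(ranked[0])  # the Japanese guide ranks first despite the language mismatch
```

The point of the sketch is that ranking happens in the shared vector space, so the document's language never enters the comparison; at Google's scale the same idea would run over billions of documents with model-produced embeddings.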

MUM is Multimodal

This means MUM understands information provided in different formats, such as web pages, images, and more. You could upload a picture of a set of gears and ask whether it would work for modifying your vehicle, and MUM could tell you.

Consequences?

Now, this technology seems very cool and useful at first, but what about the consequences? What will it mean in terms of reduced traffic for websites? When we search on Google with a question, it often takes an answer provided on a website and shows it at the top of the results, so we don't really need to click through to the site that actually provided the answer. That site's traffic drops, and with the introduction of MUM, traffic may decrease even further and websites may start losing money. Will it result in the loss of thousands of online businesses? Will it cause jobs to disappear? Only time will tell.

You can learn more about it on the Google blog. You can also check out the discussion threads on Twitter and Reddit.
