The digital experience has reached new heights in the last few years, thanks to faster internet networks and the unstoppable growth of mobile e-commerce worldwide. Over the past two decades, search engines have gotten better at detecting intent, offering relevant results, and combining different verticals (such as image, video, or local search), but they have all mostly operated in the same way. The basic principle remains unchanged: type in a text query, and the search engine returns a mix of organic links, rich results, and advertisements in answer to it.
Google, too, has been constantly evolving, developing new technologies to return better, more relevant results. Recent advances, such as the transformer-based model BERT, have improved search engines' language processing capabilities, allowing them to better interpret queries and surface more relevant results.
The tech giant recently introduced its Multitask Unified Model (MUM), a system that, according to Google, is 1,000 times more powerful than BERT and combines language understanding with multitasking and other capabilities. Pandu Nayak, Google's vice president of search, shares how MUM could potentially change the way people engage with its search engine. "There is no expectation that it will become this question-answering system," Nayak says, adding that such a system is "just not useful" for complex needs.
In simple words, one can think of MUM as a more advanced version of BERT, especially since Google views it as a similar watershed moment. While both are based on transformer technology, and MUM has BERT's language understanding capabilities built in, there are some notable differences between the two.
MUM is built on a newer design (the T5 architecture) and has far greater capabilities. Its learning is scaled by training across multiple languages. "This is helpful because it allows us to generalise from data-rich languages to languages with a scarcity of data," Nayak said. "[MUM is] trained simultaneously across 75 languages." This could mean that MUM's applications will be easier to carry over into other languages, which in turn might help Google Search gain traction in those markets. Another contrast is that MUM is multimodal: it can accept video and image inputs in addition to text.
In the near term, MUM's goals are primarily focused on knowledge transfer between languages, according to Google. Nayak said that Google wants to create new search experiences while also allowing internal teams to use MUM for their own projects, and discussed the MUM roadmap and what Google is doing to ensure that the technology is used responsibly.
MUM's first public application, in which it identified 800 variations of vaccine names across 50 languages in a matter of seconds, is a fair indication of where the technology stands today, and of where it is headed.