MUM enhancements to Google Lens
This year at I/O, Google announced that it has reached a critical milestone in understanding information with the Multitask Unified Model (MUM).
MUM can simultaneously understand information across a wide range of formats, like text, images and video. It can also draw insights from and identify connections between concepts, topics, and ideas about the world around us.
Google has been experimenting with MUM’s capabilities to make its products more helpful and to enable entirely new ways to search, and is sharing an early look at what will be possible with MUM.
At Search On, Google demonstrated a new way to search with Google Lens: the ability to add text to your visual searches and ask questions about what you see.
So if you see a shirt you like but would prefer the pattern on socks, you can point your camera at the shirt and ask to find the same pattern on socks.
Launching in Google Lens in the coming months, starting in English.
A redesigned Search experience
With the help of advanced AI systems like MUM, Google is redesigning Google Search and introducing new features that enable more natural and intuitive ways to search.
Here are some of the steps Google is taking toward this vision:
Things to know: When you search for a topic, like acrylic painting, you can see all the different dimensions people typically search for and find the path that’s right for you. Google will launch this feature in the coming months. In the future, MUM will unlock deeper insights you might not have known to search for, like “how to make acrylic paintings with household items,” and connect you with content on the web that you wouldn’t otherwise have found.
Refine this search / Broaden this search: These features help you explore information by zooming in to more specific aspects of a topic or broadening out to more general ideas. Launching in the coming months in English.
A visually browsable results page: For searches where you need inspiration or want to explore information visually, Google is introducing a redesigned results page that makes it easy to browse and find what you’re looking for. Available in English in the U.S. when you search for visual ideas.
Deeply understanding videos with MUM
Google is introducing a new MUM-based experience that identifies related topics in a video, even if the topic isn’t explicitly mentioned, and makes it easy for you to dig deeper and learn more.
Launching in the coming weeks in English.
Lens in iOS
Starting soon, iOS users will see a new button in the Google app that makes all the images on a page searchable through Google Lens.
This means you can seamlessly search shoppable images on websites as you browse with Lens in the Google app on iOS.
This will be limited to the U.S. at this time.
Lens in Chrome
Google is also bringing Lens to Chrome on desktop.
Soon, you will be able to select images, video and text content on a website with Lens to quickly see search results in the same tab — without leaving the page you’re on.
This will be available globally in the coming months.
A more shoppable Search experience
Starting today, Google is making it easier to browse for apparel on mobile right from your Search results.
For example, when you search for “cropped jackets,” Google will show you a visual feed of jackets in various colors and styles, alongside other helpful information like local shops, style guides and videos.
This new experience is powered by Google’s Shopping Graph, a comprehensive, real-time dataset of products, inventory and merchants with over 24 billion listings.
This experience is limited to the U.S. at this time.
See in-store inventory… from home
Shoppers are increasingly starting their in-person shopping experience online.
You can now use the “in stock” filter to see whether nearby stores have specific items on their shelves. Say you’re looking for a kids’ bike helmet: selecting the filter shows stores near you that have one in stock, even a specific brand or type.
Launching on September 29 in English in the U.S. and select markets, including the UK, Australia, Austria, Brazil, Canada, Denmark, France, Germany, Japan, Netherlands, New Zealand, Norway, Sweden, and Switzerland.
New additions to About This Result
Google is expanding About This Result panels to include more insights that help you learn about the sources and topics you find on Search.
Starting today, you’ll be able to find new insights about results, including:
More information about the source: In addition to seeing a source description from Wikipedia, you’ll also be able to read what a site says about itself in its own words, when that information is available.
What others have said: Reading what others on the web have written about a site — news, reviews, and other helpful background context — can help you better evaluate sources.
More about the topic: In the “About the topic” section, you can find information such as top news coverage or results about the same topic from other sources.
Launching in the coming weeks in English in the U.S.
Address Maker
Google is helping governments and NGOs provide addresses to people and businesses around the world with Address Maker, which uses the open-source Plus Codes system to create unique, functioning addresses at scale.
In a matter of weeks, Address Maker can get under-addressed communities on the map, unlocking the ability to do things many people take for granted, like voting, opening a bank account, applying for a job, or getting packages delivered.
Governments and NGOs in The Gambia, India, South Africa, Kenya and the U.S. are already using Address Maker, with more partners on the way.
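Plus Codes are an implementation of the Open Location Code scheme, which derives a short alphanumeric code purely from latitude and longitude, so an address can exist even where no street grid does. As an illustration (this is a minimal sketch, not Google’s production code), the standard 10-digit encoding repeatedly divides the lat/lng space into a 20×20 grid:

```python
# Minimal sketch of Open Location Code ("Plus Codes") encoding.
# Illustration only: the official libraries at github.com/google/open-location-code
# also handle input clipping, shortened codes, and decoding.

ALPHABET = "23456789CFGHJMPQRVWX"  # 20 symbols; avoids vowels and look-alikes

def encode(lat: float, lng: float) -> str:
    """Encode a latitude/longitude pair as a standard 10-digit Plus Code."""
    lat = min(max(lat + 90.0, 0.0), 180.0)   # shift latitude into [0, 180]
    lng = (lng + 180.0) % 360.0              # shift longitude into [0, 360)
    code, resolution = "", 20.0
    for step in range(5):                    # 5 digit pairs, each 20x finer
        lat_digit = int(lat / resolution)
        lng_digit = int(lng / resolution)
        code += ALPHABET[lat_digit] + ALPHABET[lng_digit]
        lat -= lat_digit * resolution
        lng -= lng_digit * resolution
        resolution /= 20.0
        if step == 3:                        # '+' separates the final digit pair
            code += "+"
    return code

print(encode(47.365590, 8.524997))  # → 8FVC9G8F+6X (a spot in Zurich)
```

Each added digit pair shrinks the cell by a factor of 20 per axis, so the full 10-digit code pins down a roughly 14 m × 14 m area, precise enough to serve as a delivery address.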
Wildfire Layer in Maps
Last year, Google launched a wildfire boundary map powered by satellite data to help people easily understand the approximate size and location of a fire, right from their device.
Now, Google is expanding that coverage and bringing all wildfire information together in a new layer on Google Maps.
The layer will include emergency websites, phone numbers, and evacuation information from local governments, when they’ve provided it. When available, you can also see details about the fire, such as its containment, how many acres have burned, and when the information was last reported.
Launches globally on Android, iOS and desktop this October.
Tree Canopy Insights
Last year, Google piloted the Environmental Insights Explorer (EIE) Tree Canopy tool in Los Angeles, California.
Tree Canopy data uses aerial imagery and advanced AI capabilities to identify places in a city that are at the greatest risk of experiencing rapidly rising temperatures.
With Tree Canopy data, local governments have free access to insights about where to plant trees in order to increase shade and reduce heat.
Google is now expanding the Tree Canopy tool to more than 100 cities around the globe during the first half of 2022, including Guadalajara, London, Sydney and Toronto.