Google is rolling out a new search feature that combines text and images for a more refined Internet trawling experience.
You might have read our article yesterday about DALL-E 2, the system that can generate images from a few descriptive words. Well, “Multisearch” is not quite that impressive, but it is in a similar vein: it expands your search options by letting words and photographs work together in a single query.
One example Google provides is taking a screenshot of a piece of clothing you like and searching Google for it in a different color.
It can also pair items with one another, as in the dining set and coffee table example Google describes. It can even help you take care of your plants: simply take a picture of the plant and search for “care instructions.”
What powers all of this? The answer shouldn’t come as much of a surprise.
“All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways. We’re also exploring ways in which this feature might be enhanced by MUM – our latest AI model in Search – to improve results for all the questions you could imagine asking.”
“Multisearch” is currently in beta in the United States and can be accessed through the Google app, according to the company’s blog post announcing the feature. The post further notes that, naturally, it works best right now for shopping.
Of course, we would like to know your thoughts on Google’s new “multisearch” in the comments below.
There are plenty of other photography news headlines for you to read on Light Stalking at this link right here.
[Google]