Google has bolstered its visual search function by using artificial intelligence (AI) to automatically assemble multiple images into the AMP Stories format. The company said publishers had already been experimenting with the format, which makes it easier for visitors to browse content in Google Images and Discover.
For now, these AI-generated stories will focus largely on celebrities and athletes. Users simply tap an image to find related articles about its content.
Meanwhile, the company is also bolstering its video options so that users can more easily explore content from videos and surface relevant information.
For example, people planning trips could watch videos of their destination beforehand to get a sense of what to expect. Featured videos now offer deeper integration of important landmarks and surface the most relevant subtopics for a given search.
Google has also changed the Google Images ranking algorithm to weight the authority and relevance of a hosting page more heavily. Effectively, this means the webpages behind image results will be more likely to relate to what was searched for in the first place.
Additionally, Google has added context around images, such as captions and suggestions for related searches.
Meanwhile, Google Lens’ AI technology will be able to analyse images and detect objects of interest within them. Effectively, this means Google Lens now links to relevant product pages and lets users draw on part of an image to search for just that region.