Google now builds its own AMP stories for search


Google is now using artificial intelligence to “intelligently construct” AMP stories for search. These new AMP stories appear in knowledge panels in search results for actors, athletes, musicians and other famous people.

Google said on Monday that it is “starting today with stories about notable people — like celebrities and athletes — providing a glimpse into facts and important moments from their lives in a rich, visual format. This format lets you easily tap to the articles for more information and provides a new way to discover content from the web.”

The issue is that when I try this in mobile Safari or mobile Chrome on my iPhone, the stories are not interactive. They do work in the Google Search app on iOS.

How AMP stories in search work: Google has been incorporating more images and video into its search results for years, and these AMP stories are a continuation of that effort.

Below is an example of how to trigger a Google-constructed AMP story with a search for Michael Jordan: just click the “start story” button in the knowledge panel.

Here is a GIF of the whole process in action.

When it fails to work: This feature does not seem to work (yet) in Chrome or Safari on iOS, and I confirmed with others that it currently fails in those browsers. I was able to trigger it using the Google Search app on iOS. Hopefully Google will fix this soon, but for now all I see in the mobile browsers is the first screen, with instructions asking me to tap to advance to the next screen. Tapping did nothing.

How can I optimize for AMP stories? At this point, it is unclear how to get your content to show up in these Google-constructed AMP stories. Google said the stories can appear in Search, Google Images and Discover. It may help to have AMP-formatted content, but that does not appear to be a requirement; for what that format actually looks like, see the sketch below.
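For reference, here is what “AMP-formatted content” looks like for a story. This is a minimal sketch based on the public AMP documentation for the amp-story component; the URLs, title, publisher name and image paths are placeholder assumptions, and Google has not said that publishing this markup is required for the feature described above.

```html
<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <title>Michael Jordan: Career Highlights</title>
  <link rel="canonical" href="https://example.com/michael-jordan-story.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- The AMP runtime plus the amp-story extension script -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <script async custom-element="amp-story"
          src="https://cdn.ampproject.org/v0/amp-story-1.0.js"></script>
  <!-- The standard required <style amp-boilerplate> block goes here (omitted for brevity) -->
</head>
<body>
  <!-- "standalone" marks a full-page story; title, publisher and the two
       image attributes are required metadata on amp-story -->
  <amp-story standalone
             title="Michael Jordan: Career Highlights"
             publisher="Example Publisher"
             publisher-logo-src="https://example.com/logo.png"
             poster-portrait-src="https://example.com/poster.jpg">
    <!-- Each amp-story-page is one full-screen, tappable "page" of the story -->
    <amp-story-page id="cover">
      <!-- A "fill" layer stretches its single child to cover the page -->
      <amp-story-grid-layer template="fill">
        <amp-img src="https://example.com/jordan.jpg"
                 width="720" height="1280" layout="responsive"></amp-img>
      </amp-story-grid-layer>
      <!-- A "vertical" layer stacks text elements on top of the background -->
      <amp-story-grid-layer template="vertical">
        <h1>Michael Jordan</h1>
        <p>Six NBA championships with the Chicago Bulls.</p>
      </amp-story-grid-layer>
    </amp-story-page>
  </amp-story>
</body>
</html>
```

Treat this purely as background on the format: whether publishing it influences the stories Google constructs on its own is, as noted above, unclear.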

“We’ve been able to do this in part thanks to advancements in computer vision, which help[s] us extract concepts from images,” explained Cathy Edwards, director of engineering for Google Images, in a blog post Monday. “We model hundreds of millions of fine-grained concepts for every image and video that we have in our index. For example, an image of a tiger might generate concepts like ‘feline,’ ‘animal’ or ‘big cat.’ This lets us identify a picture by looking at its pixels, without needing to be told by the words on a page.”


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.




