While it generally makes sense to enrich data on websites with such metainformation, Google made the idea much less appealing to us when they decided they'd steal our merchant reviews in order to display them on their (competing) product search / price comparison pages. Who wants to become a pure content provider (with mostly Googlebot traffic) for Google's future cash cows?
Don't get me wrong - I'm very much in favor of accessibility and mashing up of publicly available information, it would greatly benefit everyone if this was easier. But I'm not going to follow a trend (and instructions) set by a monopolist who has only his own revenue in mind and strongly opposes similar openness for his own data (all of which is actually collected from elsewhere...).
It is hard to get schema properties right, because Google keeps changing how it uses them in its search results.
A year back I optimized some product pages so that they displayed the price and availability in the search results. Very nice. Then, two weeks later, Google changed something and the pages stopped showing their special properties and were back to normal in the search results. No idea why...
Here's a definition from the Google Developers site[1]:
"Schema.org is a markup vocabulary that is standardized and managed as a collaboration of Google and other companies. By working with schema.org, we are creating an open standard, so that the markup you embed can be used by any email product that receives them."[1]
You should look in Webmaster Tools if you want to understand how schema affects search results. I've found that things like Events are well supported and relatively easy to mark up, while things like product offers and reviews are more likely to be ignored (or intermittently held to higher, unpublished standards?) by Google, probably because of their high potential for abuse.
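As an illustration of the Event case, here's a minimal JSON-LD sketch; the event name, date, and venue are invented placeholders, and which properties Google actually requires for rich results has changed over time:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Tech Meetup",
  "startDate": "2015-06-01T19:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Example St",
      "addressLocality": "Springfield"
    }
  }
}
</script>
```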
Personally, I mark up everything I think is meaningful to a human and check that Google considers it validly formed and how they would display it, but I don't rely on whether they do. With good, consistent metadata, client-side code and CSS can be a bit cleaner and better abstracted, and it seems to help a little in the long tail of search engines, aggregators, etc.
It doesn't help that Google's documentation on a lot of this is vague or nonexistent. The best thing Google can do is add more examples to their documentation.
Happened to us too, they changed how it worked so it doesn't show for all websites[1], just some, but we're still not sure if they took it off or if there's a bug in our code.
Worse still, it doesn't really seem to have affected our incoming traffic, so the whole thing was a complete waste of time and trying to fix it is now very low on the to-do list.
Serious question: why would anyone want to do this? The first example on schema.org is for movie information. If you have a website with lots of movie information, why would you make it easier for Google to take your information, cut you out, and answer queries before they even hit your website? It simply is not going to turn out well for you.
But on the flipside, what if Google decides to adjust their algorithm and demote sites that don't use the appropriate Schema to mark up their content? Then Google is still going to cut you out by ranking you lower than your competitors. And actually, I wouldn't be surprised if Google already ranks sites without schema lower than sites with schema.
Personally, I'd rather spoonfeed the info to Google, rank #1, and run the risk of Google cutting me out with Knowledge Graph, or however else they want to use the info, than not spoonfeed the info to Google and rank #10 while my competitors rank higher than me.
Obviously it's not that cut and dried, but it creates an interesting dilemma.
> But on the flipside, what if Google decides to adjust their algorithm and demote sites that don't use the appropriate Schema to markup their content?
Well then you'd deal with that when it happens. Currently you know Google will take the content and present it without hits to your website, but them penalising sites that don't use schema is just theoretical.
> Currently you know google will take the content and present it without hits to your website
In some cases, yes, but not always.
For example, using schema to mark up product reviews and aggregate review scores can add value to an e-commerce site's organic search listings and increase their CTR - assuming the review scores are favorable.
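A minimal sketch of that review markup in JSON-LD, with the product name, rating figures, and price all invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

When Google chooses to honor it, the aggregateRating is what produces the star rating in the search listing.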
In that case, mark up your company information with the name, description, and appropriate location and/or telephone tags and leave it at that. Google will try to place you at some location at some point, and having some control over that is preferable.
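Something along these lines, as a sketch - the business details are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co.",
  "description": "A fictional example business.",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example St",
    "addressLocality": "Springfield",
    "postalCode": "12345"
  }
}
</script>
```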
I've not found that schema.org provides clear guidance over what markup to use, and when. Nor a clear indication of what the value is to sites that do implement it.
Take Hacker News as an example... shared links with comments.
Is the link itself of type WebPage? https://schema.org/WebPage I would've said yes, except that the bit at the top states that all pages are implicitly WebPages and thus implicitly self-describing rather than describing a link elsewhere.
Unless the article/link is a WebPage.significantLink ? But that seems not significant enough for what is essentially the context for the entire page.
Maybe we're linking to an https://schema.org/Article ? So we can mark the link as that. Except you and I know that not all things linked are articles; it feels like a poor fit, especially as the Article type again describes the thing itself and not the link to it.
Alright, let's forget describing content elsewhere and focus on content here on this site... comments.
Should I use https://schema.org/UserComments ? It looks like the right thing, except that it describes a comment relating to (a child of) an event, which isn't a good fit.
But we do have a https://schema.org/Comment which looks like a great match, it even has an upvote and downvote count.
So we have a good match for comments, but not for the links... unless the news articles/posts are in fact just comments themselves? After all, they have an upvote and downvote count and a comment body, and we could always use the `url` of `Thing` for the link.
But in the end I'd probably mark the list of stories as simply a https://schema.org/CreativeWork as at least that has a `discussionUrl`.
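Putting that together, a hedged sketch of what one HN-style story might look like in JSON-LD; the URLs, text, and vote counts are made up, and this is just one reasonable reading of the vocabulary:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "CreativeWork",
  "name": "Example submitted story",
  "url": "https://example.com/story",
  "discussionUrl": "https://news.example.com/item?id=123",
  "comment": {
    "@type": "Comment",
    "text": "Example comment body",
    "upvoteCount": 12,
    "downvoteCount": 1
  }
}
</script>
```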
That works reasonably well (and I know schema.org well enough to do it swiftly), but I've given tasks like this to other devs and authors and seen wildly different results for similar things.
The question I'd ask is: why? Why bother? It seems like a lot of work for answers of inconsistent quality, and for what benefit to the sites that go to the effort?
With little guidance and few examples for really common scenarios (blog, forum, business WordPress site, e-commerce shop, calendar), time and effort is poured in without any understanding of the value to be derived from that investment. It may have potential, but if a client is spending money today, what value is there?
I think you have the wrong perspective on Schema. Although it is sometimes useful to describe what links relate to (both externally and internally), Schema is most useful when content describes itself. E.g., it's not so much saying "This link refers to an Article" but "I am an Article". I very rarely mark up links with Schema, as whatever page I'm linking to describes the content therein.
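The self-describing idea is perhaps clearest with inline microdata rather than a link annotation; a minimal sketch, with the headline and author invented:

```html
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Example headline</h1>
  <span itemprop="author">Jane Doe</span>
  <div itemprop="articleBody">
    The page declares "I am an Article" about itself,
    rather than being described by pages linking to it.
  </div>
</article>
```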
I don't know if there is a definite benefit to implementing Schema, but I've noticed that new websites I have launched seem to rank a lot quicker than without Schema markup. And, I think it looks nice, and provides a better user experience. There could be more guidance on Schema.org, but I think it does a good job. The real problem is getting into the mindset about what Schema is about.
Bottom line: this is an easier way for Google to steal your information. Once they steal it, you will get almost no traffic, as Google will show the answers themselves.