What to Know About Google’s Structured Data

Google recommends that website owners add special code snippets to their pages' HTML to help it sort and categorize their website data. This markup is called structured data and is most commonly written in a format known as microdata.

The format is neither hard to understand nor hard to implement, but it is worth knowing that Google considers its use important and has made it fairly simple for website owners to add these code snippets.

First, not all data on your website can be marked up as structured data. For now, Google only supports markup for products, local businesses (including address, phone, and other contact information), articles, software applications, reviews, events, and movies. Each year Google adds new categories as it expands the types of data it integrates into search results.

Use of special tags:

Coding a review requires special tags that denote the rating, vcard, title, name, locality, and region. This is all part of sorting the data into Google's approved and specific format. Google makes it pretty easy for website owners to start using structured data and has even provided some great online tools.
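To give a sense of what this looks like in practice, here is a minimal sketch of a review marked up with microdata using the schema.org vocabulary. Exact property names vary by vocabulary (some of the tag names above come from older microformats), and the business, reviewer, rating, and location below are hypothetical placeholders:

```html
<!-- Sketch of a review using schema.org microdata.
     All names and values here are hypothetical examples. -->
<div itemscope itemtype="http://schema.org/Review">
  <div itemprop="itemReviewed" itemscope itemtype="http://schema.org/LocalBusiness">
    <span itemprop="name">Acme Bakery</span>,
    <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
      <span itemprop="addressLocality">Portland</span>,
      <span itemprop="addressRegion">OR</span>
    </span>
  </div>
  Reviewed by <span itemprop="author">Jane Doe</span>:
  <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span> stars
  </span>.
</div>
```

The `itemscope`/`itemtype` attributes declare what kind of thing is being described, and each `itemprop` labels one piece of data inside it; Google's Structured Data Markup Helper (linked below) can generate markup like this for you by pointing and clicking.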

Here are a few additional resources for you to consider:

Google’s blog post on the topic:

http://googlewebmastercentral.blogspot.com/2013/05/getting-started-with-structured-data.html

Structured Data Markup Helper:

https://www.google.com/webmasters/markup-helper/

Embedding Structured Data for Gmail:

https://developers.google.com/gmail/actions/embedding-schemas-in-emails

Google Webmaster Tools Data Highlighter:

https://www.google.com/webmasters/tools/home?hl=en

Structured Data Bread Crumb Snippet:

https://support.google.com/webmasters/answer/185417?hl=en

Google States They Do Not Use Facebook or Twitter for Ranking

Matt Cutts, the head of Google's webspam team, revealed just last week that Google is not using Facebook or Twitter posts or profiles for ranking in its index. This is very big news and a change from what Google has previously stated about how social media affects its algorithm.

You can watch the full YouTube video here.

Here’s the bottom-line about using social media and Google rankings:

1. Google is not ranking your site based on the activity you have on social media profiles like Facebook and Twitter.

2. Google does look at links that are shared on these social sites, just as it looks at other content pages when it can crawl the content.

3. Google is wary of using social profiles to establish "identity," since profiles may change or be blocked over time.

4. Google's algorithm does not record the number of likes or followers a social profile has.

5. Matt Cutts states that he personally likes social profiles for sharing and driving traffic, but does not recommend relying on them to influence Google search placement.

This is an interesting change, as Google previously stated that it did count likes and follower numbers as social signals and that these signals affected organic placement.

My recommendation is to continue to use Twitter and Facebook if it makes sense to do so. Some businesses have a rich community on Facebook and should not abandon their followers just because activity there no longer provides SEO value. But encouraging social media interaction now appears to be one more tactic that Google is clearly disavowing as a way to gain organic placement.

Workarounds for "Not Provided" Keyword Data in Analytics

In Google Analytics, almost all organic search activity is now returned with a "not provided" tag, masking the actual keywords used to find your content. If you are not advertising in Google AdWords, you may be totally in the dark about which keywords visitors are using to find your web pages.

If you are looking to improve your website's visibility and popularity, you may be struggling to choose keywords for a landing page, a topic for an e-newsletter, or even a topic for a blog post.

Here are a few tips on how you can discover keywords and opportunities to incorporate into your content creation program by thinking outside the norm.

1. Use Google.com's predictive text suggestions to identify top search terms and see if you are covered. Click into some of the searches you like and look carefully at the returned results. Do you see businesses like yours there, or just PDFs from colleges and government entities? Make sure the words in your final cut match your business based on the returned results.

2. Use YouTube.com's predictive text suggestions in the search field to identify possible keyword variations you may not have considered. If you are video-minded and see a possible keyword opportunity, consider making a video to fill that niche and rank for that topic.

3. Use the Google AdWords Keyword Planner tool to do a reality check on the competition you may face and to look for alternative keyword variations.

4. Review your Google Analytics keyword report beyond the first page of results. Although "not provided" dominates the top, further down the list you will still be able to see some of the actual keyword terms used to find you.

5. Review your Google Webmaster Tools account to see which search queries Google reports for your site. Although you may not see every term used to find you, you will be able to glean very specific insight into city names, keyword combinations, and top activity. If you feel you need professional help, we provide consulting services to help identify areas of opportunity.

More on Google and Duplicate Content

I felt that this video clarification from Google's head of webspam, Matt Cutts, was worth reviewing. Make sure to read my comments underneath the video.

You can view this video at YouTube at http://youtu.be/mQZY7EmjbMA.

The key takeaway from this video is that Google's algorithm groups duplicate content and shows only the top result.

So who exactly does this impact? The thousands of e-commerce sites that all share the same product descriptions and were previously able to compete with each other. Now that landscape has changed, and changed significantly.

Here’s my own quote found on the video page:

“So, I think I understand then that this leaves e-commerce stores that have the same exact boiler plate content as 20 to 100+ other sites sitting squarely in the duplicate content category meaning that Google is grouping the product pages across the web and only showing one in the search results due to relevancy. I would bet that the one that shows is the one with the highest CTR or most social shares. Since last year the algorithm changes that Google has made have significantly changed how e-commerce stores (that do not have unique products or unique content) operate; forcing most into AdWords to get site traffic.”