My firm is getting ready to get back into the business of web design with a new program of building responsive websites: sites that resize and reformat to fit the device on which they are viewed.
We'll be rolling out a new corporate website using new HTML5-compliant, responsive code in about two months. Ask us and we'll send you links to our test site so you can see what's in the works.
In the meantime, we are wait-listing clients who would like to use our highly specialized SEO web design services under the new program. We'd be glad to add you to the waiting list too, without any obligation, until you see samples and pricing. Just ask us to add you!
Connect with me online on Twitter | Facebook | LinkedIn | Our Blog | Google+
Google+ Community - AdWords Strategies | Google+ Community - Bing Ads Strategies
Structured Data Gives Google What It Wants
Google recommends that website owners start providing special code snippets in their HTML to assist it in sorting and categorizing their website data. This is called structured data, and it is usually written in a format known as microdata.
The format is not hard to understand or to implement, and Google considers its use important enough that it has made adding these code snippets fairly simple for website owners.
First, not all data on your website can be marked up as structured data. For now, Google only uses markup for products; local businesses (including address, phone, and other contact information); articles; software applications; reviews; and movies. Each year Google has added new categories as it expands the types of data it integrates into search results.
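As a minimal sketch of what this looks like for the local-business category, here is a schema.org microdata fragment (the business name, address, and phone number are hypothetical, invented for illustration):

```html
<!-- Hypothetical local business marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Web Design</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St.</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
  </div>
  Phone: <span itemprop="telephone">(555) 555-0123</span>
</div>
```

The `itemscope` and `itemtype` attributes declare what the block describes, and each `itemprop` labels a piece of data inside it, which is what lets Google sort the page's content into its categories.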
Here's an example of coding for a review using structured data called rich snippets:
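A minimal sketch of such review markup, using hReview/hCard microformat classes (the business, reviewer, rating, and date shown are hypothetical, invented for illustration):

```html
<!-- Hypothetical review marked up with the hReview microformat -->
<div class="hreview">
  <div class="item vcard">
    <span class="fn org">Joe's Diner</span>,
    <span class="adr">
      <span class="locality">Springfield</span>,
      <span class="region">IL</span>
    </span>
  </div>
  <h4 class="summary">Great food, friendly staff</h4>
  Rating: <span class="rating">4.5</span> out of 5.
  Reviewed by <span class="reviewer vcard"><span class="fn">Jane Smith</span></span>
  on <span class="dtreviewed" title="2013-06-01">June 1, 2013</span>.
</div>
```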
Notice that the review has special tags denoting rating, vcard, title, name, locality, and region? That is all part of sorting the data for Google in its approved, specific format. Google makes it pretty easy for website owners to start using structured data and has even provided some great online tools.
Here are a few resources for you to consider:
Google's Blog post on the Topic:
Structured Data Markup Helper:
Embedding Structured Data for Gmail:
Google Webmaster Tools Data Highlighter:
Structured Data Bread Crumb Snippet:
Google's Duplicate Content Penalty -- Is It Fiction?
I read with interest the SiteProNews article "Duplicate Content -- It's Time to Shatter the Myths" by Martin O'Neill, in which the author states:
"...feel free to use content from online sources but your long-term goal should be producing quality, original content and material that will serve your website and online presence in the months and years to come."
You'll want to read the full article yourself to understand the full breadth of the topic. I, however, with all due respect, disagree with the author's conclusions about duplicate content and Google rankings for sites that use it.
Case in point: just ask the owners of e-commerce stores whose product descriptions are shared by a large number of similar websites what has happened to their placement. Most such sites have been penalized, pushed so far back in the organic results that they now have to move into Google AdWords in order to have their websites found. But don't take it from me; listen to what Matt Cutts, the head of Google's webspam team, says about duplicate content in this video.
Are You Feeling Lucky and Willing to Gamble Your Website Placement?
The takeaway is that duplicate content is a placement factor, especially if you do not provide additional value or a unique point of view. You'd only have to sit in my office for a week to know that duplicate content is a huge problem for website owners, and that many are struggling to regain placement they lost because of its use on their websites.
The author gives the example of a small, limited test: two websites he launched with the same content, both of which Google had placed within four weeks. I want to point out that this is a very limited test, and that he reviewed placement at the four-week mark only. We've found that after launch a site gets a temporary boost in organic placement; after six weeks or so, the placement drops significantly, to where it will typically stay in the SERPs. Small overlaps of content may not have a huge impact, as his own test shows, but one should not conclude from such a limited review that duplicate content is not penalized.
What I have found is that if you scrape a site, build your content from article marketing sites (where others pull and use the same articles), or use content that is widely duplicated by others (such as vendor descriptions), you will need a nice-sized AdWords budget, because you simply will not be able to place organically on Google using these tactics.