Yahoo! Well, really, Googlehoo! The Wicked Witch is dead. Google has just announced that it has killed the Supplemental Index.
Google’s Supplemental Index was affectionately known as “Google Hell,” and if your site landed there, there was simply no getting out. Many widely circulated tips on the Web claimed to show the way out, but the reality was that short of changing the content on every page, or even renaming your pages and starting all over, you were stuck.
Now, Google has announced in its Search Blog that it has killed the Supplemental Index. Around July of this past year, Google dropped the “Supplemental Index” descriptor that appeared next to search results, which had been the bane of professional webmasters everywhere. After that change, we could only guess from poor website performance whether a site had landed in “Google Hell”.
Google has now stated:
“…From a user perspective, this means that you’ll be seeing more relevant documents and a much deeper slice of the web, especially for non-English queries. For webmasters, this means that good-quality pages that were less visible in our index are more likely to come up for queries.”
Google has stated that this change is due to new technology and what it unabashedly calls “amazing technical feats,” allowing a full search of the entire index for every search query.
So… ding, dong, the Wicked Witch is finally dead! Good Riddance!
This is Google’s new brainchild, currently in testing – knol – or as they call it, a “unit of knowledge.” For now, the rollout is by invitation only. The bottom line is that Google wants authors who are experts on specific topics to write a piece and then have a community, as with Wikipedia, interact with the information.
It creates a mini web page for each topic. Clearly these pages will be searchable and will give authors of good topics excellent exposure and even a portion of the ad proceeds. You can click my post title to read the full information about this new interactive tool on the Google Search blog.
Here is an image of a Knol on insomnia from the Google site. Looks like a very slick version of Wikipedia.
My personal take is that this could be very good for business. Instead of posting articles on various sites as link bait, why not have one really great page that Google will create and that all you have to do is moderate? Also, as this is Google, remember that it will certainly be included in the SERPs in a big way. This could give experts and consultants an extraordinary degree of exposure in their industry.
I follow SitePro News, a super e-newsletter. You can get the feed by clicking our post title. In the most recent issue, a very savvy author discussed the algorithm change on Google that started this past Thursday.
In the article, he mentions that Google has created a trust factor that preferentially ranks sites with older domains above new ones. He also mentions changes in the weighting of inbound links and of the PageRank shown on the green bar of the Google Toolbar and within the Google Sitemap control panel.
Clearly, over the next several days we will continue to see a shake-up in the index. Are all of these changes good things? Well, scrapping the PageRank indicator in the Google Toolbar is a good thing. PageRank can be gamed by search engine optimizers, so getting rid of it, I personally feel, is a good thing. It appears that TrustRank may be the next big factor, and it may be a better indicator of a site’s real value and therefore a strong indicator of good organic search placement.
Although this issue is not online yet on the SitePro News site, when it arrives in the next several days, it is a must-read. The title is “Google Algorithm Update Analysis,” written by Dave Davies. You may not agree with everything he has to say, but if you have been following the various Google patent disclosures over the last several months to a year, what he says makes sense based on the technology that Google has been actively patenting.
From my viewpoint, all of this information just reaffirms that excellent and unique content on your website is important. If you take the time to create and build a great site, the work should not stop when you launch: new content, a blog, or an e-newsletter published on a regular basis builds fresh content and authority over time.
There is no quick fix for great organic placement on search engines, but once it is achieved, you have hit a tipping point, and your business and market presence increase dramatically because of it. So working specifically to improve organic placement is crucial for every growing business.
A reader at my other blog, Web-World Watch, left this link http://www.copyscape.com/ on a post about Google dinging sites for showing duplicate content.
I entered my own blog address into this tool and found sites that had snatched my blog content verbatim without supplying a link back or even identifying me as the author. In fact, they had passed the content off as their own and had selected some of my highest-traffic posts!
I have notified them of copyright infringement! You should check your own content to see if you have a similar problem. If you are like me, you don’t mind if others quote you, show a paragraph or two of your post with a link back to the full content, or contact you for approval. But to simply snatch content, provide no link back, and pass it off as their own intellectual property? Very bad form!
The duplicate-content issue that Google is particularly targeting in one of its most recent patent disclosures is exactly this case in point: who should get the credit for duplicate content? Google is developing a way to identify the author of content in cases like this. I would imagine it will revolve around the initial post date recorded by the web server and how well the content matches the other content and writing style on the site. Eventually I expect to see a trust certification that site owners can embed on their pages to tag their content for Google.
In the meantime, if you are scraping someone else’s content from their blog, please stop! It’s time to create your own. And if you aren’t scraping, check at Copyscape.com to see whether someone is scraping you.
In a previous post, I noted that Google is really cracking down on duplicate content. All site owners should work to clean up their sites and make sure that duplicate pages, like printer-friendly versions, are blocked from spidering using the robots.txt file. This will keep Google from dinging your site for duplicate content.
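As a quick sketch, here is what such a robots.txt entry might look like. The /print/ directory is purely a hypothetical example – substitute whatever path your own printer-friendly pages actually live under:

```text
# robots.txt – placed at the root of your domain
# Block all well-behaved spiders from the printer-friendly copies
User-agent: *
Disallow: /print/
```

Note that robots.txt only asks compliant crawlers to stay out; it is a request, not an access control.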
I received a comment from a reader pointing to a site where you can check whether someone has snatched or duplicated your content. Click my post title to visit CopyScape.com.
When I ran my own site through the tool, I found another site that had scraped several blog posts verbatim from mine and passed the content off as its own. Hmm, that’s a copyright violation. I have notified the sites! I do not mind if you mention my content or show one or two paragraphs, but you must link back to the full article on my site. To simply snatch my content and claim it as your intellectual property is wrong.
This is what the Google duplicate-content algorithm change is all about: identifying the legitimate owner and blocking other sites that show this content from the index. In some cases Google is identifying the rightful owner by post date and by authority. I believe that in the coming year, or even the coming months, we will see a digital authority head tag tied to domains that Google will pick up to verify the site owner.
In the meantime, watch your site for duplicate content and check to see who has scraped it. And if you have scraped my content, please remove it or link back to my site and give me credit.
Click our post title to read this interesting article on how to keep your blog out of Google’s Supplemental Index. The writer offers an interesting tip on updating your .htaccess file to redirect all URLs to the www version. However, you can only consider doing this if you publish your blog via FTP to a server you control. If your blog is hosted at Blogspot, you don’t have access to the server.
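For those on an Apache server with mod_rewrite enabled, a common recipe for this kind of www redirect looks something like the following. This is a generic sketch (example.com is a placeholder for your own domain), not necessarily the exact rule the article’s author uses:

```apache
# .htaccess – redirect non-www requests to the www version
RewriteEngine On
# If the host is the bare domain (case-insensitive match)...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send a permanent (301) redirect to the www hostname
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so the www version is what ends up in the index.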