Google to Re-Rank Overly Optimized Websites

Matt Cutts from Google dropped a bomb this past week. According to Cutts, Google’s upcoming algorithm update will re-rank overly optimized websites in its organic search placement in an effort to “level the playing field”.

Here’s what Matt Cutts from Google said in another exchange on the topic:

“We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect.” Read the full article.

What exactly does this mean for website owners?

Well, if you have aggressively tuned your website for keyword density or built inbound text links with very specific anchor text, your website may be hit with a Google search penalty filter in the next three months as the new algorithm rolls out.

What should you do now?

We recommend a careful review of your home page now, before your website is dropped or pushed to the 100th page of the search results, to see exactly what may need to change to be more Google-friendly under this new content-focused push. If you have built heavy keyword density around some terms on your home page, now is the time to remove some of those usages and make the content more readable.

Does an XML Site Map Help or Hurt Organic Placement?

If you can’t include an XML site map on your website for some reason, will this hurt you with search engines? My unequivocal answer is NO.

Although we do recommend creating and registering an XML site map with both Google and Bing (by using their Webmaster Control Panels), not adding one is not a serious blow to your potential organic placement on search engines.
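For reference, an XML site map is just a short file listing your page URLs in the sitemaps.org format. A minimal sketch looks like the following; the URLs and date here are placeholders, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal site map in the sitemaps.org format; URLs are examples only -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-04-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
  </url>
</urlset>
```

You would save this as sitemap.xml at the root of your website and submit its URL through each search engine’s Webmaster Control Panel.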

Search engine spiders do not need your XML site map to spider your website. By their very nature they will spider your home page and then follow the text links on your home page to auto-discover the other pages in your website. If your website navigation is not text-based, or is encapsulated in images or Flash, then text links to your key inside sections, as well as your own HTML-based site map, should be provided on your home page. These links typically appear in the footer. Doing so ensures that you give search engine spiders a way to travel and discover the pages in your website.
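As an illustration, the footer text links described above might be sketched like this; the page names and paths are hypothetical:

```html
<!-- Plain-text footer links give spiders a crawlable path into the site -->
<div id="footer">
  <a href="/services/">Services</a> |
  <a href="/about/">About Us</a> |
  <a href="/sitemap.html">Site Map</a>
</div>
```

Because these are ordinary HTML text links rather than images or Flash, spiders can follow them even when the main navigation is not crawlable.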

So if you can’t create an XML site map for your website for some reason, all is not lost. Just make sure you have created text links in your home page content that point deep inside your website.

Google Announces Another Algorithm Update Biased Against Ads

Just this past Friday, Matt Cutts, a key engineer at Google, announced an algorithm update for Google that is biased against ads. Although this update doesn’t have a name yet (though it will soon), it is a filter to remove websites from Google’s index that are top-heavy with advertising.

The Google Webmaster Central blog spoke in-depth about the algorithm. You can read the full article from this link. Here’s what Google says about the change:

“In our ongoing effort to help you find more high-quality websites in search results, today we’re launching an algorithmic change that looks at the layout of a web page and the amount of content you see on the page once you click on a result.”

“We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.”

Google again is reasserting that content is key to relevancy, and thus to organic placement. With too much “stuff” above the fold (what is visible on your screen before you have to scroll), you will now be dinged by Google in the search results. Take heed: you don’t want your website or blogsite, even one that is simply heavy with images, to be caught in the filter.

Can You Even Get SEO Juice From a Subdomain?

For the best search engine benefit, which configuration is better for your blog: a subdomain or a subdirectory? First, it is important to clearly identify the difference between a subdirectory and a subdomain.

Subdirectory example: http://www.mccordweb.com/weblogs/

Subdomain example: http://blog.mccordweb.com
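The distinction is easy to confirm programmatically. Here is a quick sketch using Python’s standard urllib.parse module with the two example URLs above:

```python
from urllib.parse import urlparse

# The two example configurations from above
subdirectory = urlparse("http://www.mccordweb.com/weblogs/")
subdomain = urlparse("http://blog.mccordweb.com")

# A subdirectory lives on the same host as the parent site
print(subdirectory.hostname)  # www.mccordweb.com
print(subdirectory.path)      # /weblogs/

# A subdomain is a different hostname entirely, which is why
# search engines can treat it as a separate site
print(subdomain.hostname)     # blog.mccordweb.com
print(subdomain.hostname == subdirectory.hostname)  # False
```

The subdirectory shares the parent site’s hostname, while the subdomain has a hostname of its own; that difference in hostname is exactly what allows search engines to treat the two configurations differently.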

How search engines handle the two is entirely different, so the setup determines the link juice and search engine capital that a blogsite will pass to the parent website (in the above example, the parent site is mccordweb.com).

Here is a very concise explanation of how search engines index, spider, and count subdomains:

“… Google considers sub domains separate from their parent domains: sub.yoursite.com is considered a different site altogether compared to yoursite.com when it comes to search engine authority.” You can read the full and very interesting article here.

Although you can track subdirectories and subdomains using Google Analytics with special code inserts, how search engines weight and evaluate content that resides off-domain, as on a subdomain, is crucial to your organic placement strategy.

It is clear that I am not alone in finding that a subdomain is considered as if it were a separate domain by the search engines. You can find out more by following this thread on the Webmaster World forum.

“It [a subdomain] is treated much more like an independent domain in many respects – for example, if urls from both the root domain and a subdomain show up on the same page of search results, they do not cluster together.”

Matt Cutts says this on the issue:

“My personal preference on subdomains vs. subdirectories is that I usually prefer the convenience of subdirectories for most of my content. A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example. If you’re a newer webmaster or SEO, I’d recommend using subdirectories until you start to feel pretty confident with the architecture of your site.”

The bottom line is that a subdomain is simply a way to mask the URL of an off-domain site or blog location and give the APPEARANCE that it is installed within the server where the parent site resides. Search engines consider the content separate and will weight and index it separately from the parent website. What is crucial to understand is that any links that point to the subdomain blog or website do not flow through and add capital to the parent domain.

For best use of blogging and mini-websites, it is still by far best to install them in subdirectories and not subdomains.