Since I last wrote about the EU privacy rules that apply to American websites receiving traffic from EU nationals, much has happened.
Several clients have shared their thoughts with us on why they are suddenly making the change. A few are quoted below.
“I do feel lucky about not getting caught, but also want to be safe.”
“I’ve just had a lawyer call me and I feel like I need immediate action on the privacy updates as I don’t want to end up in court on a new matter.”
As for me, my perspective is that implementing GDPR compliance is neither expensive nor hard. I am risk averse, and I believe the US will eventually institute similar controls, so by updating our own websites now we will be ahead of the game.
Are you the type that buys a website, pays for hosting and then thinks you are done? Do you update your website or just think about it?
Your website should be a work in progress, not a set-it-and-forget-it purchase. Think of your website as a plant: for search engines to notice it, it needs regular care and feeding.
At least once a month your webmaster should look at your website traffic, check security if you are using a WordPress website, and look to build additional content to keep your site fresh.
But do you need someone on your payroll to be your webmaster? Most likely not, unless you are running a large ecommerce operation. Here’s where webmaster on-demand services may be a perfect fit for you, and we offer them with no long-term contracts or commitments.
Although we may not be the best technology match for every website, for many we are. Find out more about our webmaster services and give us a call to see if we can help you too.
Michael Wyszomierski of the Google Search Quality Team talks about errors that Google will report to you in the Webmaster Control Panel about the crawl and indexing of your website.
There are two types of crawl errors: site errors and URL-specific errors.
Site errors mean that Googlebot cannot access your website at all, and they are usually caused by a rule in your robots.txt file. Give these errors top priority for fast resolution.
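If you suspect a robots.txt rule is shutting Googlebot out, you can test a URL against your rules with Python’s standard urllib.robotparser module. This is a minimal sketch; the rule and the example.com URLs are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks every crawler from /private/
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot is blocked from the disallowed directory...
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
# ...but free to crawl everything else.
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))         # True
```

If a page you want indexed shows up as blocked here, remove or narrow the matching Disallow rule.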
The most common URL error is a “not found” (404) error. It means a page is missing from your site or its URL changed; updating your sitemap.xml file to remove the stale URL will stop these error messages from being triggered.
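When cleaning up “not found” errors, it helps to list exactly which URLs your sitemap.xml still advertises, so you can compare them against the errors being reported. A minimal sketch using Python’s standard xml.etree parser; the sitemap content here is hypothetical — in practice you would read your real file.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap.xml content for illustration
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page/</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap)

# Every URL the sitemap tells search engines to crawl
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(urls)
```

Any entry in this list that matches a reported “not found” URL is a candidate for deletion from the sitemap (or for a redirect to the page’s new address).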
Having some URL-specific errors on your site is completely natural and nothing to worry about, but best practice is to review and clean them up periodically.
Site errors are the more serious kind. Although one can be caused by a temporary server glitch or network error, they should be investigated to make sure your site is not blocking the search engine robots’ crawl. You may even need to contact your web host if server errors happen frequently.
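One simple way to triage errors like these is by HTTP status code: 5xx codes point at your server or host, while 404/410 codes point at missing pages. A rough sketch of that triage logic — the categories and wording here are my own, not Google’s:

```python
def triage_status(code: int) -> str:
    """Map an HTTP status code to a suggested follow-up action."""
    if 500 <= code < 600:
        return "server error - contact your web host if frequent"
    if code in (404, 410):
        return "not found - fix the link or update sitemap.xml"
    if 400 <= code < 500:
        return "other client error - review the URL"
    return "ok - no action needed"

# Triage a few sample status codes
for code in (200, 404, 503):
    print(code, triage_status(code))
```

Running your error report through a check like this makes it easy to separate the “call the host” problems from routine sitemap cleanup.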