Page 2 of 2 FirstFirst 12
Results 11 to 14 of 14

Thread: Google, Yahoo & Microsoft Unite On Canonical Tag To Reduce Duplicate Content Clutter

  1. #11
    Quote Originally Posted by Loko View Post
    How come I can access my site with and without 'www'?
    You just need to set a 301 redirect via .htaccess to your preferred domain, either 'www' or non-'www'; that way all requests are sent to the one version you want available.

    redirect all to 'www':
    RewriteEngine On
    RewriteBase /
    RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.domain.com/$1 [L,R=301]

    redirect all to non-'www':
    RewriteEngine On
    RewriteBase /
    RewriteCond %{HTTP_HOST} !^domain\.com$ [NC]
    RewriteRule ^(.*)$ http://domain.com/$1 [L,R=301]
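    In plain terms, each rule set above checks the Host header and, if it isn't the preferred hostname, issues a 301 to the same path on the canonical host. A rough Python sketch of that logic (domain.com is a placeholder, as in the rules above; this only illustrates the mapping, it is not what Apache runs):

```python
def canonical_redirect(host, path, prefer_www=True):
    """Return the 301 redirect target for a request, or None if the
    host is already canonical.

    Mirrors the mod_rewrite rules: the RewriteCond tests the Host
    header, and the RewriteRule rewrites the same path onto the
    preferred host.
    """
    canonical = "www.domain.com" if prefer_www else "domain.com"
    if host.lower() == canonical:   # [NC] makes the host check case-insensitive
        return None                 # condition fails, so no redirect is issued
    return f"http://{canonical}/{path.lstrip('/')}"

# Requests to the non-preferred host are sent to the canonical one:
print(canonical_redirect("domain.com", "/about.html"))      # redirect target
print(canonical_redirect("www.domain.com", "/about.html"))  # None (already canonical)
```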

  2. #12
    Quote Originally Posted by Loko View Post
    Thanks for your reply. Your post makes sense and I understand what you are saying. This has also been explained by Matt Cutts.

    Perhaps I should formulate my question a bit differently:
    How come I can access my site with and without 'www'? I don't have duplicate pages in the directories or folders on the server, at least not that I can see. I don't see a page where I can insert a canonical tag.
    Or is it something I have to set at the hosting provider?
    The major search engines have gotten fairly good about sorting out the canonicalization issue, so you might not see any inconsistency or duplicates in an index. What you may see is a mix of URLs in Google's index with and without the www subdomain. If you see both, pick the one that is the most common and apply the 301 redirect routine that dtkguy posted. That is the common fix for standardizing the subdomain issue.

    If the 301 redirect is applied correctly, you should see the subdomain either be added or removed (depending upon your pre-set standard) when the opposite version is used. Try entering the address without the www subdomain and you will see what I mean.

    Another canonicalization issue that I didn't mention has to do with Microsoft servers. File names and paths are not case sensitive on Microsoft servers (they are case sensitive on Unix and Linux servers), which allows sloppy programmers to mix upper and lower case characters across hyperlinks to the same file. In other words, ContactUs.html, contactus.html and CONTACTUS.HTML are different file names on Unix and Linux, but represent the same file on a Microsoft server. Google and Yahoo run on Unix and Linux, so if inconsistent capitalization is used in hyperlinks on a Microsoft server, G and Y can sometimes see them as different URLs and create another form of duplicate content. If you are on a Microsoft server, it is important to use the exact same versions of paths and file names as they exist on the server.
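    To illustrate the point about case: crawlers running on case-sensitive systems treat differently-cased links as distinct URLs, even though a Microsoft server serves the same file for all of them. A small Python sketch using the file names from the paragraph above:

```python
# Three hyperlink targets a sloppy page on a Microsoft/IIS server might mix.
links = ["ContactUs.html", "contactus.html", "CONTACTUS.HTML"]

# To a case-sensitive crawler these are three distinct URLs...
distinct = set(links)
print(len(distinct))  # 3

# ...but they all resolve to one file on a case-insensitive server.
resolved = {name.lower() for name in links}
print(len(resolved))  # 1
```

    This is why consistent casing in hyperlinks matters: every extra casing variant a crawler sees is a potential duplicate-content URL.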

  3. #13
    Thanks a lot for the help guys. I will make the suggested changes.


  4. #14
    Inserted slightly different code to make it work:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.domainname\.com$
    RewriteRule (.*) http://www.domainname.com/$1 [R=301,L]
    Thanks again.


