You can block pages using a robots.txt file, e.g.:
Code:
User-agent: *
Disallow: /download page section.php
Note that each directive goes on its own line, and the path must match the page's actual URL.
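One caveat: robots.txt only asks crawlers not to fetch the URL; a page can still show up in results if other sites link to it. To keep it out of results entirely, you can also put the noindex meta tag in the page's <head>. A minimal sketch (the title, filename, and body here are just placeholders):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<!-- Asks well-behaved search engines not to index this page or follow its links -->
<meta name="robots" content="noindex,nofollow">
<title>Download</title>
</head>
<body>
<!-- download links go here -->
</body>
</html>
```

For the meta tag to be read, the crawler has to be able to fetch the page, so if you use it, don't also block the page in robots.txt.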
I am just about ready to upload my website, and I wanted to check with you guys for any advice about things I should be sure not to overlook. I was a bit concerned that after running an HTML validator, it came back with 42 errors. I have been able to get it down to 22 but can't seem to figure out the rest. How important is that with regard to search engines?
Also, there is a download page section that customers are led to after their purchase is complete. How can I ensure that this "special" page is not indexed and listed in search results? I found this tag: <meta name="robots" content="noindex,nofollow"> but I'm not sure if that is the correct one to use.
Also, I've read that these two are important:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
Thanks for your advice/input
Thank you... I will do that...
Anyone else have any comments or a checklist of things I should make sure to do before FTPing the site?