Hi. I operate a few proxy sites where I park my domains on my primary domains. I looked up my "Site Diagnostics" and saw that a few of my parked domains appear under the "Blocked URL" column, with the reason given as "Robots.txt file".
But my robots.txt file is the one that comes with Glype, and all it has is:
User-agent: *

I wonder how my parked domains got blocked.
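For context, I understand the robots.txt bundled with Glype is meant to keep search engines from indexing proxied pages, so the full default file may contain more than just the user-agent line; something like the following (I'm not certain this matches every Glype release, so it's worth comparing against the actual file on the server):

```
User-agent: *
Disallow: /
```

A `Disallow: /` rule like that tells every crawler to skip the entire site, which would be enough to trigger a "Robots.txt file" block.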
Moreover, two of my addon domains (proxy sites) that operate individually (not parked) are also in the "Blocked URL" list, with the same reason, "Robots.txt file", even though they use that same robots.txt file.
Why are they blocked when I have listed them in the "Allowed Sites"? Is there any way I can unblock them?