Using robots.txt when Multihoming Web Crossing
Created On: 19 Jan 2006 11:33 am Last Edited: 19 Jan 2006 11:35 am
Question

How can I have unique robots.txt files for subsites on my multihomed Web Crossing server?
Answer
Easy! The concept is exactly the same as using favicon.ico for multihomed sites.
Using a robots.txt file is easy when you serve only one domain name: you place it at the very top level of your site.
However, if your site is multihomed (serving more than one domain name out of subfolders) and you want a different robots.txt file for each site, it isn't readily apparent how to set this up.
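For reference, a robots.txt file is just a plain text file of crawler directives. A minimal sketch is shown below; the disallowed /private/ path is purely a placeholder, so adjust the rules to whatever each site actually needs.
User-agent: *
Disallow: /private/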
First, after you have created your new robots.txt, put it in the very top level of the folder serving the domain in question. Then go to your Web Services Control Panel and add an entry at the top of the Web Mapping form field that looks like the following. Do this for each of the domains your site serves that need their own robots.txt files.
The following example assumes that the domain "www.yourDomainName.com" is already mapped and being served out of the top-level folder with the path "yourdomainfolder". Adjust accordingly. What you are doing here is simply remapping the request for robots.txt to the subfolder that serves the site.
Example
Note: word wrap may make this appear as two lines. It is a single line, with one space between the URL and the webx: path.
http://www.yourDomainName.com/robots.txt webx:/yourdomainfolder/robots.txt
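If the server hosts additional domains, add one such line per domain. For instance, assuming a second domain "www.otherdomain.com" served out of a folder named "otherdomainfolder" (both names are placeholders for this sketch), the extra entry would be:
http://www.otherdomain.com/robots.txt webx:/otherdomainfolder/robots.txt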