Snitz Forums 2000
 Fine-tuning robots metatag

StephenD
Senior Member

Australia
1044 Posts

Posted - 06 November 2003 :  06:08:52
I'd like to change this tag so that robots only harvest a particular page or pages in my portal mod. This is what it currently looks like:
"<meta name=""robots"" content=""all"">" & vbNewline & _

How do I go about doing this, please?
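
A rough sketch of one way to do it, assuming the tag is written by the same ASP include on every portal page: set the value in a variable first, keyed on whatever identifies the page, then use that variable in the string. The ContentID check and the variable name below are only placeholders for however your portal mod identifies the page(s) you want harvested.

' Sketch only: pick the robots value per page.
' Replace "25" with the ContentID(s) you actually want indexed.
Dim strRobots
If Request.QueryString("ContentID") = "25" Then
    strRobots = "all"
Else
    strRobots = "noindex,nofollow"
End If

...and then in place of the hard-coded tag:

"<meta name=""robots"" content=""" & strRobots & """>" & vbNewline & _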

StephenD
Senior Member

Australia
1044 Posts

Posted - 06 November 2003 :  06:37:41
Or can I create a robots.txt file like this:

User-agent: *
Disallow: portal.asp?ContentID=25&CategoryID=12
Disallow: portal.asp?ContentID=23&CategoryID=12
etc etc...

Would this work? Is the syntax correct?
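
One thing to check on the syntax: a Disallow value is matched as a literal prefix of the requested URL, and that URL always begins with "/", so each entry needs a leading slash or it is unlikely to match anything. Something like the following should work with well-behaved crawlers, with the caveat that the match is literal, so the parameters have to appear in exactly the order your links generate them; for a handful of pages, a per-page noindex meta tag is often the more reliable route.

User-agent: *
Disallow: /portal.asp?ContentID=25&CategoryID=12
Disallow: /portal.asp?ContentID=23&CategoryID=12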

redbrad0
Advanced Member

USA
3725 Posts

Posted - 06 November 2003 :  12:00:38

To exclude all files except one
This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "docs", and leave the one file in the level above this directory: 
User-agent: *
Disallow: /~joe/docs/

Alternatively, you can explicitly list every page to be disallowed:
User-agent: *
Disallow: /~joe/private.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html

Brad
Oklahoma City Online Entertainment Guide
Oklahoma Event Tickets