Snitz Forums 2000

 Guest user on line more than 600 mins

cobrachen
Starting Member

48 Posts

Posted - 14 April 2006 :  23:37:15
Guest #1 | Unknown | 220.xx.xx.xx | Viewing Topic: XXXXXX | 04/15/2006 01:12:57 | 630 Minutes

This guest has been online for more than 600 minutes. I tried an IP ban, but the guest is still switching from one topic to another.

How is this possible, and how can I kick this one off?

Thank you very much.

weeweeslap
Senior Member

USA
1077 Posts

Posted - 15 April 2006 :  04:41:33
It's a robot. Make a robots.txt file and disallow that bot.

coaster crazy

cobrachen
Starting Member

48 Posts

Posted - 15 April 2006 :  14:44:43
quote:
Originally posted by weeweeslap

It's a robot. Make a robots.txt file and disallow that bot.



Would you elaborate? How could a text file stop it?

Podge
Support Moderator

Ireland
3775 Posts

Posted - 15 April 2006 :  15:33:02
http://www.robotstxt.org/wc/robots.html

It's a standard exclusion protocol that search engine spiders (sometimes) comply with.

The pages listed as disallowed in your robots.txt file will usually be ignored by search engine spiders.

If you know the spider's user-agent, you can tell it that all pages are off limits.

See the link above for specific instructions.
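For example, a minimal robots.txt placed in the web root might look like the following. The user-agent name "SomeBot" here is a placeholder, since the thread never identifies the actual bot; you would substitute whatever name the crawler reports:

```
# Block the hypothetical "SomeBot" crawler from the entire site
User-agent: SomeBot
Disallow: /

# Or keep all compliant crawlers out of the forum directory only
User-agent: *
Disallow: /forum/
```

Keep in mind this only works for bots that honor the protocol; a crawler that ignores robots.txt has to be blocked at the server level instead.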

Podge.

The Hunger Site - Click to donate free food | My Blog | Snitz 3.4.05 AutoInstall (Beta!)

My Mods: CAPTCHA Mod | GateKeeper Mod
Tutorial: Enable subscriptions on your board

Warning: The post above or below may contain nuts.

Edited by - Podge on 15 April 2006 15:33:54

cobrachen
Starting Member

48 Posts

Posted - 15 April 2006 :  21:54:01
quote:
Originally posted by Podge

http://www.robotstxt.org/wc/robots.html

It's a standard exclusion protocol that search engine spiders (sometimes) comply with.

The pages listed in your robots.txt file will usually be ignored by search engine spiders.

If you know the user-agent of the spider you can tell it that all pages are off limits to it.

See the link above for specific instructions.



Thank you very much for this information.

Snitz Forums 2000 © 2000-2021 Snitz™ Communications
This page was generated in 0.35 seconds. Powered By: Snitz Forums 2000 Version 3.4.07