Apr 20, 2012
tom

.htaccess user agents and website security

Question

I have been looking around and trying to work out the best way to protect a few websites of mine.

Apart from the obvious manual monitoring of the site logs and banning extreme or suspicious activity, I have seen many posts about banning user agents. Is this a good route to go down? And would it be a better idea, instead of banning known bad user agents, to allow only the common mainstream ones such as IE, Firefox, Safari and Chrome?

http://www.javascriptkit.com/howto/htaccess13.shtml
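For context, the sort of rule such guides suggest looks roughly like this (an untested sketch using mod_rewrite; the bot names are made-up examples):

```apache
# Deny requests whose User-Agent matches a blacklist (403 Forbidden)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
RewriteRule .* - [F,L]
```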

Asked by hozza

Answer

Not worth it.

The User Agent is sent by the client, and is trivial to forge. There’s a Firefox add-on that adds alternate UA options to the menu, for example. If the attacker is writing a script, he can specify whatever UA he wants.
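To illustrate the point, here is a minimal Python sketch of what a UA allowlist effectively does. The `is_allowed` function and the allowlist are hypothetical stand-ins for an .htaccess rule, not anyone's real filter:

```python
# Sketch: why filtering on User-Agent fails. This mirrors the logic of
# an .htaccess allowlist rule; names and UA strings are illustrative.
ALLOWED = ("MSIE", "Firefox", "Safari", "Chrome")

def is_allowed(user_agent: str) -> bool:
    """Allow only requests whose UA mentions a mainstream browser."""
    return any(token in user_agent for token in ALLOWED)

# An attacker's script just sends a browser-like UA and walks straight through:
forged = "Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0"
print(is_allowed(forged))            # True: indistinguishable from a real browser
print(is_allowed("my-scraper/1.0"))  # False: only honest clients get blocked
```

The filter can only ever punish clients that identify themselves truthfully, which is exactly the population you are least worried about.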

Answered by Andrew Schleifer

Related posts:

  1. Using .htaccess to exclude all user-agents for private site
  2. Serve different files for specific user agents using nginx
  3. Get list of user-agents from nginx log
  4. What to do about spoofed user agents? Scrapers pretending to be spiders
  5. How to use SNMP agents?

1 Comment

  • I heard SSL certificates are good for website security.
