The rise of bots, spammers, crack attacks and libwww-perl

November 2, 2006 · 24 comments · Last updated April 9, 2007


libwww-perl (LWP) is a fine WWW client/server library for Perl. Unfortunately, this library is used by many script kiddies, crackers, and spam bots.

Identifying bot traffic

Following is a typical example you will find in your Apache or lighttpd access.log file:

$ grep 'libwww-perl' access.log


$ grep 'libwww-perl' /var/log/lighttpd/access.log

Output:

- [23/Oct/2006:22:24:37 +0000] "GET /wamp_dir/setup/yesno.phtml?no_url= HTTP/1.1" 200 72672 "-" "libwww-perl/5.76"

So someone is trying to attack your host and exploit a security hole to install a backdoor. yesno.phtml is a poorly written application that can run or include PHP code (list.txt) from a remote server. This code installs a Perl-based backdoor in /tmp or /dev/shm and notifies an IRC server or the bot master that the server is ready to attack other computers. The backdoor can then flood or DDoS other victims' servers (it will also cost you tons of bandwidth). Usually the attacker hides behind zombie machines. Blocking by user agent can help, and in some cases the problem can be dropped altogether.

You will also notice libwww-perl/5.76 as the browser name (i.e. the user agent). To avoid such attacks:
=> Block the libwww-perl user agent
=> Run the web server in a chrooted jail

How to block libwww-perl under Lighttpd web server?

Open lighttpd.conf file:
# vi /etc/lighttpd/lighttpd.conf
Append the following lines to the main server or virtual host section:
$HTTP["useragent"] =~ "libwww-perl" {
  url.access-deny = ( "" )
}

Save and close the file. Restart the lighttpd:
# /etc/init.d/lighttpd restart
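The same conditional can be widened to deny several abusive agents in one rule. A sketch, where the extra agent names are illustrative examples rather than part of the article:

```
# Deny several abusive user agents at once (agent list is illustrative)
$HTTP["useragent"] =~ "(libwww-perl|EmailSiphon|EmailWolf)" {
  url.access-deny = ( "" )
}
```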

How to block libwww-perl under Apache web server?

Use mod_setenvif rules in a .htaccess file to block the libwww-perl user agent. Open your .htaccess file and add rules as follows:
SetEnvIfNoCase User-Agent "^libwww-perl" block_bad_bots
Order Allow,Deny
Allow from all
Deny from env=block_bad_bots
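The directives above amount to a case-insensitive, anchored match on the User-Agent header followed by a 403. A minimal Python sketch of that matching logic (the function name and pattern list are illustrative, not part of Apache):

```python
import re

# Pattern mirroring the SetEnvIfNoCase rule above: case-insensitive,
# anchored at the start of the User-Agent string.
BLOCKED_AGENTS = [re.compile(r"^libwww-perl", re.IGNORECASE)]

def status_for(user_agent):
    """Return the HTTP status the blocking rule would produce."""
    if any(p.search(user_agent) for p in BLOCKED_AGENTS):
        return 403  # Deny from env=block_bad_bots
    return 200      # request passes through

print(status_for("libwww-perl/5.76"))  # 403
print(status_for("Mozilla/5.0"))       # 200
```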

How do I verify that User-Agent libwww-perl is blocked?

Download the following Perl script to your own workstation and replace the empty URL with your site name:
$req = HTTP::Request->new(GET => '');
Save the script, make it executable, and run it:
$ chmod +x
$ ./


Error: 403 Forbidden

You should see a 403 Forbidden error, as your user agent is blocked by the server configuration.
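If you do not have Perl handy, the same check can be sketched with Python's standard urllib; example.com is a placeholder for your own site, and against a server configured as above urlopen would raise an HTTP 403 error:

```python
import urllib.request

# Placeholder URL -- replace with your own site.
url = "http://example.com/"

# Impersonate the libwww-perl user agent, as the Perl test script does.
req = urllib.request.Request(url, headers={"User-Agent": "libwww-perl/5.76"})

print(req.get_header("User-agent"))  # libwww-perl/5.76
# Against a server that blocks this agent, the next line raises
# urllib.error.HTTPError: HTTP Error 403: Forbidden
# urllib.request.urlopen(req)
```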

Please note that blocking by user agent can help, but spammers can spoof user agents. My personal experience shows that blocking libwww-perl saves bandwidth and drops potential threats by 50-80%.

Another highly recommended solution is to run the web server in a chrooted jail. In a chrooted jail the attacker cannot install a backdoor, as a shell and utilities such as wget are not available to download the Perl code. I also recommend blocking all outgoing HTTP/FTP requests from your web server using iptables, or using a hardware firewall such as a Cisco ASA.
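For the iptables suggestion, a hedged sketch of outgoing-request blocking; the www-data user and the port list are assumptions, so adjust them for your distribution and services:

```shell
# Reject outgoing FTP/HTTP/HTTPS connections initiated by the web server
# user itself, so a compromised CGI script cannot fetch remote code.
# "www-data" is an assumption -- Apache runs as "apache" on some distros.
iptables -A OUTPUT -p tcp -m owner --uid-owner www-data \
         -m multiport --dports 21,80,443 -j REJECT
```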

The final, extreme solution is to put the entire root file system on read-only media such as a CD-ROM (or use a live CD). No attacker can compromise your web server if it serves pages from read-only media (except via a DoS/DDoS attack).

What do you think? How do you block such attacks? Please share your nifty technique with us.


Comments

1 Randal L. Schwartz November 4, 2006 at 2:32 pm

Blocking LWP::UserAgent by agent name is like painting a lock on your door and saying it’s secure. Dumb. Really dumb. Not worth the time, and surprised you suggested it.


2 Jesus of Anonymous June 15, 2011 at 10:39 pm

Personally, I would have a little fun with the script kiddies, seeing as they mostly use automated scripts they downloaded from IRC. Simply redirect the attack paths you can see in your logs (you will know what they are looking for) over to the FBI. You can bet they will soon stop attacking your site for fear of being traced. No one wants the FBI to track and jail them. If they are stupid enough to run an auto-attack bot, it will be the FBI site that receives the attack vector, not you, and the FBI will trace your hacker. Simple, lol.


3 ElSecurityGURU September 2, 2011 at 7:40 am

That is funny and probably a good way to stop them, lol. Kudos!


4 nixCraft November 4, 2006 at 4:00 pm


Blocking user agents can help, but spammers spoof user agents. It is just a suggested solution; there are tons of dumb spammers too who don't change their user agent, so I do block them :)

Real solution is chrooted jail.

Appreciate your post.


5 DarkMindZ April 29, 2008 at 6:04 pm

Good one. I wrote up a similar, easier technique here:

Blocking bots and scanners with htaccess


6 mumuri May 21, 2008 at 7:20 pm

There is another solution: if this software follows all the links on a page, you can just put an invisible link on the page, and when that link is called, you ban the IP of the bot.


7 Terrorkarotte March 4, 2009 at 10:03 am

Thanks for the tutorial. After I made the changes, the kids changed their identification and now they can try their attacks as before.
When I was analysing what kind of code injection attempt they tried this time, I found this page:
It is a PHP application with MySQL support that allows me to redirect and block the script kiddies when they try an injection attempt. The script does not look at the user agent but at the injections themselves. Maybe this script is worth a try for you too.


8 Lawrence April 9, 2009 at 6:02 pm

Three years on since you wrote this, and it's still a good suggestion.

Randal Schwartz might be Mr. Perl, but blocking libwww-perl has worked wonders for me in reducing unnecessary bandwidth from bots.

Checking my logs, I have zero legitimate user agents using libwww-perl, and *100%* hacker attempts using it (or other user agents). That's good enough for me to block that agent.

I’m doing it in apache2.conf as follows:

SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
SetEnvIfNoCase User-Agent "^libwww-perl" bad_bot
    Order allow,deny
    Allow from all
    Deny from env=bad_bot


9 Bruno June 2, 2009 at 7:16 am

I tried to use your tutorial today (using the .htaccess file) but it didn't work.
Then I replaced the first line with:
SetEnvIfNoCase User-Agent "^.*libwww-perl" block_bad_bots
and it's now working.
Hope it can help you to improve this great tip.


10 About Web July 13, 2011 at 7:26 am

Thanks Vivek, finally got this working solution to block libwww-perl bot access.


11 techno February 22, 2012 at 3:00 pm

Thanks Vivek, this article was helpful for me.


12 MetaTrader 4 MQL Services April 11, 2012 at 11:53 pm

Thank you very much!

We had quite some bandwidth stolen due to this annoying perl bot.

Thanks to you we now got it blocked at the htaccess.

Keep up the good work!


13 Sam May 8, 2012 at 6:56 am

Sometimes it's all quite confusing… especially for newbies…


14 Abhimanyu June 12, 2012 at 11:47 am

Hi Website Owner,

I have a website built as a Web 2.0 site, so please give me some ideas to block spammers via .htaccess or libwww-perl access. I have already tried all of the code above; please help me. Right now I am just blocking the IPs one by one, and every day I find many new spammer IPs.

Hope you will consider my request and respond soon.



15 Poceni letalske karte August 25, 2012 at 7:02 pm

Thank you for this useful tip. It has been handy for me.
Kind regards!


16 poptropica August 30, 2012 at 4:20 pm

Also try the rules below, placed in your .htaccess file:

RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
RewriteCond %{QUERY_STRING} ^(.*)=http [NC]
RewriteRule ^(.*)$ - [F,L]


17 Paul Goh October 3, 2012 at 11:56 am

How do I put the code in the .htaccess file?


18 kudos May 1, 2013 at 10:48 am

Thanks for your tutorial. However, it's not working for me. I inserted the rules in my .htaccess.
I then used your Perl script and changed the domain to mine. I'm not familiar with Perl at all, but I'm using the Padre editor and I simply clicked the run script button.
The server did not block me; instead I got to see the HTML of my website.

Any ideas?


19 آموزش سئو November 9, 2013 at 4:52 am

Thank you for htaccess code


20 Michael Fever March 8, 2014 at 1:35 am

The script blocks the user agent string which is sent by the browser. It’s not something you can easily replicate.


21 Steve Rogers March 18, 2014 at 4:03 pm

Thank you very much for the tips! I saved a lot of bandwidth from these bots. By the way, will this affect my SEO ranking in any way?


22 johnny June 16, 2014 at 6:42 pm

Can someone explain to me how to block libwww-perl on Blogspot?
When I check my blog, libwww-perl is on.


23 Mayyank Gupta December 29, 2014 at 7:07 am

Johnny, use this code:
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* - [F,L]


24 Johnny Evans February 20, 2015 at 12:43 pm

What about blocking it on IIS web servers?


