Perl and CGI scripts can run wild and eat up ALL system resources; this dangerous behavior can be controlled with three directives. Apache ships with three directives to place limits on the amount of CPU, memory, and number of processes the server can use.
i) RLimitCPU - restricts the CPU time (in seconds) each process may consume.
Example: RLimitCPU 10 20
ii) RLimitNPROC - restricts the number of processes that may run simultaneously.
Example: RLimitNPROC 3 5
iii) RLimitMEM - restricts the memory (in bytes) used by each process run on the server.
Example: RLimitMEM 200000 200000
You can use the above three directives in a vhost or in the main server configuration. The first value in each example is the soft (default) limit and the second is the hard (maximum) limit, which cannot be exceeded by any process. Here is a more practical and realistic example for a mass hosting server (open your httpd.conf file and add the following three directives):
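Per-vhost use can be sketched as follows; the domain name, paths, and limit values here are illustrative placeholders, not values from a real setup. Setting the soft value below the hard value lets a process raise its own limit, but never past the hard cap:

```apache
# Hypothetical virtual host -- example.com and the paths are placeholders.
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example.com

    # Soft limit of 60 CPU seconds, hard cap of 120 seconds.
    RLimitCPU 60 120
    # At most 10 processes by default, 20 at the absolute maximum.
    RLimitNPROC 10 20
    # Roughly 20 MB of memory per process (values are in bytes).
    RLimitMEM 20000000 20000000
</VirtualHost>
```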
A) Set a maximum of 100 CPU seconds per process, so a Perl script may run for no more than 100 seconds of CPU time; scripts exceeding this limit are stopped automatically by the system/Apache.
RLimitCPU 100 100
B) Set a maximum of 25 processes at any one time
RLimitNPROC 25 25
C) Allow 10 MB (10,000,000 bytes) of memory per process
RLimitMEM 10000000 10000000
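You can see the same kernel mechanism that backs RLimitCPU in action from a shell, without Apache: cap a subshell at 1 CPU second with ulimit and run a busy loop. When the limit is hit, the kernel sends SIGXCPU and the loop dies (this is a sketch of the mechanism, not Apache itself):

```shell
# Simulate the effect of RLimitCPU using the kernel's CPU-time limit:
# run a busy loop in a subshell capped at 1 CPU second.
(
  ulimit -t 1                      # cap the subshell at 1 CPU second
  i=0
  while :; do i=$((i+1)); done     # burn CPU until the kernel steps in
)
status=$?
echo "busy loop exit status: $status"   # non-zero: killed by the limit
```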
Once added to the httpd.conf file, restart the Apache process. Please note that you must experiment to see how low you can set these values for your setup. You can also use ulimit to get and set user limits; under Debian GNU/Linux, www-data is the right user for these limits. Read the man/help pages for ulimit and PAM configuration for more information. Please see the official Apache website for the RLimit directives.
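Limits set with ulimit last only for the current shell session; to make them persistent for the www-data user, PAM's limits module reads /etc/security/limits.conf. A hedged sketch with illustrative values (note that pam_limits applies to login sessions, so it may not affect a daemon started directly by init):

```
# /etc/security/limits.conf -- illustrative entries for www-data
# <domain>   <type>  <item>   <value>
www-data     soft    cpu      2        # CPU time, in minutes
www-data     hard    cpu      3
www-data     soft    nproc    25       # max number of processes
www-data     hard    nproc    30
```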