Get more juice out of a multiprocessor system with xjobs

October 10, 2006 · last updated October 10, 2006

I have already written about how to run commands on multiple Linux or UNIX servers. Now Joe 'Zonker' Brockmeier shows us how to use the xjobs utility to run multiple jobs in parallel on a multiprocessor machine :)

From the article:
Ever feel like you're not getting the most out of your multiprocessor machine? The xjobs utility allows you to schedule several processes to run simultaneously to make the most of your system's resources.

Xjobs takes a list of arguments from standard input and passes them to a utility, or takes a list of commands from a script, and then runs the jobs in parallel. If you have a multiprocessor machine, xjobs will automatically run one job per processor by default. For instance, on a dual-CPU machine, if you run ls -1 *gz | xjobs gunzip, xjobs will gunzip two files at a time by default. If you run the same command on a quad-CPU machine, it will gunzip four files at a time by default, until it runs out of files to process.
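If the per-CPU default is not what you want, you can cap the job count yourself. A minimal sketch (the -j option for limiting the number of parallel jobs is documented in the xjobs man page; check your version):

# gunzip at most two archives at a time, regardless of CPU count
ls -1 *.gz | xjobs -j 2 gunzip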

This is an excellent command for my dual-core, dual-chip AMD box.


3 comments

1 marco September 12, 2008 at 9:07 pm

But is it needed? Isn’t that the job of the scheduler?


2 Ole Tange June 7, 2010 at 1:55 pm

marco: If you have a hostname that you want to look up the IP address of, you can do:

host hostname

If you have 10, you can run them simultaneously by appending '&'. That is much faster than running them sequentially:

host hostname1 &
host hostname2 &
...
host hostname10 &

But if you have 1,000,000 hostnames, then you cannot just run a million jobs in parallel. You need something to make sure you only run, say, 100 jobs in parallel. This is the task that xjobs and GNU Parallel solve.
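A sketch of that idea (hostnames.txt is a hypothetical file with one hostname per line; both tools take a -j option to cap the number of simultaneous jobs, but check your man pages):

# at most 100 lookups at a time with xjobs
xjobs -j 100 host < hostnames.txt

# roughly the equivalent with GNU Parallel
cat hostnames.txt | parallel -j 100 host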


3 Ole Tange June 7, 2010 at 1:50 pm

xjobs deals badly with special characters (such as space, ' and "). To see the problem, try this:

touch important_file
touch 'not important_file'
ls not* | xjobs rm
mkdir -p '12" records'
ls | xjobs rmdir

GNU Parallel http://www.gnu.org/software/parallel/ does not have that problem. It also lets you distribute jobs to remote machines.
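For comparison, a sketch of the same test with GNU Parallel, which treats each input line as a single, properly quoted argument:

touch important_file
touch 'not important_file'
ls not* | parallel rm # removes only 'not important_file'
mkdir -p '12" records'
ls -d *records* | parallel rmdir # the embedded double quote is handled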


