Get more juice out of a multiprocessor system with xjobs

I have already written about how to run commands on multiple Linux or UNIX servers. Now Joe ‘Zonker’ Brockmeier shows us how to use the xjobs utility to run multiple jobs in parallel on a multiprocessor machine 🙂


From the article:
Ever feel like you’re not getting the most out of your multiprocessor machine? The xjobs utility allows you to schedule several processes to run simultaneously to make the most of your system’s resources.

Xjobs takes a list of arguments from standard input and passes them to a utility, or takes a list of commands from a script, and then runs the jobs in parallel. If you have a multiprocessor machine, xjobs will automatically run one job per processor by default. For instance, on a dual-CPU machine, if you run ls -1 *gz | xjobs gunzip, xjobs will gunzip two files at a time by default. If you run the same command on a quad-CPU machine, it will gunzip four files at a time by default, until it runs out of files to process.

This is an excellent command for my dual-core, dual-chip AMD box.
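Here is a quick sketch of how that looks in practice (the -j flag used below to cap the number of parallel jobs is my assumption; check your xjobs man page before relying on it):

ls -1 *.gz | xjobs gunzip        # one gunzip per CPU by default
ls -1 *.gz | xjobs -j 4 gunzip   # assuming -j caps the run at 4 parallel jobs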


3 comments
  • marco Sep 12, 2008 @ 21:07

    But is it needed? Isn’t that the job of the scheduler?

    • Ole Tange Jun 7, 2010 @ 13:55

      marco: If you have a hostname that you want to look up the IP address of, you can do:

      host hostname

      If you have 10 hostnames, you can run them simultaneously by appending '&'. That is much faster than running them sequentially.

      host hostname1 &

      host hostname10 &

      But if you have 1,000,000 hostnames, then you cannot just run 1 million jobs in parallel. You need something to make sure you only run, say, 100 jobs in parallel. This is the task that xjobs and GNU Parallel solve.
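      For example (a sketch of the idea: -j is assumed to cap concurrency for xjobs, -j does the same for GNU Parallel, and hostnames.txt is a hypothetical file with one hostname per line):

      cat hostnames.txt | xjobs -j 100 host      # at most 100 host lookups at a time
      cat hostnames.txt | parallel -j 100 host   # the same cap with GNU Parallel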

  • Ole Tange Jun 7, 2010 @ 13:50

    xjobs deals badly with special characters (such as space, ' and "). To see the problem, try this:

    touch important_file
    touch 'not important_file'
    ls not* | xjobs rm       # the space makes xjobs remove important_file instead
    mkdir -p '12" records'
    ls | xjobs rmdir         # the quote and space in the name confuse xjobs

    GNU Parallel http://www.gnu.org/software/parallel/ does not have that problem. It also lets you distribute jobs to remote machines.
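    For comparison, here is a sketch of the same test with GNU Parallel, which quotes each input line before running the command, so the names above are passed as single arguments:

    ls not* | parallel rm       # removes only 'not important_file'
    ls | parallel rmdir         # handles the directory name with the quote and space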
