BASH Shell Redirect Output and Errors To /dev/null

How do I redirect output and errors to /dev/null under bash / sh shell scripting? How do I redirect the output of stderr to stdout, and then redirect this combined output to /dev/null device? In Unix, how do I redirect error messages to /dev/null?

You can send output to /dev/null by using the command >/dev/null syntax. However, this will not work when the command writes to standard error (FD #2).
Tutorial details
Difficulty: Easy
Root privileges: No
Requirements: bash/ksh
Time: 1m
So you need to modify >/dev/null as follows to redirect both output and errors to /dev/null.

Syntax to redirect error and output messages to /dev/null

The syntax discussed below works with Bourne-like shells, such as sh, ksh, and bash:

$ command > /dev/null 2>&1
$ ./script.sh > /dev/null 2>&1
$ ./example.pl > /dev/null 2>&1

OR

command &>/dev/null
job arg1 arg2 &>/dev/null
/path/to/script arg1 &>/dev/null
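A minimal sanity check of both forms (a sketch: the path below is assumed not to exist, and note that the &>/dev/null shortcut is a bash extension, not POSIX sh):

```shell
# noisy() writes to both streams: echo goes to stdout, and ls of a
# path assumed missing writes an error to stderr.
noisy() {
  echo "to stdout"
  ls /no/such/path
}

# Classic Bourne form: both streams discarded; only the trailing "ok"
# from the command substitution survives.
captured=$(noisy > /dev/null 2>&1; echo ok)

# Bash shortcut: same effect.
captured2=$(noisy &>/dev/null; echo ok)

echo "$captured $captured2"
```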

You can also use the same syntax for all your cron jobs to avoid email notifications and output/error messages:
@hourly /scripts/backup/nas.backup >/dev/null 2>&1
OR
@hourly /scripts/backup/nas.backup &>/dev/null

Redirect both standard error and standard out messages to a log file

You can always redirect both standard error (stderr) and standard output (stdout) text to an output file or a log file by typing the following command:

command > file 2>&1
/path/to/my/cool/appname > myapp.log 2>&1
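A sketch of the log-file form, using a brace group in place of a real application (mktemp and the two messages are illustrative):

```shell
# Create a scratch log file.
log=$(mktemp)

# The group writes one line to stdout and one to stderr; the single
# redirection pair sends both into the log.
{
  echo "normal message"          # stdout
  echo "error message" >&2       # stderr
} > "$log" 2>&1

# Both lines should have landed in the log.
wc -l < "$log"
```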

Want to close stdout and stderr for the command being executed on a Linux/Unix/BSD/OSX bash shell?

Try the following syntax:

## Thanks http://www.cyberciti.biz/faq/how-to-redirect-output-and-errors-to-devnull/#comment-40252 ##
command 1>&- 2>&-
 
## Note: the additional '&' at the end of the job puts it in the background ##
job 1>&- 2>&-  &
command 1>&- 2>&-  &
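Closing a descriptor with >&- is not the same as redirecting it to /dev/null: a later write to the closed stream fails outright. A small sketch of the difference:

```shell
# Closing stderr does not stop a command that only writes to stdout.
echo "stderr closed, stdout still works" 2>&-

# Writing to a CLOSED stdout fails (bash reports a write error, which
# we discard here); redirecting to /dev/null would have succeeded.
if ! echo "lost" 1>&- 2>/dev/null; then
  echo "write to closed stdout failed"
fi
```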

See man pages: ksh(1)

29 comments
  • sherikrot Feb 11, 2009 @ 22:05

    Another way to do it:

    $ command &>/dev/null

  • Giuseppe Feb 12, 2009 @ 8:12

    Or you can close stdout and stderr for the command being executed:

    $ command 1>&- 2>&-

    • Jose Torres Oct 15, 2011 @ 20:27

      Remember to add an additional & at the end of the statement to run the command in the background. Thank you Giuseppe for the tip.

    • Gelaxy Dec 20, 2016 @ 17:06

      What do you mean by “close stdout and stderr” ?

      Thanks

  • Jonathan May 26, 2009 @ 21:31

Thanks! I was searching for how to resolve this problem, and your solution worked perfectly for me!

  • Frank Jun 30, 2009 @ 17:15

I need a command in my bash script to remove some (not all) of the contents of directory2.
The script does NOT run as root, which works because it removes the correct files but not the root-level stuff in directory2 (that I don’t want to remove).
Problem is users get confused by the “permission denied” msgs output by the “rm”. So…
I tried to redirect the stderr & stdout to /dev/null this way:
rm * /directory1/directory2/ > 2&>1 /dev/null
It kept changing /dev/null from a special file, and other users need crw-rw-rw-.
Will the recommended approaches allow me to redirect to /dev/null without messing it up for others?

    • Martin Jun 2, 2014 @ 4:23

You could use find instead to filter out the files you don’t want to delete, or only delete files matching a pattern:

      Delete all files except those with “attachments” in the name:
      # find . ! -name '*attachments*' -exec rm -v {} \;

      Delete all files with “attachments” in the name:
      # find . -name '*attachments*' -exec rm -v {} \;

Find is very versatile; it’s pretty cool what you can achieve with find.

  • Henry Apr 14, 2010 @ 16:53

    how does one redirect output from text file processing to a script file that uses the command line variable $1.

    file iplist has a long list of IP’s on the network and i need to send this to a script that creates a file with the ping info.

    script says: ping $1 > $1
    Please assist if possible

  • SilversleevesX Jul 20, 2010 @ 4:16

    How reliable, if that’s the word I’m looking for, is ending a particular command in a script with a redirect like “2>/dev/null” ? What have folks’ experiences been with the different commands and bash/sh versions when trying it this way?

    I know it’s not recommended, but for someone like myself, with scripts they either run daily or don’t run for months and then go through a spate of executing them two and three times a day (only to go back to seldom running them until the next time it happens), it would be very convenient and not too too anxiety-producing to run a script and know that whatever passable or critical errors it comes up with are being suppressed.

    I’m much more inclined to put up with circumstances after the fact, and I seldom write anything that’s too destructive (on the system or OS/hardware install and performance level, at any rate) for a little error like Exiv2 complaining about some JPG file’s Photoshop IFD entry being out of bounds.

    So share up, coders and newbies. :)

    BZT

  • Saartube Jan 19, 2011 @ 10:31

    Thank you :))

  • ciccio Oct 2, 2011 @ 9:11

    Hi,
how can I redirect output to /dev/null BUT show errors on stdout?
    I mean: I want to launch a command:
    – if all goes good —> no output
    – if something goes wrong —> output of errors

    Thanks,
    Ciccio

  • SilversleevesX Oct 2, 2011 @ 16:07

    ciccio –
    I think it would be the opposite of sending errors to the bucket.

    Something like:
    (your_command) 1>/dev/null
should leave errors alone, that is, going to stdout where you can see them. I’m sure you have something in mind where both good and bad output would normally go to stdout.

    BZT

  • josch Oct 5, 2011 @ 23:16

    ciccio, the order of the redirection counts.
    use:
    command 2>&1 1>/dev/null

    • 🐧 nixCraft Oct 6, 2011 @ 0:54

No, it does not matter. So the following two are the same command:

      command 2>&1 1>/dev/null

      AND

      command 1>/dev/null 2>&1
      • Anonymous Aug 25, 2012 @ 19:33

        Hello,

        The order is important :

        $ ls non_existing_folder 1>/dev/null 2>&1

        (no output)

        $ ls non_existing_folder 2>&1 1>/dev/null
        ls: non_existing_folder: No such file or directory
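The comment above is right: redirections are processed left to right, and 2>&1 duplicates whatever stdout points at *at that moment*. A sketch (the path is assumed not to exist):

```shell
# 2>&1 first: stderr follows the OLD stdout (the terminal / command
# substitution), so the error message is captured.
err_first=$(ls /no/such/dir 2>&1 1>/dev/null)

# 2>&1 last: stdout already points at /dev/null, so stderr follows it
# there and the error is discarded.
err_last=$(ls /no/such/dir 1>/dev/null 2>&1)

[ -n "$err_first" ] && echo "2>&1 first: error captured"
[ -z "$err_last" ]  && echo "2>&1 last: error discarded"
```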

  • smilyface Oct 8, 2012 @ 14:04

echo "open 192.168.1.10 8080" | telnet | grep --color=auto "Connected to"
    gives the following output:
    ——————————————————-
    Connection closed by foreign host.
    Connected to 192.168.1.10 (192.168.1.10)
    ——————————————————-

    How can I get rid of “Connection closed by foreign host.” ?

    • neonatus Oct 17, 2012 @ 19:29

      @smilyface

      you can close (omit) the stderr output from telnet command
echo "open 192.168.1.10 8080" | telnet 2>&- | grep --color=auto "Connected to"

  • siva Sep 13, 2013 @ 6:21

    Hi

    I tried like below

    ping 127.0.0.1 > /dev/null 2>&1
but I am getting the error "Ambiguous output redirect".
Please advise.

    • ap Apr 26, 2014 @ 13:14

      Put in bash script:
      exec 2>/dev/null
      before your commands. And avoid redirection in the command itself.
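As a sketch of that tip: `exec` with no command rewires the shell’s own descriptors for the remainder of the script, so individual commands need no redirection (a temp file stands in for /dev/null here so the effect is observable; mktemp is illustrative):

```shell
# Send the script's stderr to a scratch file from this point on.
errlog=$(mktemp)
exec 2>"$errlog"

# This error message no longer reaches the terminal; it lands in the log.
ls /no/such/dir

echo "script continues"
```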

  • Hugues Nov 12, 2013 @ 16:33

    l often do the following and I do not want an error (just a 0 length file)
    You get a valid output if the command works, otherwise the error is sent to /dev/null

    file=`ls doesthisfileexist 2>/dev/null`
if [ -n "$file" ] ; then
    do something
    fi

    • LiPi Mar 4, 2014 @ 17:40

      Why not:

if [ ! -f "$FILE" ]
      then
          echo "File does not exist"
      fi
      
  • Tom Dec 27, 2014 @ 18:20

I just stumbled upon this article… FYI `command > /dev/null 2>&1` won’t work in every scenario. For example, this will still output an error message:

    ps -ef | grep | grep ps > /dev/null 2>&1

    • 🐧 nixCraft Feb 2, 2015 @ 20:14

      Try:

      (ps -ef | grep | grep ps) > /dev/null 2>&1
      • Eric Sebasta May 15, 2015 @ 15:04

        That is a pretty slick little trick. Didn’t know that one. Thanks!
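The trick in this thread works because redirecting only the last command in a pipeline leaves the earlier stages’ stderr untouched, while a subshell (...) lets one redirection cover the whole pipeline. A sketch (paths assumed missing):

```shell
# Only grep's streams are redirected, so ls's error escapes the pipeline;
# the surrounding brace group catches it so we can inspect it.
leaked=$( { ls /no/such/a | grep x > /dev/null 2>&1; } 2>&1 )

# Wrapping the whole pipeline in a subshell silences every stage.
silenced=$( (ls /no/such/a | grep x) > /dev/null 2>&1 )

[ -n "$leaked" ]   && echo "ls error escaped the per-command redirect"
[ -z "$silenced" ] && echo "subshell redirect caught everything"
```

A brace group `{ ...; } > /dev/null 2>&1` achieves the same coverage without forking an extra subshell.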

  • ma thesh Feb 2, 2015 @ 18:16

    How to get the error help in shell window

  • Alex Oct 19, 2015 @ 10:02

    Thanks!
    Exactly what i wanted!

  • Shyam Nov 18, 2015 @ 16:10

    Hi,

    Please tell me how to redirect the output from a script to a log file so that i can save all the details which i am capturing in the script using read command.

    Here is a snippet of my code:
echo "Enter the number"
    read $N > text
    ….

    If i open text in vi i am getting blank lines and i have saved my script as number.sh and done chmod on the script to give it user permissions as well.

    • Anonymous Sep 21, 2020 @ 11:26

      Add the following at the top of your script:

      exec 3>&1 4>&2
      trap 'exec 2>&4 1>&3' 0 1 2 3
      exec 1>/path/to/log.txt 2>&1
      


  • Kevin Sullivan Sep 29, 2020 @ 20:51

    FYI to use standard out AND error out in a bash script, you cannot have a space between > and &. It’ll give you some wacky error. Just like written on his page, be sure to use >& so that they are together. I just figured this out after battling with the error I previously mentioned. Good luck!
