UNIX: Loop Through Files In A Directory

How do I loop through files in a directory under UNIX like operating systems?

The simplest form is as follows:

for file in /path/to/file1.txt /path/to/file2.txt /path/to/file3.txt
do
 # do something on $file
 cat "$file"
done
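If you want every file in a directory rather than a fixed list, use a glob. A minimal sketch (assuming /path/to/dir exists; the -f test skips anything that is not a regular file):

for file in /path/to/dir/*
do
 # skip directories, sockets, and other non-regular files
 [ -f "$file" ] || continue
 cat "$file"
done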

You can process all command-line arguments passed to a script (note "$@" instead of $*, so file names with spaces survive word splitting):

for file in "$@"
do
 # do something on $file
 [ -f "$file" ] && cat "$file"
done
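For example, if the loop above is saved as a script called showfiles.sh (a hypothetical name), each argument becomes $file in turn, and "$@" keeps the name with spaces as a single argument:

chmod +x showfiles.sh
./showfiles.sh /etc/hostname "/tmp/file with spaces.txt"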

Or simply process all *.css files in the current directory:

for file in *.css
do
 # do something on "$file"
 cat "$file" >> /var/www/cdn.example.com/cache/large.css
done
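One caveat: if no *.css file exists, bash passes the unexpanded pattern *.css itself through the loop. You can guard against that with the nullglob shell option (a bash-only sketch):

shopt -s nullglob # a non-matching glob now expands to nothing
for file in *.css
do
 cat "$file" >> /var/www/cdn.example.com/cache/large.css
done
shopt -u nullglob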

You can also read file names stored in a text file called delete.txt (note the read -r and empty IFS, which take care of file names with spaces):

while IFS= read -r f <&3
do
      #do something with "$f"
      rm -f "$f"
done 3< delete.txt
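The same while-read pattern can consume NUL-delimited output from find, which is safe even for file names containing newlines (a sketch; the path and -name pattern are placeholders):

while IFS= read -r -d '' f
do
      # do something with "$f"
      rm -f "$f"
done < <(find /path/to/dir -type f -name '*.tmp' -print0)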

Make sure you always put $f or $file in double quotes. Here is another sample script; it goes through /home/wwwdata/{example.com,example.net,nixcraft.com} and processes all files using a for loop:

#!/bin/bash
# sync all domains to backup server at midnight 
domains="example.com example.net nixcraft.com cyberciti.biz"
me="${0##*/}"
now=$(date +"%d-%m-%Y_%S")
log="/tmp/${me}.${now}"
latest="/tmp/latest"
logdata(){
	local f="$1"
	local d="$2"
	[[ "$d" != "" ]] &&      echo "                            $d"
	[[ "$f" == "start" ]] && echo "--------------------------------------------------------------"
	[[ "$f" == "end" ]] &&   echo "=============================================================="
 
}
source "/usr/local/nixcraft/mgmt/ssh/.keychain/$HOSTNAME-sh"
for d in $domains
do
	logdata "start" "$d @ $(date)"
        [ -d "/home/wwwdata/$d/" ] && { 	cd "/home/wwwdata/$d/";  
 	/usr/bin/rsync  --exclude='cache/cache-*'\
			--exclude '.bash_history' \
			--exclude '.viminfo' \
			--exclude 'cache/*_mutex.lock' \
			--exclude 'broken-link-checker*' \
                        --exclude 'tmp/*'
			-a --delete . backup@nasbox.nixcraft.net.in:/raid6/$HOSTNAME/ ;
         } 
	logdata "end" "$d @ $(date)" 
done &> $log
[ -f $latest ] && /bin/rm -f $latest
ln -s $log $latest
mail -s "Backup $HOSTNAME" admin@clients.nixcraft.net.in < $latest
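To actually run this at midnight as the comment says, you could add a crontab entry (assuming the script is installed as /usr/local/bin/domains.backup.sh; adjust the path to wherever you saved it):

0 0 * * * /usr/local/bin/domains.backup.sh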

3 comments
  • Benjamin Jul 2, 2010 @ 1:14

    For most cases the -exec option of find is very helpful:
    find . -name '*.css' -exec cat {} >> large.css \;

    # delete certain files younger than 60 minutes:
    find /wherever/you/want -type f -name '*substring*' -mmin -60 -delete

  • yoander (sedlav) Jul 2, 2010 @ 13:29

    Another way, using xargs
    find . -type f -print0 | xargs -0 -I {} cp -v {} /tmp
