A common problem I’ve come across in bash is that there doesn’t seem to be a way to check for the existence of multiple files in a directory. If you wanted to find out whether a particular file exists, you could do the following:
#!/bin/bash
if [ -e ~/files/x.txt ]; then
    echo "Found file"
else
    echo "Did not find file"
fi
me@pc:~$ if [ -e ~/files/x.txt ];then echo "Found file";else echo "Did not find file";fi
Did not find file
me@pc:~$ echo hello > files/x.txt
me@pc:~$ if [ -e ~/files/x.txt ];then echo "Found file";else echo "Did not find file";fi
Found file
me@pc:~$
See http://www.faqs.org/docs/bashman/bashref_68.html
No problem there. OK, now let’s try to do the same thing using a wildcard:
#!/bin/bash
if [ -e ~/files/* ]; then
    echo "Found file"
else
    echo "Did not find file"
fi
me@pc:~$ rm files/*
me@pc:~$ if [ -e ~/files/* ];then echo "Found file";else echo "Did not find file";fi
Did not find file
me@pc:~$ echo hello > files/x.txt
me@pc:~$ if [ -e ~/files/* ];then echo "Found file";else echo "Did not find file";fi
Found file
me@pc:~$
Everything looks fine *until* there is more than one file that meets the test criteria.
me@pc:~$ echo hello > files/y.txt
me@pc:~$ if [ -e ~/files/* ];then echo "Found file";else echo "Did not find file";fi
bash: [: /home/me/files/x.txt: binary operator expected
Did not find file
me@pc:~$
So what’s happening? Well, bash expands the wildcard into a list of file names, so the test gets more than one argument where it expects exactly one, and it bombs out. The way I got around this was to run an ls command using the same glob and ignore the output by redirecting both the standard output and standard error to /dev/null:
ls -1 ~/files/* > /dev/null 2>&1
if [ "$?" = "0" ]; then
    echo "Found file"
else
    echo "Did not find file"
fi
So let’s try it.
me@pc:~$ rm files/*
me@pc:~$ ls -1 ~/files/* > /dev/null 2>&1; if [ "$?" = "0" ]; then echo "Found file"; else echo "Did not find file";fi
Did not find file
me@pc:~$ echo hello > files/x.txt
me@pc:~$ ls -1 ~/files/* > /dev/null 2>&1; if [ "$?" = "0" ]; then echo "Found file"; else echo "Did not find file";fi
Found file
me@pc:~$ echo hello > files/y.txt
me@pc:~$ ls -1 ~/files/* > /dev/null 2>&1; if [ "$?" = "0" ]; then echo "Found file"; else echo "Did not find file";fi
Found file
me@pc:~$
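For completeness, here is a bash-specific alternative that avoids spawning ls at all. This is a sketch, and it assumes bash (the compgen builtin is not in plain sh): compgen -G expands a glob and exits 0 only if the pattern matches at least one name.

```shell
# compgen -G succeeds (exit 0) when the glob matches something,
# regardless of how many files match.
if compgen -G "$HOME/files/*" > /dev/null; then
    echo "Found file"
else
    echo "Did not find file"
fi
```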
EDIT: Unescaped the html
Thanks – just the thing
Thanks, just what I was looking for.
Tricky thing, the wildcard check: it does not work if there are *many* files matching…
If ls gives an ‘Argument list too long’ error, this behaves as if there were no matches.
You need to unescape your html entities!
Thanks Joe.
paul… if you have that many files in a directory, you probably need to use find and xargs.
The problem that I’m having is that I only want to find files in the current directory, and I don’t want the ‘ls’ to return true if a sub-directory exists. I’m thinking that my best bet is to use find in this case as well:
if [ $(find . -maxdepth 1 -type f | wc -l) == "0" ]
then echo "no files"
else echo "files exist"
fi
function exists {
    for x in "$@"; do
        test -e "$x" && return 0
    done
    return 1
}
# THEN #
if exists foo*; then echo foostar exists; fi
I found all of the above to be too inefficient. Try this instead:
for i in filename*; do FOUND=$i; break; done
if [ "$FOUND" == 'filename*' ]; then
    echo "No files found matching wildcard."
else
    echo "Files found matching wildcard."
fi
for i in filename*; { [ -e "$i" ] && break || i=''; }
[ "$i" ] && echo "$i found 1+"
Hi, I had the same problem. Maybe you don’t want any error output if no files are found (e.g. “ls: cannot access filename: No such file or directory”), just as I wanted. So for a clean output I used:
if [ -e `echo $mySearch | cut -d " " -f1` ]; then
Hope this helps you guys! 😀
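Spelled out, that commenter’s idea might look like this. This is a sketch only: mySearch and the directory name are assumptions, and the approach breaks on file names containing spaces, since cut splits on them.

```shell
mySearch="$HOME/files/*"
# Unquoted, $mySearch glob-expands; cut keeps only the first word,
# so the -e test gets a single operand even when many files match.
# If nothing matches, the literal pattern survives and -e fails.
if [ -e "$(echo $mySearch | cut -d ' ' -f1)" ]; then
    echo "Found file"
else
    echo "Did not find file"
fi
```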
You don’t need to use the [] operators in this case – ‘if’ just tests the exit status of the command it executes, where zero is true and non-zero is false. ‘ls’ returns non-zero if there are no files matching the specified pattern.
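For example, a sketch of that shorter form (directory name taken from the post):

```shell
# 'if' branches on ls's exit status directly; no [ ] test needed.
if ls ~/files/* > /dev/null 2>&1; then
    echo "Found file"
else
    echo "Did not find file"
fi
```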
A brute force way is to just try to delete with the wildcard. If you don’t care about errors from trying to delete files that don’t exist, use ‘> /dev/null 2>&1’.
So here I’m saying: delete all the files matching the pattern:
rm /tmp/${TMPNAME}-[0-9]* > /dev/null 2>&1
This is because I generate temp files for my scripts in a slick way so I can easily manage them. I use the ${RANDOM} variable to generate the file names.
I would seriously advise people not to do what Michael suggests above.
Why the use of ‘-1’ with the ‘ls’ examples? I mean, I know that with this option the ‘ls’ output is printed in one column, but how does this differ here from a plain ‘ls’, since the only thing we are looking for is a positive or negative test?
No real difference – the “-1” just forces one entry per line. Just a habit.
Thanks Ken, the code snippet and discussion were of help.
My favourite:
[ "$(find . -maxdepth 1 -name '*.jpg' -print -quit)" ] && do-something
The [ ] test returns true (exit status 0) if it contains a non-zero-length string.
[ -n "$(find ...)" ] would be equivalent if you find it cozier to specify the test used. find searches for the file name glob pattern (*.jpg here) and outputs just the first hit.
I find this way to be superior for several reasons:
It's quite portable across shells
I find it easy enough to remember (once you're familiar with [, $( and find).
It's short enough to type in a oneliner.
It's very efficient. Other suggestions involve either the expansion of a glob pattern, or a full ls of the directory, both of which are slow on a directory with many entries.
It can easily be extended to several glob patterns, or take into account any other test/attribute that find knows about.
FWIW, I use this approach in my scripts:
[ -z "$(ls -A ~/files/)" ] && echo 'empty'
[ -n "$(ls -A ~/files/)" ] && echo 'not empty'
That is probably the easiest method, using shell features only, without running any external programs:
FILES=(*.txt)
if [ "$FILES" != "*.txt" ]
then
    echo Files exist
fi
The first line creates an array $FILES containing names of all *.txt files in a directory, or – if there are no *.txt files – a single element containing literally “*.txt”. This is the condition we check in the if command.
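A variant of the same idea (a sketch; nullglob is a bash-only shell option): with nullglob set, an unmatched pattern expands to nothing instead of to itself, so you can test the array length rather than comparing against the literal pattern.

```shell
# With nullglob, FILES is empty when nothing matches *.txt.
shopt -s nullglob
FILES=(*.txt)
if [ ${#FILES[@]} -gt 0 ]; then
    echo "Files exist"
else
    echo "No files"
fi
shopt -u nullglob
```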
Hi Ken
Your code and this discussion helped me very well. Thank you very much
Thanks Ken. This bit of code was a lifesaver. For the life of me, I couldn’t figure out how to check for multiple files until I came across your solution.
Thanks for finally talking about “Finding if one or more files exist in a directory using bash | kenfallon.com”. Liked it!