Find files in multiple folder names

I am trying to use the find command to list all the files from dir1, dir2, dir3 and dir4, which might be anywhere as subdirectories of my cwd. I tried the following with no success:

find . -type f -regextype posix-egrep -regex 'dir1/.+|dir2/.+|dir3/.+|dir4/.+'

I tried posix-extended as well. How can I list these files?



Method 1

If you want to search three folders named foo, bar, and baz for all *.py files, use this command:

find foo bar baz -name "*.py"

So if you want to list the files from dir1, dir2 and dir3, use find dir1 dir2 dir3 -type f.

Or try this: find . \( -name "dir1" -o -name "dir2" \) -exec ls '{}' \;
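The multi-path form above can be tried end to end; this is a minimal sketch using a throwaway directory tree (all names are made up for the demo):

```shell
# Build a sandbox and list files from several starting points at once.
tmp=$(mktemp -d)
mkdir -p "$tmp/dir1" "$tmp/dir2" "$tmp/dir3"
touch "$tmp/dir1/a.py" "$tmp/dir2/b.py" "$tmp/dir3/c.txt"
cd "$tmp"

# find accepts multiple starting directories directly:
find dir1 dir2 dir3 -type f -name "*.py"
# Prints dir1/a.py and dir2/b.py; dir3/c.txt is filtered out by -name.
```

This form only works when you already know where the directories are; the later methods handle the "anywhere under cwd" case from the question.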

Method 2

It’s best to use the -path directive:

find . -type f \( -path '*/dir1/*' -o -path '*/dir2/*' -o -path '*/dir3/*' -o -path '*/dir4/*' \)

Which means: find all files under the current directory that have 'dir1', 'dir2', 'dir3' or 'dir4' anywhere in their path. Note that the parentheses must be escaped for the shell, and grouping the -path tests keeps -type f applying to every branch (-a binds tighter than -o).
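A minimal sketch of the -path approach against a throwaway tree (directory names here are illustrative):

```shell
# -path matches against the whole path find prints, so '*/dir1/*'
# catches dir1 at any depth below the starting point.
tmp=$(mktemp -d)
mkdir -p "$tmp/a/dir1" "$tmp/a/dir5"
touch "$tmp/a/dir1/f1" "$tmp/a/dir5/f5"
cd "$tmp"

find . -type f \( -path '*/dir1/*' -o -path '*/dir2/*' \)
# Prints ./a/dir1/f1; ./a/dir5/f5 does not match either -path pattern.
```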

Method 3

Just to let everyone know: adding .*/ before each dir solved the problem, since -regex matches against the full path that find prints, not just the basename.

Method 4

This is my first idea after reading the previous answers:

find . -type f -regextype posix-egrep -regex ".*/(dir1|dir2|dir3|dir4)/.+"

This takes into account that the regex must match the whole path, and it is easier to understand.
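A quick way to convince yourself the grouped alternation works (again assuming GNU find; the tree below is invented for the demo):

```shell
# The alternation (dir1|dir2|dir3|dir4) sits between .*/ and /.+ so
# the whole path is covered, at any depth.
tmp=$(mktemp -d)
mkdir -p "$tmp/dir2/deep" "$tmp/dir9"
touch "$tmp/dir2/deep/hit" "$tmp/dir9/miss"
cd "$tmp"

find . -type f -regextype posix-egrep -regex ".*/(dir1|dir2|dir3|dir4)/.+"
# Prints ./dir2/deep/hit; ./dir9/miss does not match.
```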

Method 5

A sample of quickly finding particular files (access_log) in multiple locations defined by a wildcard (home directories) plus the general apache2 log directory, also applying name filtering to exclude ssl logs and gzipped old logs. It does not directly answer the question here, but it might be useful for someone who found these instructions (like me).

find / \( -path "*var/log/apache2*" -o -path "*home/*/logs*" \) -type f -name "*access_log" ! -name "*ssl*"

Be careful with spaces, especially around \( and \).

By the way, I used it for apachetop, to find most active websites in my server.

apachetop -d1 -s9 -p $(find / \( -path "*var/log/apache2*" -o -path "*home/*/logs*" \) -type f -name "*access_log" ! -name "*ssl*" -print | sed 's/^/-f '/)

Method 6

You don't need to specify dir1 through dir4 by name if you use -regextype:

find . -type f -regextype sed -regex ".*/dir[1-4]/[^/]*"
  • -type f: find files, not directories
  • -regextype sed: use the sed regex dialect
  • -regex: match the regex against the entire path
  • ".*/dir[1-4]/[^/]*": .* at the beginning matches any leading path (any characters, including slashes); dir[1-4] means "dir" followed by a digit between 1 and 4; /[^/]* then matches a final component containing no further slash, so only files directly inside dir1-dir4 are returned, and files in any subdirectories within dir1-dir4 are ignored.

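A throwaway tree to test it out (assuming GNU find for -regextype; names are invented for the demo):

```shell
# dir3/sub/c is excluded because [^/]* forbids a further slash
# after the dir1-4 component.
tmp=$(mktemp -d)
mkdir -p "$tmp/dir1" "$tmp/dir3/sub"
touch "$tmp/dir1/a" "$tmp/dir3/b" "$tmp/dir3/sub/c"
cd "$tmp"

find . -type f -regextype sed -regex ".*/dir[1-4]/[^/]*"
# Prints ./dir1/a and ./dir3/b; ./dir3/sub/c is skipped.
```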

Method 7

I know you asked about find specifically, but just to show other options, you can also use xargs:

find . -type d | grep -E "dir1$|dir2$" | xargs ls  

find . -name "dir1" -or -name "dir2" | xargs ls
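One caveat worth sketching: the pipelines above break on directory names containing spaces, because xargs splits on whitespace by default. With GNU find, -print0 paired with xargs -0 is the robust variant (the tree below is made up for the demo):

```shell
# Null-delimited handoff from find to xargs survives spaces in names.
tmp=$(mktemp -d)
mkdir -p "$tmp/dir1" "$tmp/dir2"
touch "$tmp/dir1/file one" "$tmp/dir2/file2"
cd "$tmp"

find . \( -name dir1 -o -name dir2 \) -type d -print0 | xargs -0 ls
# ls receives both directories as intact arguments and lists each,
# including the space-containing "file one".
```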

You can have a file named “folders” that contains something like:
$ cat folders

Then, you can do something like this:
$ cat folders | xargs -I % find . -type d -name % | xargs ls
file1  file2  file3  file4

file1  file2  file3  file4

file1  file2  file3  file4

file1  file2  file3  file4

In my opinion, xargs is more versatile than find -exec.
You can also do some crazy stuff like:
$ cat << EOF | xargs -I {} find ~ -name "{}" | xargs ls
> dir1
> dir2
> dir3
> EOF
file1  file2  file3  file4

file1  file2  file3  file4

file1  file2  file3  file4

All methods were sourced from their original authors and are licensed under cc by-sa 2.5, cc by-sa 3.0 and cc by-sa 4.0.
