views: 1481
answers: 4

I often use the excellent `find` program from a Bash shell to list files matching certain filters. For example, in a Subversion (SVN) working copy, I sometimes want to recursively list all files while excluding the `.svn` subdirectories, as follows:

find . -name '.svn' -prune -o -type f -print

Today, I wanted to do something similar, but I also wanted to affect the order in which directory contents were listed: I wanted 'ordinary' files to be followed by sub-directories (and then the recursive contents). There does not appear to be an option for this.

The ls (list) command has an option to list recursively. It has many sorting options, including by file name, access time, and size, but none by classification, although the -p option will annotate directories.

Now, I could write, e.g., a Python script to do exactly what I want. However, find already does almost everything I need. Usually within a Bash shell, it is possible to combine programs to do just what you want: each program, like find, sort, uniq, ls, wc, performs one simple task, but does it well. Not every program needs to be able to sort, because sort can sort. So, really, I'm just curious...

My question is, do you know if there's a way to do what I want: to both filter and sort a recursive file listing, just by combining Bash programs?

For example, find gives me the files in this, alphabetical, order:

a.txt
b/file1.txt
b/subdir/file2.txt
b/then_file3.txt
c.txt
d/file4.txt
e.txt

but I'd prefer them in this order, where within each directory, the ordinary files are listed alphabetically first, followed by the directories, again alphabetically:

a.txt
c.txt
e.txt
b/file1.txt
b/then_file3.txt
b/subdir/file2.txt
d/file4.txt

(I am a Windows user, but I run a Bash shell in Cygwin.)

Thanks.

+4  A: 

Use a nested find. The outer find locates all directories and executes an inner find that lists just the files directly inside each one:

find . -type d -exec find {} -type f -maxdepth 1 \;
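To also get the files-before-subdirectories ordering from the question, a sort can be applied at each level. This is only a sketch, assuming GNU find/sort and path names without embedded newlines; the `demo` tree below is hypothetical, mirroring the question's example:

```shell
# Build a small sample tree matching the question's example (hypothetical names).
mkdir -p demo/b/subdir demo/d
touch demo/a.txt demo/c.txt demo/e.txt \
      demo/b/file1.txt demo/b/then_file3.txt demo/b/subdir/file2.txt \
      demo/d/file4.txt

# Sort the directories, then list (and sort) only the plain files
# directly inside each one.
find demo -type d | sort | while read -r dir; do
  find "$dir" -mindepth 1 -maxdepth 1 -type f | sort
done
```

Note that because the global options -mindepth/-maxdepth come before the -type test here, GNU find emits no ordering warning, so -nowarn isn't needed.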
R Samuel Klatchko
This works, but issues warnings. They can be turned off with the `-nowarn` option.
Pavel Shved
Thanks! I thought it would probably be possible somehow. So, my SVN filtering example would be: find . -name '.svn' -prune -o -type d -exec find {} -type f -nowarn -maxdepth 1 \;
Rhubbarb
When I try this, the result is unsorted (by dir and within dir levels).
Dennis Williamson
+1  A: 
tree -fi

On Ubuntu and CentOS (and Red Hat, Fedora...) it's contained in its own package, called "tree" (duh). Ubuntu doesn't seem to install that package by default; the others do.

EDIT: Sorry, didn't realize you're using Cygwin. Well, it's ported to Cygwin too. If it's not in the default set, see here.

Florin Andrei
You can add a `-a` to get the hidden files, too.
Dennis Williamson
A: 

You didn't state how you wanted to handle things if the directories go more than one level deep. Does something like this do what you're looking for? It does a breadth-first listing, sorted within each depth level:

for ((l = 0; l <= 24; l++)); do find . -mindepth ${l} -maxdepth ${l} -type f | sort; done
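Run against a small sample tree (hypothetical names), this yields each depth level in turn, which shows why the listing is breadth-first rather than the per-directory order asked for:

```shell
# Sample tree (hypothetical names, for illustration only).
mkdir -p demo3/b/subdir
touch demo3/a.txt demo3/b/file1.txt demo3/b/subdir/file2.txt

# Breadth-first: all depth-1 files, then depth-2, and so on,
# each level sorted alphabetically.
for ((l = 1; l <= 3; l++)); do
  find demo3 -mindepth ${l} -maxdepth ${l} -type f | sort
done
```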
shoover
That was difficult to express, so I tried to illustrate it in the example. I *don't* want all files, and then all directories (that's easy). While at a given directory, I want first to list its files, and then to process each subdirectory, recursing into each in turn.
Rhubbarb
+1  A: 

This lists the directories first within each level, then the files, and recurses into each subdirectory doing the same:

ls -lR --group-directories-first
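As a quick illustration, assuming GNU coreutils ls (the `demo2` tree here is a made-up example):

```shell
# Sample tree (hypothetical names).
mkdir -p demo2/b demo2/d
touch demo2/a.txt demo2/c.txt demo2/b/file1.txt demo2/d/file4.txt

# Within each listed directory, subdirectories come first,
# then plain files, each group sorted alphabetically.
ls -R --group-directories-first demo2
```

With a non-GNU ls (e.g. BSD), the --group-directories-first option is unavailable.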

It's upside down from what you wanted, though.

Dennis Williamson