I have made a build system for my web application that rewrites all resource URLs to include the file's revision number (to improve client caching). Currently I run this command for each file to get its revision number:

hg log --template '{rev}\n' path-to-file

Executing hg once per file is really time-consuming. Is there a fast way to list all files in a repository together with their latest revision numbers?

+1  A: 

Coding such a command in Python, either by parsing the output of hg annotate or as a Mercurial extension, should not be too difficult. The following discussion on the Mercurial mailing list seems to provide a reasonable solution, although I have not tried it: http://selenic.com/pipermail/mercurial/2010-February/029947.html
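For reference, a standalone script (rather than an extension) could get the same data from a single hg invocation by asking `hg log` for the files touched by each revision and keeping the highest revision seen per file. This is only a sketch: `parse_log` and `latest_revisions` are hypothetical names, not part of any Mercurial API, and the `{files}` template keyword separates paths with spaces, so this breaks on filenames that contain spaces:

```python
import subprocess


def parse_log(text):
    """Map each file to the highest revision number that touched it.

    Expects lines of the form "<rev> <file> <file> ...". We take the
    max revision per file, so the order of the lines doesn't matter.
    """
    latest = {}
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        rev = int(parts[0])
        for path in parts[1:]:
            if latest.get(path, -1) < rev:
                latest[path] = rev
    return latest


def latest_revisions(repo='.'):
    # One hg invocation for the whole repository instead of one per file.
    out = subprocess.run(
        ['hg', 'log', '--template', '{rev} {files}\n'],
        cwd=repo, capture_output=True, text=True, check=True,
    ).stdout
    return parse_log(out)
```

A build step could then look up each file's number in the returned dict instead of shelling out per file.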

Axelle Ziegler
I found that mailing-list entry, but I'd rather not manage and distribute a Mercurial extension just to do this.
Kimble
A Mercurial extension is nothing more than a Python script anyway :)
Axelle Ziegler
+2  A: 

This will do it, but please see my comment above as to why using this anywhere in your workflow is probably a bad idea. At the very least you should be using the revision hash, not the number, since the hash doesn't change on clone. Also, this isn't terribly efficient, but at least it's only two process instantiations instead of one per file:

hg grep --all '.' | perl -na -F: -e 'next unless ($F[2] eq "+"); print "$F[0] $F[1]\n" unless ($prev eq $F[0]); $prev = $F[0]'

Example:

that 3
file 1
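If you'd rather avoid the Perl dependency, roughly the same filter can be written in awk. This is a sketch under the same assumptions as the one-liner above: `hg grep --all` emits matches newest-first as `file:rev:+/-:line`, and filenames contain no colons:

```shell
hg grep --all '.' | awk -F: '$3 == "+" && $1 != prev { print $1, $2; prev = $1 }'
```

As in the Perl version, only the first `+` line per file is printed, which is that file's most recent revision.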
Ry4an
Thanks! Even though it's not *the* most efficient solution, it's probably a lot faster than my current approach. I'm using the revision number instead of the hash because it looks a bit nicer and the build always takes place on the same machine.
Kimble
Yeah, it should be much faster because it only builds one Python VM instance instead of many, though it still needlessly trolls through every line of every revision of every file. Glad it's working for you (hint!). :)
Ry4an