I have an SVN repository that I use for storing both code and data files (binary & text). I have noticed slower performance as the repository has grown. I would like to understand better what contributes to this.

Obviously, performance depends on the speed at which new data is transferred. I'm interested in performance independent of this (e.g. the time to run svn update when no files have changed).

To what extent is this kind of performance affected by (1) the number of files in the repository, and (2) the size of the files in the repository?

Both will slow things down, but I'm wondering whether one or the other is significantly more important.

+3  A: 

If you have an FSFS repository, there should be no performance degradation for common operations, even after 10,000+ revisions and gigabytes of data.
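If you're not sure which backend a repository uses, the db/fs-type file inside the repository records it (the path below is a placeholder for your actual repository location):

    # Prints "fsfs" for FSFS repositories, "bdb" for Berkeley DB
    cat /path/to/repo/db/fs-type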

RELATED: http://stackoverflow.com/questions/127692/svn-performance-after-many-revisions/131070

More likely, you are seeing something else happening.

Remember: big working copies = lots of disk space = slower client-side operations.

Make sure you aren't checking out the whole repo when you only need a subset. Organize your stuff into subfolders, and only check out the necessary subfolders.
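For instance, with a sparse checkout (SVN 1.5 and later) you can check out a single subfolder, or start from an empty root and deepen only the folders you actually work on; the URL and folder names below are hypothetical:

    # Check out only the subfolder you need instead of the whole repository
    svn checkout http://svn.example.com/repo/trunk/code code-wc

    # Or start with an empty working copy and selectively deepen it
    svn checkout --depth empty http://svn.example.com/repo/trunk trunk-wc
    cd trunk-wc
    svn update --set-depth infinity code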

msemack