Thanks Alex!

I think I can use checkout to optimize at level 2 at least.

Henning

On 02/24/2012 10:36 PM, Alexander Kitaev wrote:
Hello,

The Subversion protocol does not support arbitrary depth levels, so you
may only run an update with infinity depth and then filter out the files
you do not need, or do as you do now: a step-by-step update with
immediates depth.

To get the contents of a set of files in one go, you may use the
SVNRepository.checkoutFiles(...) method. This call does the same as if
you had a working copy with the specified files missing and ran an
update on it.
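A rough sketch of how that call could be wired up (untested; the repository URL and file paths are hypothetical, and the editor below follows SVNKit's ISVNEditor/SVNDeltaProcessor API from memory, so signatures may differ between SVNKit versions):

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.HashMap;
import java.util.Map;

import org.tmatesoft.svn.core.SVNCommitInfo;
import org.tmatesoft.svn.core.SVNException;
import org.tmatesoft.svn.core.SVNPropertyValue;
import org.tmatesoft.svn.core.SVNURL;
import org.tmatesoft.svn.core.io.ISVNEditor;
import org.tmatesoft.svn.core.io.SVNRepository;
import org.tmatesoft.svn.core.io.SVNRepositoryFactory;
import org.tmatesoft.svn.core.io.diff.SVNDeltaProcessor;
import org.tmatesoft.svn.core.io.diff.SVNDiffWindow;

public class BulkFetch {

    /** Collects file contents delivered during the checkoutFiles editor drive. */
    static class ContentCollector implements ISVNEditor {
        final Map<String, byte[]> contents = new HashMap<String, byte[]>();
        private final SVNDeltaProcessor deltaProcessor = new SVNDeltaProcessor();
        private String currentPath;
        private ByteArrayOutputStream currentData;

        public void addFile(String path, String copyFromPath, long copyFromRevision) {
            currentPath = path;
            currentData = new ByteArrayOutputStream();
        }
        public void applyTextDelta(String path, String baseChecksum) throws SVNException {
            // No base text: the server sends full contents as a delta against empty.
            deltaProcessor.applyTextDelta((InputStream) null, currentData, false);
        }
        public OutputStream textDeltaChunk(String path, SVNDiffWindow diffWindow) throws SVNException {
            return deltaProcessor.textDeltaChunk(diffWindow);
        }
        public void textDeltaEnd(String path) throws SVNException {
            deltaProcessor.textDeltaEnd();
        }
        public void closeFile(String path, String textChecksum) {
            contents.put(currentPath, currentData.toByteArray());
        }
        // The remaining editor callbacks are not needed for plain file contents.
        public void targetRevision(long revision) {}
        public void openRoot(long revision) {}
        public void deleteEntry(String path, long revision) {}
        public void absentDir(String path) {}
        public void absentFile(String path) {}
        public void addDir(String path, String copyFromPath, long copyFromRevision) {}
        public void openDir(String path, long revision) {}
        public void changeDirProperty(String name, SVNPropertyValue value) {}
        public void closeDir() {}
        public void openFile(String path, long revision) {}
        public void changeFileProperty(String path, String name, SVNPropertyValue value) {}
        public SVNCommitInfo closeEdit() { return null; }
        public void abortEdit() {}
    }

    public static void main(String[] args) throws SVNException {
        SVNRepository repo = SVNRepositoryFactory.create(
                SVNURL.parseURIEncoded("http://host/repo"));  // hypothetical URL
        // Hypothetical candidate paths, no deeper than depth three:
        String[] paths = { "module/conf.xml", "module/sub/conf.xml" };
        ContentCollector collector = new ContentCollector();
        repo.checkoutFiles(-1 /* HEAD */, paths, collector);
        for (Map.Entry<String, byte[]> e : collector.contents.entrySet()) {
            System.out.println(e.getKey() + ": " + e.getValue().length + " bytes");
        }
    }
}
```

Paths that do not exist in the repository are simply not delivered, which matches the "potentially existing files" case in your question.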

Alexander Kitaev,
TMate Software,
http://subgit.com/ - Svn to Git Migration!
http://svnkit.com/ - Java [Sub]Versioning Library!
http://hg4j.com/ - Java Mercurial Library!
http://sqljet.com/ - Java SQLite Library!



On 24 February 2012 11:33, Henning Blohm <[email protected]> wrote:
Hello,

in my application I need to introspect a given repository up to a maximum
depth of three. Actually, I only care about some specific files that can be
found no deeper than depth three.

Today this is done by listing the first level, then, if applicable, going
down into level two, and finally fishing some more on level three. The data
load is very light, but the round trips may be many, which is why this
approach becomes very slow on remote repositories that expose high latency.
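For reference, the level-by-level scan looks roughly like this (a sketch under assumptions: hypothetical URL and target file name, and SVNRepository.getDir(...) usage from memory). Each getDir call is one network round trip, which is where the latency adds up:

```java
import java.util.Collection;

import org.tmatesoft.svn.core.SVNDirEntry;
import org.tmatesoft.svn.core.SVNException;
import org.tmatesoft.svn.core.SVNNodeKind;
import org.tmatesoft.svn.core.SVNURL;
import org.tmatesoft.svn.core.io.SVNRepository;
import org.tmatesoft.svn.core.io.SVNRepositoryFactory;

public class LevelScan {

    /** Recursively lists directories down to depthLeft, one round trip each. */
    static void scan(SVNRepository repo, String dir, int depthLeft) throws SVNException {
        if (depthLeft == 0) {
            return;
        }
        // One network round trip per directory visited.
        Collection<?> entries = repo.getDir(dir, -1, null, (Collection<?>) null);
        for (Object o : entries) {
            SVNDirEntry entry = (SVNDirEntry) o;
            String path = dir.isEmpty() ? entry.getName() : dir + "/" + entry.getName();
            if (entry.getKind() == SVNNodeKind.FILE && entry.getName().equals("conf.xml")) {
                System.out.println("found " + path);  // hypothetical target file
            } else if (entry.getKind() == SVNNodeKind.DIR) {
                scan(repo, path, depthLeft - 1);
            }
        }
    }

    public static void main(String[] args) throws SVNException {
        SVNRepository repo = SVNRepositoryFactory.create(
                SVNURL.parseURIEncoded("http://host/repo"));  // hypothetical URL
        scan(repo, "", 3);
    }
}
```

With d directories at levels one and two, this costs 1 + d round trips even before fetching any file contents.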

Is there a way to recursively retrieve all files to a certain (finite)
depth, including content, in one go (i.e. avoiding multi-roundtrip latency)?
When looking for something like that, it seems that you can really only
choose between SVNDepth.IMMEDIATES and SVNDepth.INFINITY.

Alternatively, is there a way to bulk-retrieve a number of (potentially
existing) files in one go?

Thanks!
Henning


