Hi Cameron,
        Maybe I am confused, but I was actually asking about the push-pull
capabilities: does the crawler plug into the push-pull framework? (Sorry
about my ignorance here.) If push-pull supports scp, would you know the name
of the protocol transfer factory to use? I haven't found one.
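
For what it's worth, the closest thing I have found is the sftp factory
(org.apache.oodt.cas.pushpull.protocol.sftp.JschSftpProtocolFactory), and my
guess - possibly wrong, I am going from memory on the exact element names - is
that it gets wired up in ProtocolFactoryInfo.xml with something like the entry
below, but I see no scp equivalent:

    <protocolFactoryInfo>
        <protocol type="sftp">
            <protocolFactory
                class="org.apache.oodt.cas.pushpull.protocol.sftp.JschSftpProtocolFactory"/>
        </protocol>
    </protocolFactoryInfo>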
thanks a lot,
Luca

On Feb 28, 2012, at 8:40 AM, Cameron Goodale wrote:

> Luca,
> 
> I haven't tried this exact use case within Crawler, but Crawler does
> support scp, and I have used 'scp -r' to recursively download a folder and
> all of the content housed within it.  I imagine ftp has a similar recursive
> option as well.
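> For example (the host and paths are just placeholders), something along
> the lines of:
> 
>     scp -r user@remote.host:/path/to/remote/dir /local/staging/dir
> 
> will pull down the directory and everything underneath it in one shot.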
> 
> Maybe another, more Crawler-savvy dev can shed some light on the recursion
> use case when using Crawler.
> 
> -Cameron
> 
> P.S. When we get a final answer, let's add this to the Crawler User Guide
> Wiki as an example use case.  Glad you found the Crawler Wiki page useful.
> 
> On Tue, Feb 28, 2012 at 7:01 AM, Cinquini, Luca (3880) <
> luca.cinqu...@jpl.nasa.gov> wrote:
> 
>> Hi all,
>>       I have a quick question concerning the push-pull framework: is
>> there any way to transfer full directory trees, as opposed to single
>> files? And which of the currently implemented transfer protocols would
>> allow that? I haven't seen any examples of that, though I might have
>> missed it.
>> 
>> thanks a lot,
>> Luca
>> 
>> P.S.: Cameron, thanks for writing the push-pull user guide - it's great.
>> 
>> 
> 
> 
> -- 
> 
> Sent from a Tin Can attached to a String
