Hi, I have a few questions about data transfer and thought I would roll them into one email:
1) Local and remote data transfer with the same file manager

I see that when configuring a cas-crawler, one specifies the data transfer factory with --clientTransferer. However, in etc/filemgr.properties the data transfer factory is specified with filemgr.datatransfer.factory. Does this mean that if I specify a local transfer factory I cannot use a crawler with a remote data transferer? I want to cater for a situation where files could be ingested both locally and remotely through a single file manager. Is this possible? (There is a rough sketch of what I mean in the P.S. below.)

2) Copying an ingested product to a backup archive

For backup (and access) purposes, I want to ingest each product into an off-site archive (at our main engineering office) with its own separate catalogue. What is the recommended way of doing this? The way I currently do it is to replicate the files with rsync, but that leaves me to find a way to update the backup catalogue. I was wondering whether there is a neater (more OODT) solution. I was thinking of perhaps using the functionality described in OODT-84 (Ability for File Manager to stage an ingested Product to one of its clients) and then having a second crawler on the backup archive update its own catalogue (also sketched in the P.S.). I just thought I would ask in case anyone has tried something similar.

Cheers,
Tom
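P.S. To make question 1 concrete, this is roughly the split I have in mind. The factory class names and the crawler launcher options are my best recollection from the docs, and the host, port, and paths are placeholders, so please treat this as a sketch rather than a verified config:

    # etc/filemgr.properties -- server-side default transferer, used for local ingests
    # (assuming the stock LocalDataTransferFactory class name)
    filemgr.datatransfer.factory=org.apache.oodt.cas.filemgr.datatransfer.LocalDataTransferFactory

    # crawler running on a remote host -- per-client transferer for remote ingest
    ./crawler_launcher --operation --launchAutoCrawler \
        --filemgrUrl http://filemgr-host:9000 \
        --productPath /staging/products \
        --clientTransferer org.apache.oodt.cas.filemgr.datatransfer.RemoteDataTransferFactory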
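And for question 2, the rsync-plus-second-crawler idea would look something like this. Again, hostnames and paths are made up, and I'm assuming there is an in-place transfer factory so the already-replicated files are not copied a second time:

    # on the primary site: replicate the archive to the off-site host
    rsync -av /data/archive/ backup-host:/data/archive/

    # on backup-host: register the replicated files with the backup
    # archive's own file manager and catalogue
    ./crawler_launcher --operation --launchAutoCrawler \
        --filemgrUrl http://localhost:9000 \
        --productPath /data/archive \
        --clientTransferer org.apache.oodt.cas.filemgr.datatransfer.InPlaceDataTransferFactory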
