Craig Barratt wrote:
> Tristan Krakau writes:
>
>   
>> Due to the rsync-cygwin-ssh problem when backing up Windows clients
>> using 'rsync' as the transfer method, I tried to use 'rsyncd' through an
>> ssh tunnel instead (since the Windows client can only be accessed via ssh).
>>
>> I found other threads dealing with this topic (e.g.
>> http://sourceforge.net/mailarchive/message.php?msg_id=11482919), and all
>> the hints there seemed to work fine:
>>
>> - an ssh tunnel from the BackupPC server to the client is created
>> - rsync is redirected to localhost:port and this way connects to the
>> rsyncd shares on the client
>> - after backing up the host, the tunnel is killed again
>>
>> However, this only works when I manually set up the ssh tunnel, call
>> BackupPC_dump <host> and tear down the tunnel afterwards.
>>
>> But if I want the tunnel to be set up by
>>
>> $Conf{DumpPreUserCmd} = 'ssh -L 5009:localhost:873 -N -f
>> [EMAIL PROTECTED]';
>>
>> the tunnel is created, but the dump just waits: it simply does not
>> continue until the ssh tunnel is killed (e.g. from another shell). Then
>> the backup of course fails because the tunnel is no longer there and
>> rsyncd cannot be contacted.
>>
>> I also tried putting the ssh -L ... command in another script:
>>
>> host_prepare.sh:
>> ssh -L 5009:localhost:873 -N -f [EMAIL PROTECTED]
>> echo Tunnel to host was created
>> --
>> and it shows that the echo command, like any other command after the
>> ssh command, is still executed, which tells me that ssh is started in the
>> background, but the dump still pauses after the script has finished.
>>
>> Also, using & instead of the -f option with ssh has no effect.
>>
>> So my question is: How can I make the DumpPreUserCmd start the
>> ssh tunnel in the background and return, so the dump can begin while the
>> tunnel is up?
>>
>> I think it could have something to do with the way Perl executes
>> commands; maybe it always waits for all child processes to end?
>>
>> I really hope someone can give me a hint on how to solve this!
>>     
>
> Yes, it must have to do with how Perl executes these commands.
> I'm not near a Linux machine right now, so I can't test this.
>
> First, I assume you tried "&" inside host_prepare.sh, since 
> $Conf{DumpPreUserCmd} is exec'ed directly rather than via
> a shell.
>
> The Perl code uses the open(F, "-|") form of fork to run your command.
> That pipes stdout of the child to Perl.  So the issue could be
> that since stdout of the child (and its children) is still open,
> Perl continues to wait.
>
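That explains it. A stripped-down sketch of the open("-|") pattern (not the
actual BackupPC code, just the same idea) blocks the same way, because the
read below only finishes once every process holding the child's stdout,
including a backgrounded ssh that inherited it, has closed it:

    # parent reads the child's stdout via the "-|" fork
    my $pid = open(my $fh, "-|");
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # child: stdout is now the write end of the pipe; anything
        # started here (a backgrounded ssh included) inherits it
        exec("/bin/sh", "host_prepare.sh") or exit 1;
    }
    # parent: blocks until all writers of the pipe are gone, which is
    # why redirecting ssh's output (below) lets the dump continue
    print "pre-cmd output: $_" while <$fh>;
    close($fh);
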
> Inside the shell (assuming I've got the /bin/sh syntax right),
> I recommend trying to redirect ssh's stdout and stderr to
> /dev/null, e.g.:
>
>     ssh -L 5009:localhost:873 -N -f [EMAIL PROTECTED] 1>/dev/null 2>/dev/null
>
> or
>
>     ssh -L 5009:localhost:873 -N [EMAIL PROTECTED] 1>/dev/null 2>/dev/null &
>
>   
YES! That is it! Now Perl continues after the shell script returns and
can connect through the created tunnel.
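
For the archives, the setup that works for me now looks roughly like this
(the teardown command is just how I happen to clean up the tunnel, and the
paths are placeholders; check the option names against your config.pl):

    # per-host BackupPC config (Perl syntax)
    $Conf{XferMethod}       = 'rsyncd';
    $Conf{ClientNameAlias}  = 'localhost';  # connect to the local tunnel end
    $Conf{RsyncdClientPort} = 5009;         # forwarded to port 873 on the client
    $Conf{DumpPreUserCmd}   = '/bin/sh /path/to/host_prepare.sh';
    $Conf{DumpPostUserCmd}  = '/bin/sh /path/to/host_teardown.sh';

    # host_prepare.sh: the redirections are the important part
    ssh -L 5009:localhost:873 -N -f [EMAIL PROTECTED] 1>/dev/null 2>/dev/null

    # host_teardown.sh: kill the forwarding ssh again
    pkill -f 'ssh -L 5009:localhost:873'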

Thank you very much & best regards,

Tristan


