On Mon, Mar 25, 2002 at 03:44:16PM -0800, Wolfe, Gordon W wrote:
> When I get some service or updates done to my test Linux server, that's
> actually the easy part.  Then I have to move it around to more than a dozen
> other Linux servers, logging on to each one in turn and doing an FTP "get"
> for each file on each server.  As I get more servers running, this problem
> will only get worse.
>
> I'd like to automate this, with a single script to consult a list of servers
> and a list of files to be placed on each server.  Then with a single command
> from my central distribution server, I could automatically update files on
> all the other servers.  This presumes a service userid with the same
> password on all servers, changed frequently.
>
> Is there anything like "wget", except doing "puts" instead of "gets"?  Is
> there anything else I might use?  I really want to avoid having to log on
> to each and every server.  Bonus if it would preserve ownership and
> permissions.

Well, this might be the solution for you, I think.
Use 'lftp' and a .netrc file.

Here's an ugly example:

~/.netrc (permissions: 0400):
machine sml01 login opc password opc
machine sml02 login opc password opc
machine sml03 login opc password opc
machine sml04 login opc password opc

putscript.sh:
#!/bin/bash
# Push the local staging directory for each server into its remote
# /home/www directory.  Credentials come from ~/.netrc; 'mirror -R'
# reverse-mirrors (uploads) the local tree to the remote side.

lftp -u opc sml01 -e 'lcd /home/sml01; cd /home/www; mirror -R; exit'
lftp -u opc sml02 -e 'lcd /home/sml02; cd /home/www; mirror -R; exit'
lftp -u opc sml03 -e 'lcd /home/sml03; cd /home/www; mirror -R; exit'
lftp -u opc sml04 -e 'lcd /home/sml04; cd /home/www; mirror -R; exit'
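
If the server list keeps growing, the same idea can be driven by a loop
that reads a plain host list, which is closer to the "single script
consulting a list of servers" you described.  A rough sketch only -- the
file name servers.txt and the /home/<host> staging layout are just
assumptions to match the example above:

pushall.sh:
#!/bin/bash
# Push the staging directory for every host listed in servers.txt
# (one hostname per line); lftp picks up credentials from ~/.netrc.

while read -r host; do
    [ -z "$host" ] && continue          # skip blank lines
    echo "Updating $host ..."
    lftp -u opc "$host" \
         -e "lcd /home/$host; cd /home/www; mirror -R; exit"
done < servers.txt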

How about this?
Regards,
Jae-hwa
--
Jae-hwa Park <[EMAIL PROTECTED]>
IBM Korea, Inc.                     Sa-Rang means LOVE!
For more information on me, visit http://php.sarang.net
