> When I get some service or updates done to my test Linux server, that's
> actually the easy part.  Then I have to move it around to more than a dozen
> other Linux servers, logging on to each one in turn and doing an FTP "get"
> for each file on each server.  As I get more servers running, this problem
> will only get worse.
>
> I'd like to automate this, with a single script to consult a list of servers
> and a list of files to be placed on each server.  Then with a single command
> from my central distribution server, I could automatically update files on
> all the other servers.  This presumes a service userid with the same
> password on all servers, changed frequently.
>
> Is there anything like "wget", except doing "puts" instead of "gets"?  Is
> there anything else I might use?  I really want to avoid having to log
> on to each and every server.  Bonus if it would preserve ownership and
> permissions.


Probably the easiest is standard ftp. Configure ~/.netrc and then it's
as easy as

FILE=whatsit
for h in a b c d; do echo "put $FILE" | ftp "$h"; done
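
For reference, a minimal ~/.netrc entry looks something like the
following (the hostnames, userid and password are just placeholders for
your own service account), and ftp insists the file be readable only by
you:

machine servera login whatsit password itssecret
machine serverb login whatsit password itssecret

chmod 600 ~/.netrc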

or
for h in a b c d; do echo -e "put $FILE\nchmod 400 $FILE" | ftp "$h"; done

to set its permissions. Ownership will be that of the account on the
remote system UNLESS you fiddle with directory permissions (setting the
setgid bit so new files inherit the directory's group):

chmod g+s parentdirectory

in which case you may have some other "discussion" about permissions.
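
To match the original goal of consulting a list of servers and a list
of files, the same idea drops into a small script. A minimal sketch,
assuming one hostname per line in servers.txt and one path per line in
files.txt (both file names are just examples), still relying on
~/.netrc for the logins:

#!/bin/sh
# Push every file listed in files.txt to every host listed in servers.txt.
while read h; do
    while read f; do
        echo "put $f" | ftp "$h"
    done < files.txt
done < servers.txt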

Without configuring ~/.netrc you can expand on the second example to
push the whole stream of commands into ftp, or use a "here" document:
ftp -n $h <<EOF
user whatsit
...
etc
EOF
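
Filled out, that might look like the following sketch (the userid,
password and file name are placeholders; note the password then lives
in the script, so guard its permissions):

ftp -n "$h" <<EOF
user whatsit itspassword
binary
put $FILE
chmod 400 $FILE
bye
EOF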


--
Cheers
John Summerfield

Microsoft's most solid OS: http://www.geocities.com/rcwoolley/

Note: mail delivered to me is deemed to be intended for me, for my
disposition.

