We are using Ant and its ftp task.  It uses an XML control file to handle the
processing, and it can handle different user IDs and other options per FTP
site.  You can find it at http://jakarta.apache.org/ant/index.html
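
A minimal build file might look like the sketch below (the server name,
user ID, and directories are placeholders you would replace; note the ftp
task is one of Ant's optional tasks and needs an extra FTP library jar on
the classpath):

```xml
<project name="deploy" default="push">
  <!-- Push everything under dist/ to one server.
       Duplicate or parameterize this target per server. -->
  <target name="push">
    <ftp server="server1.example.com"
         userid="svcuser"
         password="secret"
         remotedir="/opt/app">
      <fileset dir="dist">
        <include name="**/*"/>
      </fileset>
    </ftp>
  </target>
</project>
```

Run it with "ant push" from the directory containing build.xml.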

On Monday 25 March 2002 17:44, you wrote:
> When I get some service or updates done to my test Linux server, that's
> actually the easy part.  Then I have to move it around to more than a dozen
> other Linux servers, logging on to each one in turn and doing an FTP "get"
> for each file on each server.  As I get more servers running, this problem
> will only get worse.
>
> I'd like to automate this, with a single script to consult a list of
> servers and a list of files to be placed on each server.  Then with a
> single command from my central distribution server, I could automatically
> update files on all the other servers.  This presumes a service userid with
> the same password on all servers, changed frequently.
>
> Is there anything like "wget", except doing "puts" instead of "gets"?  Is
> there anything else I might use?  I really want to avoid logging on to
> each and every server.  Bonus if it would preserve ownership and
> permissions.
>
>
>
> "You do not need a parachute to skydive.  You only need a parachute to
> skydive twice."  -Motto of the Darwin Society
> Gordon W. Wolfe, Ph.D.  (425) 865-5940
> VM Technical Services, The Boeing Company