> When I get some service or updates done to my test Linux server, that's
> actually the easy part. Then I have to move it around to more than a dozen
> other Linux servers, logging on to each one in turn and doing an FTP "get"
> for each file on each server. As I get more servers running, this problem
> will only get worse.
>
> I'd like to automate this, with a single script to consult a list of servers
> and a list of files to be placed on each server. Then with a single command
> from my central distribution server, I could automatically update files on
> all the other servers. This presumes a service userid with the same
> password on all servers, changed frequently.
>
> Is there anything like "wget", except doing "puts" instead of "gets"? Is
> there anything else I might use? I really want to avoid having to log on
> to each and every server. Bonus if it would preserve ownership and
> permissions.
If you don't like my previous suggestion, look at perl-LWP. Or scp, part of
the ssh package. On reflection, it's easier than ftp ;-)

[summer@numbat summer]$ scp -p junk root@dugite:/root/JUNK
junk              100% |**************************************|  45084  00:00
[summer@numbat summer]$

It preserved the timestamp, not ownership.

--
Cheers
John Summerfield

Microsoft's most solid OS: http://www.geocities.com/rcwoolley/
Note: mail delivered to me is deemed to be intended for me, for my disposition.

==============================
If you don't like being told you're wrong, be right!
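
For the kind of central push the original poster describes, a small shell loop
around scp is one way to string it together. This is only a sketch: the file
names servers.txt and files.txt and the account name serviceuser are assumptions,
not anything from this thread, and it presumes the service userid can log in
without a password prompt (ssh keys), otherwise scp will stop and ask for each copy.

    #!/bin/sh
    # Push a list of files to a list of servers with scp -p.
    # Assumed inputs (not from the thread):
    #   servers.txt - one hostname per line
    #   files.txt   - one absolute path per line, copied to the same path remotely

    while read -r host; do
        while read -r file; do
            # </dev/null keeps scp from eating lines of files.txt if it reads stdin
            scp -p "$file" "serviceuser@$host:$file" </dev/null ||
                echo "FAILED: $file -> $host" >&2
        done < files.txt
    done < servers.txt

As with the scp example above, -p carries timestamps and modes but not ownership;
scp sets the owner to whatever account you connect as on the remote side.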
