My company currently has multiple load-balanced web servers. Each time we
deploy code, we have to manually FTP it to each server. We'd love to be able
to upload (or SVN) code to one location and have an automated process to
replicate the code to the other servers.
The last time I worked on a site with multiple servers, they had a simple
scheduled task set up to run every half hour or so. It looked for files in
an upload directory and, if it found anything, copied the files over to the
production servers, creating new folders if necessary, then deleting the
files from the upload directory.
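From memory it amounted to something like this (server names and paths
invented, not the actual script):

REM sync-upload.bat - copy anything in the upload folder out to each
REM production server, then clear the upload folder
robocopy D:\upload \\WEB1\D$\wwwroot /E
robocopy D:\upload \\WEB2\D$\wwwroot /E
REM /MOVE deletes the source files and folders after the last copy
robocopy D:\upload \\WEB3\D$\wwwroot /E /MOVE

REM run it every 30 minutes via the task scheduler
schtasks /create /tn "UploadSync" /tr C:\scripts\sync-upload.bat /sc minute /mo 30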
Paul
Andy,
We use SVN to deploy code to a single server, and then we use DFS to
automatically propagate the changes. Once you set it up, it doesn't require
any intervention, unless it breaks, which has happened to me a few times to
date. It is, however, very useful in that once you push the code to one
server, the changes show up on all of them.
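Roughly, the moving parts look like this (the hook and paths below are just
an example, not our exact setup):

REM post-commit.bat on the SVN server - update the working copy that
REM lives in the DFS-replicated folder; DFS then fans the changed
REM files out to the other web servers
svn update D:\DFSRoot\webroot --non-interactive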
If you use SVN + DFS, you will only pull the files you need from the remote
server (SVN) once, and then DFS will propagate only the changes between
your local servers. Doing svn update on every server will load the changes
once for every server from the remote location (unless your SVN server is
on the local network).
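For example (made-up numbers): a 20MB changeset pulled by five servers
straight from the remote repository costs 100MB across the WAN; with
SVN + DFS it crosses the WAN once (20MB) and the other four copies travel
over the local network.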
You mentioned a program called Robocopy in this post. Can you
provide some additional information?
Robocopy is in the Windows Resource Kit AFAIK. It's a pretty powerful
command-line tool that you can script to keep folders in sync.
As an example,
robocopy C:\source \\server\C$\source
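In practice you'd add a few switches to keep a whole tree in sync;
something like this (the log path is just an example):

REM /MIR mirrors the tree (and deletes files that no longer exist in
REM the source, so point it carefully), /Z is restartable mode,
REM /R and /W tune retries, /LOG+ appends to a log file
robocopy C:\source \\server\C$\source /MIR /Z /R:3 /W:5 /LOG+:C:\logs\robocopy.log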
> We use SVN to deploy code to a single server, and then we use DFS to
> automatically propagate the changes.
That of course is the better option if your environment supports it :)
Paul
We use UNC shares in our current environment, but now we are integrating
Linux Apache boxes on the front end, so in the interim we have a 1TB file
server (750Mb RAID 5) that does NFS and CIFS shares. We are moving to a
NetApp filer in a bit, so we just picked up a SNAP Server to hold us over.
Eric
And as long as you have Windows and your servers are members of the domain,
your environment should support it. I'm still looking for a similar thing
for Linux.
Russ
I originally set up everything using UNC shares with DFS, but I felt it was
a bit slow. I've since ended up using DFS just to replicate the data, and I
still access all the data locally.
We don't have a huge amount of data though... Lots of files, but total
maybe 5 GB.
Russ
That SNAP server was the similar solution. Windows boxes can access it as a
mapped drive, or in our case the SNAP server is in the domain so it is a UNC
share. The Linux boxes access it via NFS, so we have one repository for all
our files. Now if I can get Serena Mover set up and running, it would...
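For reference, both sides see the same volume; something like this (server
and share names made up):

REM Windows - map the CIFS share from the SNAP server
net use W: \\SNAPSERVER\webfiles

# Linux - mount the same volume over NFS
mount -t nfs snapserver:/shares/webfiles /mnt/webfiles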