Hi Luke,
You may want to look at my localsvn scm here: http://blog.wolfman.com/articles/2006/12/06/a-capistrano-scm-module-for-local-svn-access
and slip your rsync method in there, replacing the put that I do. This
would help people with limited upload bandwidth.
Personally, just sending the entire tar.gz works fast enough for me at
the moment.
On Jan 27, 10:58 am, "luke" <[EMAIL PROTECTED]> wrote:
> Hi Jacob,
>
> I have been using this method for a little while. I just hacked at
> cache_svn.rb and created cache_rsync.rb.
>
> It works like this:
>
> 1. local svn cache is updated using svn up
> 2. rsync sends the changes (minus .svn folders and deploy.rb) to the
> server's cache folder
> 3. simple copy from the server cache to the release folder
>
> This has the following advantages:
>
> 1. It's much faster if your svn server is running on your local
> network.
> 2. No need for subversion to be installed on the live server.
> 3. You can exclude some files from being sent to the live server,
> e.g. don't send deploy.rb or your svn password.
> 4. Your svn server can be behind a firewall.
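The three-step flow above can be sketched as plain commands. This is a minimal sketch, not Luke's actual cache_rsync.rb; the paths, user, and host are hypothetical placeholders:

```ruby
# A minimal sketch of the three-step flow described above.
# All paths, the user, and the host are hypothetical placeholders.
local_cache  = "/home/luke/caches/myapp"           # local svn working copy
remote_cache = "/u/apps/myapp/shared/cached-copy"  # cache folder on each server
release_path = "/u/apps/myapp/releases/20070127"   # new release directory

# 1. Refresh the local cache from Subversion.
update_cmd = "svn up #{local_cache}"

# 2. Push only the deltas; --cvs-exclude skips .svn metadata, and
#    --exclude keeps deploy.rb (with the svn password) off the server.
sync_cmd = "rsync --archive --compress --delete-after --cvs-exclude " \
           "--exclude 'config/deploy*' -e ssh " \
           "#{local_cache}/ deploy@app1.example.com:#{remote_cache}/"

# 3. Materialise the release from the server-side cache.
copy_cmd = "cp -r #{remote_cache} #{release_path}"
```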
>
> My code could really do with some improvements, and I think the SCM
> plugin sounds great. Is it in svn?
> I would be happy to help add rsync support once the plugin is out.
>
> - Luke
>
> P.S. - Here is the important bit of code from my hacked
> cache_rsync.rb:
>
> cmd = 'rsync '
> %w{
>   archive
>   compress
>   copy-links
>   cvs-exclude
>   delete-after
>   no-blocking-io
>   stats
> }.each { |opt| cmd << "--#{opt} " }
> cmd << "--exclude \"config/deploy*\" -e ssh #{local_rsync_cache}/ "
>
> puts "Updating each server"
>
> username = user || ENV['USER']
>
> current_task.servers.each do |server|
>   puts "Syncing deployment cache for #{server}"
>   puts `#{cmd} #{username}@#{server}:#{remote_rsync_cache}/`
> end
>
> run "cp -r #{remote_rsync_cache} #{release_path}"
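The option-building idiom in the snippet above can be packaged as a small self-contained method. The method name and the sample arguments below are illustrative only; they are not part of Luke's code or of any Capistrano API:

```ruby
# Mirrors the %w{...}.each idiom in the snippet above: turn a list of
# long options into an rsync command line. The method name and the
# sample arguments are illustrative placeholders.
def build_rsync_cmd(local_cache, user, server, remote_cache)
  cmd = "rsync "
  %w[archive compress copy-links cvs-exclude
     delete-after no-blocking-io stats].each { |opt| cmd << "--#{opt} " }
  cmd << "--exclude \"config/deploy*\" -e ssh " \
         "#{local_cache}/ #{user}@#{server}:#{remote_cache}/"
end

cmd = build_rsync_cmd("/tmp/cache", "deploy", "app1.example.com",
                      "/u/apps/shared/cached-copy")
```

Called once per server from `current_task.servers`, exactly as the loop above does; the only difference is that the destination is built inside the method rather than appended at call time.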
>
> On Jan 24, 5:11 am, Jacob Atzen <[EMAIL PROTECTED]> wrote:
>
> > On Tue, Jan 23, 2007 at 01:54:36PM -0700, Jamis Buck wrote:
> > > > Might I suggest a combination of cached_repository and copy_*? Perhaps
> > > > implemented over rsync?
>
> > > > Something along the lines of:
> > > > - Checkout / export into local copy - to avoid having remote repos access
> > > > - Rsync local copy to cached repository (or cached export) - to minimize bandwidth usage and deployment time
> > > > - Copy cached repository to new release
>
> > > > Does this sound like a good idea to anyone but me?
>
> > > If you're copying from the localhost to each server anyway, why not
> > > just copy directly to the new release directory? Perhaps I'm
> > > misunderstanding the scenario. At any rate, I think you'll find new
> > > deployment strategies ridiculously easy to write with the new system.
> > > The cached_repository strategy, for instance, is only 48 lines of
> > > code, including blank lines and comments.
>
> > I'm impatient: when I hit deploy, I want my new version running ASAP. As
> > such I would prefer having a cached copy on each server to which I can
> > push deltas, instead of having to push the whole app, including the
> > frozen Rails sources.
>
> > I'll take a hack at it when you publish the new code.
>
> > --
> > Cheers,
> > - Jacob Atzen
--~--~---------~--~----~------------~-------~--~----~
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/capistrano
-~----------~----~----~----~------~----~------~--~---