> From: Ed Pataky <ed.pat...@gmail.com>
> One thing I am concerned about is that it seems like there is no protection
> from someone going in via FTP and changing files. I assumed that version
> control meant that the files are protected. Why doesn't git protect the
> files? What I mean is, this seems to only work if everyone does it
> correctly, but if someone simply goes in by FTP and modifies files, then
> git has no "control" over that. Is this correct?
If you want that sort of protection, the usual technique is:
- Adjust how the developers access the server so that the *only* thing
they can do is push new commits into the master repository.
- Have a process that the developers can trigger when they push a
new commit into the master repository that: (1) pulls the new
commits from the master repository into a subordinate repository
on the server; (2) checks out the newest commit from the subordinate
repository into the working directory; (3) restarts the web server
in whatever way is necessary for the new files to take effect.
Caution: When you do this, you will discover that your developers
have been "going in via FTP and changing files"... and that there are
some practical tasks which this is the *only* way they know how to
accomplish. Probably testing and debugging tasks. So suddenly your
existing workflow will break, and you will need to be prepared to
figure out better ways to do these things, possibly in a hurry.
You received this message because you are subscribed to the Google Groups "Git
for human beings" group.