Good points ;)

Now for screwing up

On 9/23/06, Stewart Stremler <[EMAIL PROTECTED]> wrote:

begin  quoting Neil Schneider as of Sat, Sep 23, 2006 at 10:03:56AM -0700:
>
> Jason Kraus wrote:
> > I might be overcomplicating things, but in addition to syncing when you
> > log in, why not create a daemon that would get a signal from svn to
> > sync? Basically, when you commit from one machine, other machines
> > currently online will be notified of the change and either notify the
> > user or sync. If a computer is offline and then comes online, it will
> > then sync. The other thing is to have the same daemon listen for changes
> > in a directory structure and commit changes automatically. I am a
> > passively lazy person and I really don't feel like committing every
> > time I change something... if I have to spend 30 minutes to code
> > something that will save me 20 minutes over the course of time, I'll do
> > it. I wonder if a project like this already exists. Like an svnfs? Just
> > my 2 cents.

Gah! ONE thought per paragraph, please!

And why not just use NFS or another one of those network filesystems?


As for using NFS, the problem is that if I take my laptop out of the network
and work on some documents, then bring it back in, it would mean some
extra work. Yes, NFS would be ideal if none of the computers ever left the
network.

> Sounds like a botnet to me. :-) Seriously, the security implications
> of this kind of setup are too risky to contemplate.

And reliability. If you screw up on one machine, all your machines are
going to be screwed up.


That's what revision control is for. If you screw up, you roll it back. This
is mainly why I would like an svn filesystem. Even if none of my computers
leave the network, it is nice to be able to roll back documents.

Security, too... if you forget to lock your terminal on your testing
machine, I can quickly type 'echo "logout" >! .login ; clear' and you'll
have some serious trouble (and endless fun for the rest of us -- and,
since you're using version control, harmless fun at that!) for a bit.

>                                                     It's ok for there
> to be a public server that you "pull" from when logging in, and "push"
> to when you logout. As long as you control your login, it's not a
> violation of the network use policy and you also control the server.

I'd even go so far as to simply recommend that you pull on login and
push on logout.
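
In case it helps, here's a minimal dry-run sketch of that idea. The working-copy
path and commit message are made up for the example, and the functions just
print the svn commands they would run rather than executing them:

```shell
#!/bin/sh
# Hypothetical working copy -- adjust the path to your own checkout.
WC="$HOME/docs"

# Pull on login: bring the working copy up to date with the server.
sync_on_login() {
    echo "svn update $WC"                         # dry run: print, don't execute
}

# Push on logout: commit whatever changed during the session.
sync_on_logout() {
    echo "svn commit -m 'autosync on logout' $WC" # dry run: print, don't execute
}

sync_on_login
sync_on_logout
```

Hook the first function into your shell's login script and the second into
its logout script, and drop the echoes once you trust it.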

> What's not good is having code execute automatically from a public
> server to workstations or servers on private networks, most behind
> firewalls. It would essentially create a tunnel into the network.

Whenever "automatically" enters a conversation regarding computers,
the little "check security implications" light should turn on.


I'm a bit confused here. Where does executing code from a public server come
in? I was talking about the commands being predefined on the client
machine. At most, the server would send a signal telling the client that
something in the repository has changed. So unless you are automatically
executing code that is in the repository (yeah, that would be stupid), I fail
to see where executing arbitrary commands automatically comes in.
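
To make that concrete, here's a rough sketch of what I mean, using a named
pipe to stand in for the network connection. The pipe path and the "changed"
message are invented for the example, and the update command is only echoed,
not run:

```shell
#!/bin/sh
# Hypothetical sketch: the server only ever sends a one-line notification;
# what runs in response is hard-coded on the client side.
PIPE="/tmp/svn-notify.$$"
mkfifo "$PIPE"

# Client side: block until a notification arrives, then run the
# predefined command (echoed here rather than executed).
handle_notification() {
    read msg < "$PIPE"
    case "$msg" in
        changed) echo "svn update $HOME/docs" ;;
        *)       echo "ignoring unknown message: $msg" ;;
    esac
}

# Stand-in for the server's post-commit hook firing a notification.
echo changed > "$PIPE" &
handle_notification
rm -f "$PIPE"
```

Nothing in the repository itself ever gets executed; the only thing crossing
the wire is the one-line notification.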

[snip]
> All these things are allowed in a Windows world. In fact, Microsoft
> adds these "features" to "improve usability". Any wonder their
> platform is constantly and consistently compromised?

When it all goes right, it often does "improve usability". But the
general trend is for things _not_ to always go right.  One should
always make the assumption that *something* is going to go wrong,
sooner or later, somewhere, and that you'll be the one to clean
up the mess once the dust settles.

--
_ |\_
\|


--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list

