Most of the previous requests echo things I want...

but I find that the PROTECT function/mechanism should be given much more guts.

This is important if we are to create scripts that can encapsulate the external tools 
we are testing, even when they are linked into the code.

I'd like PROTECT to have a /disk refinement so that ALL disk writes go to a RAM cache.

If this were enabled, I'd also be less nervous testing critical file routines 
like recursive file handling...

The idea is that any write or deletion would actually be done in a RAM mirror of the 
disk.  Untampered dirs/files would still be loaded from disk.  Any subsequent read 
which maps to an area of the disk that was written would be filtered to use the 
version in RAM instead.  Deleted directories would no longer be visible; created 
directories could have virtual files in them.  I know some will complain that it takes 
a lot of RAM, but many of us HAVE enough RAM for this feature to be useful.
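To make the mechanism concrete, here is a minimal sketch (in Python, purely illustrative, not REBOL and not any actual PROTECT implementation) of the copy-on-write overlay described above: writes and deletions land in a RAM layer, and reads fall through to the real disk only for untouched paths.  The class and method names are my own invention.

```python
class OverlayFS:
    """Hypothetical RAM mirror: disk is read-only, changes live in memory."""

    def __init__(self):
        self.files = {}       # path -> bytes written during the session
        self.deleted = set()  # tombstones for paths deleted during the session

    def write(self, path, data):
        # The real disk is never touched; the write goes to the RAM layer.
        self.files[path] = data
        self.deleted.discard(path)

    def delete(self, path):
        # Record a tombstone so the on-disk original becomes invisible.
        self.files.pop(path, None)
        self.deleted.add(path)

    def read(self, path):
        if path in self.deleted:
            raise FileNotFoundError(path)
        if path in self.files:            # tampered: serve the RAM version
            return self.files[path]
        with open(path, "rb") as f:       # untampered: load from disk
            return f.read()
```

A real implementation would also have to virtualize directory listings, so that deleted directories disappear and newly created ones show their virtual files.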

This would also be very useful for code which is run live by network-connected 
clients.  Any hacker would think he is damaging your stuff, when in fact he is just 
playing in his own little rubber-padded chamber.

I'd also like a /protect refinement added to MAKE, DO, and LOAD, to make 
critical code objects immutable once loaded or run.  This could stop malicious code 
which attempts to tamper with setups, and shield sensitive code which works on 
sensitive data...

When testing other people's code you never know what you are loading; not everyone is 
skilled enough to feel safe even after checking the script's source...

For example, I'd place my user.r setup in another file and execute the following line 
in the user.r file:

 do/protect protected-user.r
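The effect I'm after could be sketched like this (again in Python, purely for illustration; `load_protected` is a made-up name, not a REBOL function): once the setup is loaded, the resulting values are frozen so later code cannot tamper with them.

```python
from types import MappingProxyType

def load_protected(settings):
    """Return a read-only view of loaded settings; writes raise TypeError."""
    # MappingProxyType wraps a private copy, so neither the proxy nor the
    # caller's original dict can be used to mutate the protected values.
    return MappingProxyType(dict(settings))

setup = load_protected({"email": "user@example.com", "editor": "vi"})
# Attempting  setup["editor"] = "emacs"  raises TypeError.
```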

Does this make sense to any of you?

Is there already a way to do so?


-MAx
