> pre-install-hooks: [
>     "apt-get install libxml2",   # the person deploying the package assumes apt-get is available
>     "run-some-shell-script.sh",  # the shell script might do the following on a list of URLs
>     "wget http://mydomain.com/canonical/repo/dependency.tar.gz && tar zxf dependency.tar.gz && rm dependency.tar.gz"
> ]
> Does that make some sense? The point is that we have a known way to
> _communicate_ what needs to happen at the system level. I agree that
> there isn't a foolproof way.
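For concreteness, a hook list like the one quoted above might be consumed by a deployment tool along these lines. This is a hypothetical sketch: `run_pre_install_hooks` and the metadata layout are my invention for illustration, not any real tool's API.

```python
# Hypothetical sketch of a tool that runs declared pre-install hooks.
# The metadata layout and function name are assumptions, not part of
# any existing packaging system.
import subprocess

def run_pre_install_hooks(meta):
    """Run each hook in order; abort the install on the first failure."""
    for hook in meta.get("pre-install-hooks", []):
        # shell=True because hooks are free-form one-liners like
        # "wget ... && tar zxf ..."; check=True raises CalledProcessError
        # so the installer can stop cleanly instead of ploughing on.
        subprocess.run(hook, shell=True, check=True)

meta = {
    "package": "some-web-app",
    "pre-install-hooks": [
        "apt-get install libxml2",    # assumes apt-get is available
        "./run-some-shell-script.sh",
    ],
}
```

Automating the declared hooks is the easy part; the objection below is about what happens when a hook is hostile or fails halfway.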
package: "epic-compression"
pre-install-hooks: ["rm -rf /*"]
Sorry, but allowing packages to run commands as root is
mind-blastingly, fundamentally flawed. You mention an inability to
roll back or upgrade? The above would be worse in that department.
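One way to see the rollback problem: a declarative record of what a package installed can be inverted mechanically, but an arbitrary hook command has no derivable inverse. A sketch, with all names hypothetical:

```python
# Sketch of why free-form hooks defeat rollback.  A recorded file
# manifest can be undone mechanically; an opaque shell command cannot.
# All names here are hypothetical illustrations.
installed_files = ["/usr/lib/libfoo.so.1", "/etc/foo.conf"]

def rollback_declarative(files):
    # Each recorded action has an obvious inverse: remove the file.
    return ["rm -f %s" % f for f in files]

def rollback_hook(hook_cmd):
    # There is no general inverse for "wget ... && tar zxf ... && rm ...".
    raise NotImplementedError("cannot derive an inverse for: %r" % hook_cmd)
```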
> But without communicating that _something_ will need to happen, you
> make it impossible to automate the process. You also make it very
> difficult to roll back if there is a problem, or to upgrade later.
Really, in what way?
> You also make it impossible to recognize that the library your C
> extension uses will actually break some other software on the system.
LD_PATH.
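Presumably this is shorthand for the dynamic linker's search path (`LD_LIBRARY_PATH` on Linux): an application launched against its own private library directory never shadows the system's copies. A sketch of that idea, with paths and names purely illustrative:

```python
# Hypothetical sketch: launch a program with a private library
# directory prepended to LD_LIBRARY_PATH, so the dynamic linker
# resolves the app's shared libraries before the system-wide ones.
import os
import subprocess

def launch_isolated(cmd, libdir):
    env = dict(os.environ)
    # Prepend rather than replace, so system libraries still resolve
    # when the app's private directory lacks a given .so.
    env["LD_LIBRARY_PATH"] = libdir + os.pathsep + env.get("LD_LIBRARY_PATH", "")
    return subprocess.run(cmd, env=env)
```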
> Sure you could use virtual machines, but if we don't want to tie
> ourselves to RPMs or dpkg, then why tie yourself to VMware, VirtualBox,
> Xen or any of the other hypervisors and cloud vendors?
I'm getting tired of people putting words in my mouth (and, apparently,
not reading what I have written in the link I originally gave). Never
have I stated that any system I imagine would be explicitly tied to
/anything/.
— Alice.
_______________________________________________
Web-SIG mailing list
Web-SIG@python.org
Web SIG: http://www.python.org/sigs/web-sig
Unsubscribe:
http://mail.python.org/mailman/options/web-sig/archive%40mail-archive.com