On 31.01.2013 14:47, Ralf Mardorf wrote:
On Thu, 2013-01-31 at 13:29 +0100, berenger.mo...@neutralite.org wrote:
_ easier to remove
No, self-compiled is as easy to remove as a package
Of course, there are the "make uninstall" & co. commands, but can you use
them from a centralized package manager with a GUI, like aptitude or
synaptic? I do not think so. I think you need to open a terminal in the
directory containing the right Makefile and run the command there.
I know that running commands is easy, at least for me, but not for many
users.
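To make this concrete, here is a small self-contained sketch with a
throwaway Makefile (all names and paths are made up for illustration):
"make uninstall" only works while you still have a build tree whose
Makefile actually provides an uninstall target.

```shell
# Demonstration with a throwaway Makefile (hypothetical names):
# "make uninstall" needs the original build tree and an uninstall target.
set -e
workdir=$(mktemp -d)
prefix="$workdir/usr/local"
cd "$workdir"
# Recipe lines in make must start with a tab, hence the \t in printf.
printf 'install:\n\tmkdir -p $(PREFIX)/bin\n\ttouch $(PREFIX)/bin/hello\nuninstall:\n\trm -f $(PREFIX)/bin/hello\n' > Makefile
make install PREFIX="$prefix"      # what "make install" typically does
test -f "$prefix/bin/hello"        # the file is now outside dpkg's view
make uninstall PREFIX="$prefix"    # only possible while this tree exists
test ! -e "$prefix/bin/hello"
echo "uninstall succeeded"
```

Delete that directory and the package manager has no record of what was
installed where, which is the usual objection to plain "make install".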
_ easier to maintain
No, simply build a package
But in this situation, you are not using the usual "./configure && make
&& make install"; you are adding at least one step ;)
Just joking, of course. You are right, I had forgotten that option.
But when the option to build a package exists, I have noticed that often
there is already a package too, and that it is easier to simply install it.
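For reference, building a trivial binary package by hand is not much work
either. This is a hypothetical sketch with dpkg-deb (the package name,
file, and maintainer are made up); tools like checkinstall automate the
same idea on top of "make install":

```shell
# Hypothetical sketch: wrap a locally built file into a minimal .deb,
# so it can later be removed through the package manager (dpkg -r).
set -e
pkgdir="$(mktemp -d)/hello-local"
mkdir -p "$pkgdir/DEBIAN" "$pkgdir/usr/local/bin"
printf '#!/bin/sh\necho hello\n' > "$pkgdir/usr/local/bin/hello"
chmod 755 "$pkgdir/usr/local/bin/hello"
cat > "$pkgdir/DEBIAN/control" <<'EOF'
Package: hello-local
Version: 1.0
Architecture: all
Maintainer: Nobody <nobody@example.org>
Description: locally built example package
EOF
dpkg-deb --build "$pkgdir" hello-local_1.0_all.deb
# sudo dpkg -i hello-local_1.0_all.deb  # install like any other package
# sudo dpkg -r hello-local              # and remove it the same way
```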
_ less dependencies (no need to install all *-dev packages of their
dependencies, and their own dependencies, same for makefile
The dependencies of the built package and of the directly installed
software are the same; it is only for compiling that you need the headers.
Not everyone uses autotools (and thus configure scripts), so you sometimes
have to install Makefile generators such as cmake.
I do not often compile external stuff myself, but so far I think I have
seen cmake more often than configure. And, to be honest, I like that,
because it is far easier for the user (colored logs, easy-to-read messages
about missing dependencies) and for developers (the CMakeLists.txt syntax
is really easy to use, and I was able to read it and fix an error the
first time I saw it; for configure scripts, I still have no idea how it is
even possible to create or maintain those things).
Of course, this is only my opinion, and many people probably think
configure is better. That is not the subject here; I was just saying that
I think configure is losing ground.
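As a point of comparison, here is a minimal hypothetical CMakeLists.txt
(project and file names made up); a dependency declared with REQUIRED
stops the configure step early with a readable error when it is missing:

```cmake
# Minimal hypothetical project: one C executable with one dependency.
cmake_minimum_required(VERSION 3.10)
project(hello C)

# REQUIRED turns a missing dependency into a clear, early error.
find_package(Threads REQUIRED)

add_executable(hello main.c)
target_link_libraries(hello PRIVATE Threads::Threads)
```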
But you are technically right: the only dependencies are the
development headers of the libraries the software would need anyway.
It's a bad habit of Debian to separate software into app, libs, and
headers, and it often causes serious issues; just take a look at the jack
devel mailing list archive. People don't care about bloated DEs, but they
do care about the few bytes needed to have everything linked correctly by
one package: two policies that don't fit together.
No, I worry a lot about bloated DEs; this is why I do not use any DE
(though I do use a graphical environment, of course), and I do not need
most -dev packages. So, separating them is a good idea for me.
Currently, if I removed all the packages I have in the libdevel section,
which a normal user would do, it would free 471 MB.
Yes, I know, that is nothing nowadays... but I still have an old computer
with an 80 GB HD. When I got it (2 years ago), the HD was 10 GB. On 10 GB,
470 MB is not nothing.
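That figure can be reproduced on any Debian system with a sketch like
this one-liner over dpkg-query (the exact number will of course differ
from machine to machine):

```shell
# Sum the Installed-Size (in KiB) of installed packages whose
# section is libdevel; prints 0 if there are none.
size_kb=$(dpkg-query -W -f='${Section} ${Installed-Size}\n' 2>/dev/null \
          | awk '$1 == "libdevel" { s += $2 } END { print s + 0 }')
echo "libdevel packages use ${size_kb} KiB"
```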
It also reduces the space used on installation media.
Besides disk space, there is also the problem of bandwidth. I know that
headers compress well, since they are text files, but at the scale of
Debian that might still be quite a lot to send.
So, do not say "bad habit", just "habit" ;)
OT:
I would like to know what kind of issues separating them can cause.
I am surprised it is even possible, since when you install a -dev package,
it contains only the headers corresponding to the binary you already have
installed. Unless you force apt(itude) to install a version different from
the installed one (which would break the package, so apt(itude) would not
let you do it), I do not see how anything can be broken by splitting
dev/bin packages.
Archive: http://lists.debian.org/911e6453f2c22a3e195a983572977...@neutralite.org