On 30.10.2013 19:53, Reco wrote:
> Yet, the Linux kernel is not written in pure K&R C. The Linux kernel
> is written with GCC in mind, and actively uses GCC's extensions to
> the C89 (or C99, memory fails me) standard.
So I apologize.
I won't even try: even with the GNU tools provided by the Gentoo
installer, I was never able to build a kernel myself. At least, not a
kernel that worked correctly.
> * GNU build system (autotools):
> o Autoconf
> o Autoheader
> o Automake
> o Libtool
Do not make me laugh. Those tools are just dirty.
Every time I have to compile something with autotools, it gives me
problem after problem after problem!
They are slow, produce unreadable logs, and are hard to maintain (I
mean, the scripts they need in order to work are hard to maintain).
> That only shows that you are unwilling or unable to use these tools
> properly. Because, you see, the Linux kernel uses them just fine.
No, it is a matter of fact. I did not say that they do not work; I said
they cause plenty of problems, which I run into whenever I try to
compile things made by other people.
The latest example is from yesterday: I cloned aptitude's git
repository, ran ./autogen.sh and then ./configure, and installed the
missing dependencies. So far, no problem, nothing to say. Then
./configure again, and the tool just redoes every check (while a lot of
them were already done in the previous run), just so that I could learn
the next dependency I had not installed. For now, OK, it's fine. It
redoes unneeded work, which makes it slow, but since it is a one-time
job, I do not mind that much.
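(For what it's worth, autoconf scripts can cache their results between runs; a sketch, assuming a stock autoconf-generated configure script:)

```shell
# -C (short for --config-cache) makes configure read and write
# config.cache, so checks that already succeeded are not rerun:
./configure -C
# ...install a missing dependency, then rerun; cached checks are skipped:
./configure -C
```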
The biggest problem is the last error I had:
===
checking the return type of boost::make_shared... configure: error: in
`/home/berenger/devel/aptitude':
configure: error: unknown
===
Wonderful, isn't it? No information at all. Of course, I then followed
the advice to "See `config.log' for more details". Here is the end of
the file I got (with line numbers, or it is not fun enough):
2277 #define HAVE_BOOST /**/
2278 #define HAVE_BOOST_IOSTREAMS /**/
2279 #define BOOST_FUSION_FOLD_STATE_BEFORE_VALUE /**/
2280
2281 configure: exit 1
In short, no usable information for the user (the person who tries to
compile things, but is not involved in the programming process).
So: unreadable logs. (Note that if you have any clue about the error, I
will still be happy to learn it. I would really like to compile aptitude
and do some experiments with its source code.)
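(In case it helps anyone hitting the same wall: in my experience the useful part of config.log is usually not the tail, but the transcript of the failed test, some way above it, where configure logs the test program and the compiler's real error message. Something like this, where the pattern is just an example, not the exact text:)

```shell
# Search for the failing check and show the lines that follow it,
# which normally contain the compiler invocation and its errors:
grep -n -A 40 'return type of boost::make_shared' config.log | less
```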
As for maintenance, I tried hard enough in the past to use it. At the
time, I thought that if it was still used by so many people, surely it
must be an excellent tool. I was wrong; the only reason is history.
It is hard to learn, far harder than other tools which do the same
thing. And if it is hard to learn, I cannot imagine that it is easy to
maintain.
So far I have only spoken for myself, and I can accept being wrong, and
that those problems might only be due to my own limitations. But it
seems that I am not alone in thinking that the autotools are not such
good tools:
https://lwn.net/Articles/188693/
Now the next big change is happening: KDE is leaving the aging
"autotool" build chain behind. Some developers, not only in KDE, like
to nickname the autotools as "auto-hell" because of its difficult to
comprehend architecture.
I do not think that the KDE project would make such a huge change
without valid reasons, and they did make it.
They moved to another tool, named CMake. I tried CMake shortly after
trying to understand the autotools, and I was able to set up the build
system for my existing project in less than a day (1 executable, 2
libraries, 2-3 plug-ins), with a mechanism flexible and maintainable
enough that adding a module is not a pain requiring me to browse
hundreds of lines.
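(Purely as an illustration, with invented file and target names that are not from my actual project, the layout I describe fits in a short CMakeLists.txt:)

```cmake
cmake_minimum_required(VERSION 2.8)
project(example CXX)

add_library(core SHARED core.cpp)    # first library
add_library(util SHARED util.cpp)    # second library

add_executable(app main.cpp)         # the executable
target_link_libraries(app core util)

# each plug-in is a MODULE: loaded at runtime, never linked against
add_library(plugin_foo MODULE plugin_foo.cpp)
add_library(plugin_bar MODULE plugin_bar.cpp)
```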
Plus, it is portable to more systems: it supports Windows. It may seem
stupid, but I do not really like the idea of only being able to build on
Unices. I prefer to write things that can be compiled everywhere with
the same tools and the same procedure, and to have those tools and that
procedure as easy as possible. Which is not something the autotools
provide.
I did not try SCons and the others, because I was happy with CMake. I
still am, and I have seen it used by lots of projects. I have never had
any problem compiling those projects, even without reading the documents
(when there are any) describing the dependencies. Unlike with the
autotools.
In short, the Linux project may use them happily, but one project is not
all projects.
The people who work on Linux are talented, and probably know more about
the C language than most programmers know about their favorite language.
But that does not mean that the tools they use are the best ones
available. They were probably the best choice when they were selected
(and the best choice does not rest on technical efficiency alone), but
things have moved on in more than 20 years (Linux dates from 1991,
doesn't it?).
> Amongst others, apparently (list taken from Wikipedia). Is it possible
> (feasible) to bypass these "vital" tools with another set of tools,
> that you'll be writing shortly after responding to this post?
The Linux kernel is written in C. C owes nothing to GNU. Nothing.
It may happen that Linux's developers used some GNU implementations
for a C compiler, an assembler, a debugger, etc., but it could have
been made with other tools too.
> Given that:
> a) the Linux kernel has to function on multiple processor architectures,
> b) maintaining and developing is easier if you test for one compiler
> only,
> c) nothing beats GCC in being cross-platform.
I do not have the hardware to try, but isn't LLVM just as good for that
goal (being cross-platform)? I have no idea about the hardware
portability of compilers.
Plus, if you use C code (or C++, or whatever portable language) and
disable non-standard extensions, you should not have any problem
compiling your source code with other compilers.
In fact, compiling the same source code with other compilers is an
interesting thing to do, because it can reveal bugs in your code, where
you wrongly assumed some behavior was the standard one. It can also let
you discover bugs in your favorite compiler.
But, indeed, if Linux is built with GCC extensions, then another
compiler obviously cannot be used, for now. (I quickly tried to find out
which extensions are used, so as to go to sleep a little less ignorant
that night, but failed. I only found documents about LLVM trying to
compile Linux, and it seems that this project sends patches to both the
LLVM and Linux projects. They said it is currently able to run a desktop
environment, but that some things are still missing, like network
interfaces. The document was 2 years old.)
GNU means "GNU's Not Unix", because it was meant to be a complete OS (I
have never seen a working Hurd) different from Unix, but keeping the
same behavior.
In other words: the goal has never been to invent something, only to
copy what exists elsewhere, but with the interest of being open source,
and of ensuring that things depending on (linked with) their tools stay
open source.
> Nope. The goal was to enhance the low-quality userland AT&T and
> Berkeley gave the user. That goal was reached successfully. Copying
> functionality is a byproduct of that goal.
I have nothing against copying features. Copying a feature, but not the
way it is implemented (when it is not trivial, of course), is one way to
improve it.
--
To UNSUBSCRIBE, email to debian-user-requ...@lists.debian.org
with a subject of "unsubscribe". Trouble? Contact listmas...@lists.debian.org
Archive: http://lists.debian.org/68734b6ab158396d75b4b49029211...@neutralite.org