Hi all,

Just merged the branch Marcel and I were working on (actually we couldn't
share a branch, since we cannot push to another developer's branch and we
cannot create a shared one).

We'll work directly in the tree, so we avoid conflicts as we do the renames
and change the #defines.

See TODO-cmake.txt for how you can help.



On Mon, Jan 23, 2017 at 9:38 AM, Gustavo Sverzut Barbieri
<barbi...@gmail.com> wrote:
> On Mon, Jan 23, 2017 at 3:19 AM, Carsten Haitzler <ras...@rasterman.com> 
> wrote:
>> On Sat, 21 Jan 2017 09:45:04 -0200 Gustavo Sverzut Barbieri
>> <barbi...@gmail.com> said:
>>
>>> On Sat, Jan 21, 2017 at 5:24 AM, Carsten Haitzler <ras...@rasterman.com>
>>> wrote: [...]
>>> > i7 desktop: (autogen)    (make) (eina_cpu.c) (eina_cpu.h)      ()
>>> > autotools:     27.802     4.306        0.571        1.234   0.097
>>> > cmake ninja:    1.283     2.278        0.160        0.636   0.014
>>> > cmake make:     1.632     3.142        0.234        0.787   0.064
>>> >
>>> > pi3:
>>> > autotools:    477.870    62.640        6.853       16.337   1.313
>>> > cmake ninja:   15.892    35.931        2.611        9.627   1.365
>>> > cmake make:    19.358    38.991        0.961        1.161   0.921
>>> >
>>> > so dumping automake and libtool buys raw build speedups of like 2x. doing
>>> > any editing of code is massively faster as just far less is built AND it's
>>> > built faster. the autogen (configure/autotools) part is MANY MANY MANY
>>> > times faster. even assuming it'll get 3x slower once we check everything
>>> > with cmake... it's still 5-10 TIMES faster.
>>>
>>> currently cmake's configure does very barebones checks, such as types
>>> and the like, but even that I want to improve by taking some shortcuts:
>>> something like "if(LINUX AND GLIBC_GOOD_ENOUGH)" would assume you run a
>>> sane system and skip checking for stupid stuff. Same for compiler
>>> checks: we use lots of flags we know exist in newer compilers, so we
>>> could easily add an 'if' and just use the flags instead of generating
>>> and compiling a test program for each one. Granted, we could also do
>>> some of that in autotools, but some parts are trickier there.
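>>>
>>> Just to illustrate the kind of shortcut I mean, a rough sketch
>>> (GLIBC_GOOD_ENOUGH is a placeholder name, not something in the tree,
>>> and the specific checks are only examples):
>>>
>>>     if(CMAKE_SYSTEM_NAME STREQUAL "Linux" AND GLIBC_GOOD_ENOUGH)
>>>       # sane system assumed: skip the per-header/per-function probes
>>>       set(HAVE_ALLOCA_H 1)
>>>       set(HAVE_FNMATCH 1)
>>>     else()
>>>       include(CheckIncludeFile)
>>>       include(CheckSymbolExists)
>>>       check_include_file("alloca.h" HAVE_ALLOCA_H)
>>>       check_symbol_exists(fnmatch "fnmatch.h" HAVE_FNMATCH)
>>>     endif()
>>>
>>>     if(CMAKE_C_COMPILER_ID STREQUAL "GNU"
>>>        AND NOT CMAKE_C_COMPILER_VERSION VERSION_LESS 5)
>>>       # flags we know this compiler supports: no test compile per flag
>>>       add_compile_options(-Wall -Wextra -Wpointer-arith)
>>>     else()
>>>       include(CheckCCompilerFlag)
>>>       check_c_compiler_flag("-Wpointer-arith" HAVE_WPOINTER_ARITH)
>>>     endif()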
>>
>> yeah. having more "we know on platform X features x/y/z exist and/or are
>> done this way" and simply detecting which is nicest. though there are the
>> un-fun things like "linux && glibc, linux && uclibc, linux && musl ..." for
>> starters... :(
>>
>>> > the simple version of this is: it looks like cmake doesn't do the stupid
>>> > relinking or rebuilds it doesn't need, which i thought we'd have to fight.
>>> > so it just got better. cmake is seemingly right out there in terms of
>>> > speed. for a GENERIC build system tool that should/can handle anything, it
>>> > seems to be handily fast.
>>>
>>> I configured it so it requires a build directory, and inside that
>>> directory I shadow the system installation layout without a "prefix",
>>> thus you end up with "lib", "bin" and so on. CMake also uses rpath to
>>> set the paths used to find the binaries, resetting those to "" (empty)
>>> at installation time, which is faster than relinking.
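>>>
>>> A minimal sketch of that setup (paths and values illustrative, not
>>> necessarily exactly what the tree does):
>>>
>>>     # shadow the system install layout inside the build directory
>>>     set(CMAKE_RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/bin")
>>>     set(CMAKE_LIBRARY_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/lib")
>>>
>>>     # build with a build-tree rpath so binaries find the libs; cmake
>>>     # strips it at install time instead of relinking
>>>     set(CMAKE_SKIP_BUILD_RPATH FALSE)
>>>     set(CMAKE_BUILD_WITH_INSTALL_RPATH FALSE)
>>>     set(CMAKE_INSTALL_RPATH "")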
>>
>> i'm less of an rpath fan... i'd rather we use LD_LIBRARY_PATH instead at
>> these junctures...
>
> https://cmake.org/Wiki/CMake_RPATH_handling says -DCMAKE_SKIP_RPATH=ON
> does what you want.
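>
> A quick usage sketch (source path illustrative):
>
>     cmake -DCMAKE_SKIP_RPATH=ON /path/to/efl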
>
> But overall rpath helps running the built binaries without the need for
> nasty libtool-like wrapper scripts.
>
> Regardless, having the build results laid out exactly as they would be
> in the system makes things much easier: eina_prefix should just work,
> etc.
>
>
>>> Also note that cmake itself is only involved in the initial step;
>>> after that you just run plain make/ninja commands. The make backend
>>> uses some helpers to produce progress output and colors, whereas
>>> ninja is usually faster since it gets a single full-blown build.ninja
>>> with everything in it.
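>>>
>>> So the day-to-day flow is just something like (paths illustrative):
>>>
>>>     mkdir build && cd build
>>>     cmake -G Ninja /path/to/efl   # the only step invoking cmake directly
>>>     ninja                         # afterwards plain ninja (or make)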
>>
>> well make/ninja with some calling out to cmake... but yeah.
>>
>>> Seems your RPI3 is slower with ninja than make; one reason may be IO?
>>> AFAIR ninja uses response files and passes those to GCC as "@filename"
>>> instead of super long command lines. However, opening and reading those
>>> small files may be hurting your build, since you're not actually
>>> compiling stuff due to ccache.
>>
>> yes. ccache helps make the compile bit about as fast as it'll get so the
>> other parts show up...
>>
>>> > what i see here is a major leap in productivity if we moved to cmake. i
>>> > now "officially" like cmake. :) this would be a huge win for us... even if
>>> > we have to wrestle in a make dist and distcheck. the option of ninja is a
>>> > "a bit faster than gnu make and in some cases a lot faster" option. but
>>> > really cmake is the key.
>>> >
>>> > so i guess... bikeshedding ... is there any reason to not use cmake that
>>> > would override all the benefits? i cannot think of one.
>>>
>>> I'd like to complement that with: simpler rules and usage for efl developers.
>>>
>>> With automake you can't autogenerate anything (at least I never found
>>> a way to apply m4 rules there), so you keep repeating patterns for
>>> modules and the like, which results in slightly different versions of
>>> the same thing as one copy gets updated and another doesn't.
>>>
>>> with cmake and other build systems I can instruct them to understand
>>> efl's well-structured source tree and automatically do stuff for us.
>>> My plan is for the final CMakeLists.txt to foreach(l in src/lib/*) and
>>> call EFL_LIB(${l}) (see the sketch right after this list), which will
>>> then automatically:
>>>
>>>  - check if library is enabled
>>>
>>>  - include src/lib/${l}/CMakeLists.txt to get SOURCES, LIBRARIES,
>>> PUBLIC_LIBRARIES, PUBLIC_HEADERS...
>>>
>>>  - create static/dynamic library (we can even automate libefl-single.so  
>>> here)
>>>
>>>  - write ${l}.pc (also simpler to automate libefl-single.pc and make
>>> all others just Requires: efl-single)
>>>
>>>  - include src/bin/${l}/CMakeLists.txt or
>>> src/bin/${l}/*/CMakeLists.txt and compile all binaries automatically
>>> linking with the library ${l}
>>>
>>>  - include src/modules/${l}/*/*/CMakeLists.txt and compile all modules
>>> (with optional scope) linking with ${l} (if dynamic) or linking ${l}
>>> with module.a (if static module)
>>>
>>>  - include src/tests/${l} or src/tests/${l}/*/CMakeLists.txt and
>>> compile all tests, linking with ${l} and adding to ctest testing
>>> runner.
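>>>
>>> A rough sketch of what that top-level driver could look like (the
>>> EFL_LIB macro is the one from the plan above; the rest is illustrative,
>>> not final code):
>>>
>>>     # toplevel CMakeLists.txt: one EFL_LIB() call per directory in src/lib
>>>     file(GLOB libs RELATIVE ${CMAKE_SOURCE_DIR}/src/lib
>>>          ${CMAKE_SOURCE_DIR}/src/lib/*)
>>>     foreach(l ${libs})
>>>       if(IS_DIRECTORY ${CMAKE_SOURCE_DIR}/src/lib/${l})
>>>         EFL_LIB(${l})   # does the per-lib/bin/module/test steps listed above
>>>       endif()
>>>     endforeach()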
>>>
>>> You can compare 3 files with src/Makefile_Eina.am:
>>> src/lib/eina/CMakeLists.txt
>>> src/modules/eina/mp/one_big/CMakeLists.txt
>>> src/tests/eina/CMakeLists.txt
>>>
>>> Note that we do not need to provide src/bin/eina/eina_btlog/CMakeLists.txt,
>>> as it's a single-file source that just links with eina.
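>>>
>>> Just to give the flavor of the per-library files (this is not a copy of
>>> the real src/lib/eina/CMakeLists.txt, only its general shape, with the
>>> variables named in the plan above):
>>>
>>>     # src/lib/<lib>/CMakeLists.txt: declarative only, consumed by EFL_LIB()
>>>     set(SOURCES
>>>       eina_main.c
>>>       eina_cpu.c
>>>     )
>>>     set(PUBLIC_HEADERS
>>>       Eina.h
>>>       eina_cpu.h
>>>     )
>>>     set(LIBRARIES m)   # extra system libs the library links against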
>>
>> i like the idea of a strict tree with a pattern to follow so it's easy to
>> copy & paste or re-use templates, or just have simpler parent build rules
>> that only need some overrides (eg add the following include dirs and extra
>> libs to this binary vs the std template).
>>
>> now the question still stands... any good reasons not to cmake? what we
>> need is:
>>
>> 1. people willing to get dirty and help a move happen
>> 2. a plan of how to do that move with the least disruption and least pain
>>
>> i can see a stage 0 here... "prepare for it". so within autotools land,
>> move src around a bit so we have "1 dir == 1 output target" like you
>> describe, so it's easier to do the cmake conversion you describe above.
>>
>> another question is... is it possible to have a hybrid system? for now have
>> a master configure that recycles sub-configure-like features to run cmake in
>> subdirs. that way we can port src/lib/eina and src/bin/eina for example
>> first... then expand... then eg do efl and eo, then ecore, then ecore_con,
>> then... so we move over to cmake one thing at a time... and in the end nuke
>> the toplevel autotools configure and replace that with cmake. is this even
>> sane?
>
> while this is possible, I believe it can be done real quick once we
> finish eina for real IF we add an extra step to "stage 0":
>
>  - unify & simplify #define usage
>
> As Marcel noticed, and as I noticed when I helped with the single-tree
> unification, it's a nightmare to figure out the defines, what people use
> and which are meaningful.
>
> So a review to make them also follow a pattern is needed, and it can be
> done in autotools before we move.
>
> An example is how to enable/disable modules and make them static:
> these are all different today, and in cmake I tried to make them
> auto-generated, so they must follow a pattern:
>     EFL_${LIB}_MODULE_TYPE_${SCOPE}_${MODULE}_${TYPE}
>
> like:
>     EFL_EINA_MODULE_TYPE_MP_CHAINED_STATIC
>     EFL_EINA_MODULE_TYPE_MP_ONE_BIG_DYNAMIC
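>
> A sketch of how cmake can derive those automatically (variable names are
> illustrative; it assumes the surrounding directory walk sets LIB, SCOPE
> and MODULE, and that _type holds the chosen module type):
>
>     # assumed: LIB/SCOPE/MODULE come from the directory walk and _type
>     # holds STATIC, DYNAMIC or OFF from an option()/cache variable
>     if(NOT _type STREQUAL "OFF")
>       string(TOUPPER
>              "EFL_${LIB}_MODULE_TYPE_${SCOPE}_${MODULE}_${_type}" _def)
>       add_definitions("-D${_def}=1")
>     endif()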
>
> So my proposal is for all modules to be converted to use something like
> that, plus making the install directory structure the same; currently
> ecore is different, and possibly others are too.
>
> But that extends to many, many things, like the 3-4 variables used to
> tell the project version (VMAJ-like).



-- 
Gustavo Sverzut Barbieri
--------------------------------------
Mobile: +55 (16) 99354-9890
