On Tue, 17 Jan 2017 12:20:31 -0200 Gustavo Sverzut Barbieri <barbi...@gmail.com> said:
> On Mon, Jan 16, 2017 at 10:39 PM, Carsten Haitzler <ras...@rasterman.com>
> wrote:
> > On Mon, 16 Jan 2017 12:00:24 -0200 Gustavo Sverzut Barbieri
> > <barbi...@gmail.com> said:
> >> - pure gnu make is great, with very capable macros of its own; you
> >> can auto-generate rules, recursively include files... that leads to a
> >> single 'make' (not recursive) yet nicely split into individual files
> >> like the linux kernel. However it may be a bit slow on big projects;
> >> it's showing its age on Linux or Buildroot if you don't have a fast
> >> enough CPU.
> >
> > we already depend on gnu make anyway... so nothing new. :) makefiles are
> > something most c/c++ devs know so it's a "devil you know", with gnu make
> > being the far more friendly make variety which is decent enough to not
> > have to do workarounds for.
>
> well, if you rely on macros to create rules for you it's not that
> easy. been there, done that - debugging is a major PITA. things like
> variables and functions you declare may depend on order, which is not
> trivial to spot, and there are no errors, just silent expansion to
> empty.

no - i'm not thinking of the kind of generation that automake does. just
simple EINA_CFLAGS=@EINA_CFLAGS@, where all the shellscript configure is
doing is pre-running:

  EINA_CFLAGS=`pkg-config --cflags dep1 dep2 dep3`
  sed s^@EINA_CFLAGS@^$EINA_CFLAGS^g < input_template > Makefile.conf

of course with a more extended set of sed replace rules. so your
Makefile.conf is just:

  EINA_CFLAGS=-I/usr/local/include
  EINA_LIBS=-L/usr/local/lib -leina

... for example ... etc. with just pure makefiles you have to do the
CC `pkg-config --cflags dep1 dep2 dep3` on every single CC invocation...

> the kernel-like makefiles use lots of that AND dynamic configurations like:
>
> obj-$(KCONFIG_OPT_NAME) += file
>
> this is a tad slow to compute, thus every make command will take a while
> to execute.
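[editor's note: the @VAR@ substitution idea above can be sketched as a tiny
self-contained shell fragment. the variable names and template follow the
mail's example; the hard-coded values stand in for real pkg-config output,
so this is an illustrative sketch, not actual EFL configure code.]

```shell
#!/bin/sh
# Sketch of the sed-based "configure" described above. In a real
# configure these would come from `pkg-config --cflags` / `--libs`;
# hard-coded here so the sketch runs stand-alone.
EINA_CFLAGS="-I/usr/local/include"
EINA_LIBS="-L/usr/local/lib -leina"

# the hand-written template, normally checked into the tree
cat > Makefile.conf.in << 'EOF'
EINA_CFLAGS = @EINA_CFLAGS@
EINA_LIBS = @EINA_LIBS@
EOF

# one sed pass replaces every @VAR@ marker; '^' is the s/// delimiter
# because the flags themselves contain '/' characters
sed -e "s^@EINA_CFLAGS@^$EINA_CFLAGS^g" \
    -e "s^@EINA_LIBS@^$EINA_LIBS^g" \
    < Makefile.conf.in > Makefile.conf

cat Makefile.conf
```

the makefile then just does `include Makefile.conf`, so pkg-config runs once
at configure time rather than on every CC invocation.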
> In contrast ninja is explicit and all is pre-computed, so for regular
> development where you're not changing options, it's noticeably faster.

then you need something to pre-compute this... which really would mean
cmake. i wouldn't want to bother with a hand done shell script or kconfig to
do this - it's going a step too far imho.

> > i actually would be perfectly happy going back to subdir makefiles as you
> > can just cd to the right place in the tree and "make" from there easily
> > enough... :)
>
> it's a bad practice, but if you insist you can always add a Makefile in
> the subdirs with:
>
> make -C ../..
>
> and things like that.

no! that's not what i want. i want the subdir makefile to not touch or build
anything above it in the tree. it's subdir only. anything higher up is
ignored. it's not the convenience of "i just don't want a shell in the
toplevel for make". it's "i want to only rebuild this part of the subtree -
eg src/lib/eina - and damnit don't rebuild or relink eolian or eo or evas or
anything else".

> >> I have lots of experience with autotools (efl, and all other
> >> projects), cmake (which I did the initial autotools->cmake conversion)
> >> and kconfig (my recent project soletta uses that). So consider I may
> >> be biased, but:
> >>
> >> I'd *NOT* go with glue in shell because that usually needs lots of
> >> complex data structures to simplify the code, otherwise you waste lots
> >> of time working around poor hash support to do it in shell... Pure
> >> shell would also not be my favorite tool to write the build system per
> >> se, since tracking parallel tasks with it is more complex than with
> >> make/ninja or other tools meant for that.
> >
> > oh no. was thinking pure sh just to replace "configure" and generate a
> > Makefile.conf that is included from a hand rolled Makefile.
> > basically the kconfig bit of kconfig with everything needed for running
> > compile tests to detect something (we can split the tests into a
> > configure_tests dir and just issue a compile via a wrapper func from the
> > shell and gather the return code). so just replacing functionally what
> > configure and/or kconfig does in a very simple way.
> >
> > can kconfig gather pkg-config cflags/libs and versions? can it do
> > feature tests (does func x exist in library y - or more specifically,
> > does a compile succeed with a func x() in src code if it also #includes
> > x/y/z and -lx -ly -lz -Lx -Ly -Lz)? fundamentally these tests in
> > autotools are very easy to DIY on the os's we support. does kconfig do
> > this or will we have to "roll our own"?
>
> no, kconfig needs to be fed with such information. For example you go
> and write a shell script that will do:

ok. so we need this anyway... beyond kconfig providing a menu system for
config options i don't see much value.

> if pkg-config --exists freetype2; then
> cat << EOF >> .kconfig.deps
> config HAVE_FREETYPE2
>     bool
>     default y
>
> config FREETYPE_CFLAGS
>     string
>     default "`pkg-config --cflags freetype2`"
>
> config FREETYPE_LIBS
>     string
>     default "`pkg-config --libs freetype2`"
> EOF
> fi
>
> Which you could make into a sh function "check_pc name>=version"

yup. that was assumed a "configure" shell replacement would do. it's not
that hard. i actually am testing out the idea on my rage tree atm. i can
split version numbers (then easy to compare afterwards once broken up). i
have --prefix/bindir/libdir/=xxx parsing done easily. --enable and --disable
done with help strings and default values... --help auto outputs the
available options you can enable or disable and their default values... it
writes a config.h... it's surprisingly easy. the actual cf file for rage
that is my test is:

  #!/bin/dash
  . ./cf-inc/cfmain.sh

  # declare things about this project
  CF_NAME="rage"
  CF_VERSION="0.2.1"

  # declare configurable things
  CF_CONFIG="rage_config.h"
  cf_opt yes PANTS "Wearing of pants"
  cf_opt no  BLAH  "Do we just not care and go blah"

  # handle input options and set up state
  cf_args $@

  # any custom stuff goes here

  # write output
  cf_out

forcing me to use dash to make it sh compatible... :) for now... remove the
comments and it's tiny. yeah, i have to add the "get me pkg-config
requirements x/y/z of at least version x/y/z" along with "generate x from
x.in with @XXX@ substitution", Makefile.conf generation and a few other
things... but the cfmain.sh is meant to be a generic re-usable shell core.
i'm just playing to see how hard it really will be. it seems this is at
least partly needed for a kconfig solution anyway... and without kconfig it
would replace it, and i actually have the option parsing done (--enable,
--disable... ok, have to add a --with-x=XXX and then that's done).

> Then in your entrypoint Kconfig:
>
> include .kconfig.deps
>
> Since there is no "prompt" in the above, it won't be user-visible, but
> other Kconfig files can do:
>
> config USE_EVAS
>     bool "Enable Evas"
>     default y
>     depends on HAVE_FREETYPE2=y
>     help
>       Evas is EFL's retained canvas library, it will manage objects
>       and update the screen as needed.
>
> This prompt will be enabled only if HAVE_FREETYPE2=y. Usually people
> also add a message/comment to let the user know something important was
> disabled:
>
> comment "Evas is disabled (needs freetype2)"
>     depends on !HAVE_FREETYPE2
>
> In our makefiles, we'd simply base on USE_EVAS:
>
> obj-$(USE_EVAS) := evas_main.c ...
>
> Which will expand to:
>
> obj-y := evas_main.c
>
> or:
>
> obj-n := evas_main.c
>
> Then you usually have macros that prepend the directory as scope, and the
> result is something like:
>
> gcc -shared -o libevas.so $(src-lib-evas-obj-y) ...
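[editor's note: the "split version numbers, then easy to compare" step
mentioned above is doable in pure sh by abusing IFS and `set --`. a minimal
sketch; `cf_version_ge` is a made-up helper name, not part of the actual
cf-inc code.]

```shell
#!/bin/sh
# Sketch: compare dotted version strings in pure POSIX sh.
# cf_version_ge A B returns 0 (true) when version A >= version B.
cf_version_ge() {
    v1=$1 v2=$2
    old_ifs=$IFS
    IFS=.
    # split each version into up to three numeric components,
    # defaulting missing components to 0 (so "1.2" == "1.2.0")
    set -- $v1; a1=${1:-0} a2=${2:-0} a3=${3:-0}
    set -- $v2; b1=${1:-0} b2=${2:-0} b3=${3:-0}
    IFS=$old_ifs
    # compare component by component, most significant first
    [ "$a1" -gt "$b1" ] && return 0
    [ "$a1" -lt "$b1" ] && return 1
    [ "$a2" -gt "$b2" ] && return 0
    [ "$a2" -lt "$b2" ] && return 1
    [ "$a3" -ge "$b3" ]
}
```

with that, check_pc_version is just "run pkg-config --modversion, then
cf_version_ge against the minimum (and optionally below the maximum)".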
> >> TL;DR: the only complete solution is CMake, but if I were to pick
> >> something specifically for EFL use cases I'd go with kconfig + ninja
> >> with glue in Python -- or Perl (since kconfig uses that) or Lua (since
> >> efl uses that). Second option is traditional kconfig + gnu-make.
> >>
> >> https://github.com/solettaproject/soletta is an example of kconfig +
> >> gnu-make + python.
> >>
> >> - kconfig is embedded into the project as pure C code (not an external
> >> dep) https://github.com/solettaproject/soletta/tree/master/tools/kconfig
> >>
> >> - https://github.com/solettaproject/soletta/tree/master/tools/build
> >> base definitions
> >
> > i am not sure i can quickly tell from that if it can do pkg-config
> > checks or compile checks and so on like above... can it?
>
> You need something else, like I wrote below. In soletta we use
> dependencies.json to specify what we used in autoconf (pkg-config
> module, check lib, check header...). It's parsed by a Python script
> that executes actual commands and outputs the ".kconfig.deps" I wrote
> above (if you take Soletta it will generate Kconfig.gen).
>
> >> -
> >> https://github.com/solettaproject/soletta/blob/master/data/jsons/dependencies.json
> >> we defined our dependencies file format in JSON, which we parse and
> >> execute from Python. Could be "YAML" or something like that. But JSON
> >> is not that cumbersome and it's easy to write a jsonschema to validate
> >> it.
> >
> > now i begin not to like this. write a file that then goes through a
> > generator to generate another file that goes through another generator
> > that... it's autotools all over.
>
> This is not required; it's just that we in Soletta thought it would be
> simpler. Kconfig will just use its own files and output make/header
> compatible files. It doesn't find anything on its own.
>
> You could write it all in Lua, Python, C or whatever, given that you
> output .kconfig.deps like in the example above.
i just don't like the idea of yet another tool to generate input to another
tool to generate input to another tool to... the chain is too long, each
with its own language and oddities. python, lua, whatever is not relevant
here. it's just the complexity. i accept we need some kind of tool/thing to
deal with pkg-config, and that probably needs to execute compile checks
(like does func x exist in libc?). this is kind of a necessity. we need
something to build with. gnu make is a devil we know. i have never seen it
be slow until it's given 50k lines of makefile to deal with on a slow box.
small makefiles should be just fine. ninja may be slightly faster, but ninja
is the devil most people do not know, and frankly i don't see it being THAT
much better than gnu make if the makefile isn't silly. a silly makefile is
the fault of automake + libtool. so if we have something to get pkg-config
info and do version checks - it's likely either something like cmake - a
dedicated tool... or it's shell. if we do shell, then kconfig is not much
more than a nice "option handler with dependencies etc." which i don't think
we need... that's my take.

> > my thoughts on plain sh were just to execute a series of checks and
> > information gathering into a Makefile.conf (and config.h) then include
> > that makefile (and of course config.h). we'd have shell funcs like
> >
> > #!/bin/sh
> > . ./conf/configure_funcs.sh
> >
> > # parse arguments like --prefix=XXX and set vars accordingly
> > parse_args
> >
> > # below will append VAR1 to the CHECK_FUNCS var like "VAR1 VAR2 VAR3" so
> > # a simple for I in $CHECK_FUNCS; do ... can iterate over them when
> > # write_makefile_conf is called and just write out makefile vars like if
> > # that var is true/false etc.
> > # and will also just generate a simple c file to test compile
> > check_c_func VAR1 funcname inc1.h inc2.h inc3.h -Ixxx -llib1 -Ldir2
> >
> > # like the above but specifically runs pkg-config --cflags and sets
> > # VAR2_CFLAGS and VAR2_LIBS accordingly
> > check_pc VAR2 lib1 lib2 lib3
> >
> > # checks lib1 is at least version 1.2.3 - optional
> > check_pc_version lib1 1.2.3
> > # checks lib2 is between 1.2.3 and 2.4.5
> > check_pc_version lib2 1.2.3 2.4.5
> >
> > if test x$FEATURE_X = xyes; then
> >   check_pc VAR3 libxx libyy
> >   check_pc_version libxx 1.5.0
> >   check_pc_version libyy 1.6.0
> > fi
> >
> > # and so on...
> >
> > # at end of shell
> > write_makefile_conf Makefile.conf
> > write_header_conf config.h
> >
> > or you get the idea...
>
> Yes, up to "and so on" is what our Python + dependencies.json does...
> but instead of writing it in pure shell, we did it in Python + JSON. We
> could do it in pure shell, but we disliked that.

sure. personal taste. i prefer to be in shell... then the build target
doesn't need python installed to build. not an issue for unixen, but for
windows - more of one. since we know pure shell works on windows, as
configure works... thus this is a doable thing. :)

> the final 2 commands "write_makefile_conf" and "write_header_conf" are
> managed *by kconfig*, as it will take the results of your checks
> (check_pc, check_c_func...) + the user settings, then compute the
> resulting files for you.

yes. i realize that. :) thus kconfig replaces a fairly smallish amount of
shell functionality.

> this part is done by a C binary hosted in your own project (it's
> compiled for you by kconfig), in soletta it's ./tools/kconfig/conf

yeah. that i know. i just never delved into how to drive kconfig as a
developer using it. i do know from many a kernel build that it works this
way. :)
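[editor's note: the check_c_func from the sketch above fits in a dozen or so
lines of portable sh. the names (check_c_func, CHECK_FUNCS, VAR1...) follow
the mail's example; the body is an assumed illustrative implementation, not
code from any real EFL tree.]

```shell
#!/bin/sh
# Sketch of check_c_func: write a tiny C file that references the
# function, try to compile+link it, and record yes/no in the named var.
: ${CC:=cc}
CHECK_FUNCS=""

check_c_func() {
    var=$1 func=$2
    shift 2
    incs="" flags=""
    # arguments ending in .h become #include lines; everything else
    # (-I/-L/-l...) is passed straight to the compiler
    for a in "$@"; do
        case $a in
            *.h) incs="$incs#include <$a>
" ;;
            *) flags="$flags $a" ;;
        esac
    done
    # referencing the function (not calling it) is enough to make the
    # linker resolve the symbol
    printf '%sint main(void) { (void)%s; return 0; }\n' "$incs" "$func" \
        > .conftest.c
    if $CC .conftest.c -o .conftest $flags 2> /dev/null; then
        eval "$var=yes"
    else
        eval "$var=no"
    fi
    CHECK_FUNCS="$CHECK_FUNCS $var"
    rm -f .conftest.c .conftest
}

check_c_func HAVE_PRINTF printf stdio.h
echo "HAVE_PRINTF=$HAVE_PRINTF"
```

write_makefile_conf then just loops over $CHECK_FUNCS and emits one
VAR=yes/no line per entry into Makefile.conf (and #define lines into
config.h).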
> So to write your idea in a more real scenario, it becomes:
>
> --->8--- conf/configure_funcs.sh --->8---
> check_pc() {
>     local _name=`toupper $1`
>     if pkg-config --exists $1; then
>         cat << EOF >> .kconfig.deps-tmp
> config ${_name}_FOUND
>     bool
>     default y
>
> config ${_name}_CFLAGS
>     string
>     default "`pkg-config --cflags $1`"
>
> config ${_name}_LIBS
>     string
>     default "`pkg-config --libs $1`"
> EOF
>     else
>         cat << EOF >> .kconfig.deps-tmp
> config ${_name}_FOUND
>     bool
>     default n
> EOF
>     fi
> }

yeah. actually my sh funcs really look a lot like this. cat's + EOF's.

> --->8--- check-deps.sh --->8---
> #!/bin/sh
> . ./conf/configure_funcs.sh
>
> rm -f .kconfig.deps .kconfig.deps-tmp
>
> check_c_func VAR1 funcname inc1.h inc2.h inc3.h -Ixxx -llib1 -Ldir2
> check_pc VAR2 lib1 lib2 lib3
> check_pc_version lib1 1.2.3
> check_pc_version lib2 1.2.3 2.4.5
>
> check_pc VAR3 libxx libyy
> check_pc_version libxx 1.5.0
> check_pc_version libyy 1.6.0
>
> mv .kconfig.deps-tmp .kconfig.deps
>
> --->8--- Kconfig --->8---
>
> include ".kconfig.deps"
>
> config PREFIX
>     string "Where to install EFL"
>     default "/usr"
>
> config USE_EVAS
>     bool "Enable Evas"
>     default y
>     depends on HAVE_FREETYPE2=y
>     help
>       Evas is EFL's retained canvas library, it will manage objects
>       and update the screen as needed.
>
> This prompt will be enabled only if HAVE_FREETYPE2=y. Usually people
> also add a message/comment to let the user know something important was
> disabled:
>
> comment "Evas is disabled (needs freetype2)"
>     depends on !HAVE_FREETYPE2

yeah, i get the idea. kconfig doesn't do it for us, ala autotools or cmake.
we will have to roll shell to do the pkg-config and other checks anyway and
gather info and so on. :) so we have 4 tools then: shell to do compile tests
(compile a test file a.c with func() in it - does it compile+link?)
+ python or whatever to generate json files of options that then generate
kconfig files, kconfig files + kconfig src to create the final makefiles,
and then make.

> --->8--- Makefile --->8---
> .kconfig.deps: check-deps.sh Makefile
>     ./check-deps.sh
>
> -include .config # generated by make menuconfig, oldconfig...
>
> and so on... You add .kconfig.deps as a dependency of 'make
> menuconfig' or 'make oldconfig' (or all rules that touch '.config'),
> then 'make all' depends on .config and you get the results you want.
>
> it will output include/generated/autoconf.h or $KCONFIG_AUTOCONFIG
> with your header file.
>
> --
> Gustavo Sverzut Barbieri
> --------------------------------------
> Mobile: +55 (16) 99354-9890


--
------------- Codito, ergo sum - "I code, therefore I am" --------------
The Rasterman (Carsten Haitzler)    ras...@rasterman.com

_______________________________________________
enlightenment-devel mailing list
enlightenment-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/enlightenment-devel