On 06/03/2012, at 9:36 PM, Ric Klaren wrote:

> Hi,
>
> I recently got to work on a multi-platform, make-based build that involves
> a lot of C, some C++, Java and JavaScript/HTML. The current system is
> somewhat interesting and needs some work to become more maintainable.
> Having successfully converted a Maven Java build to Gradle in the past, I
> started investigating using Gradle for this job (it would make a number of
> things I now see a lot easier).
Which version of Gradle are you looking at? There are some improvements that
will be available in the soon-to-be-released 1.0-milestone-9. Have a look at
the release notes for some details:
http://wiki.gradle.org/display/GRADLE/Gradle+1.0-milestone-9+Release+Notes

> So I was wondering what the current intended direction is of the native
> support.
>
> Will multi-platform builds and artifacts be supported?

Definitely. You will be able to build and publish multiple variants of an
executable or a library. Initially, we'll probably start with some pre-canned
variants (debug vs non-debug, static vs dynamic, operating system,
architecture, compiler, etc.). At some point, there will be some way to
define your own variants.

The plugin will take care of whatever needs to happen to build each of the
variants you are interested in (e.g. compile once with -fPIC and once without
to build the shared and static variants on Linux; just compile once on
Windows). It will also take care of choosing the correct variant when
resolving dependencies.

Which platforms (and variants in general) do you need to build for?

> (Something like the Maven NAR plugin:
> http://duns.github.com/maven-nar-plugin/)
>
> Some observations/feedback/questions after playing around with the
> cpp-lib/cpp-exe plugins:
>
> * It would be cool to have a DSL to configure the toolchain to use. As it
> seems now, I need to write Java/Groovy to support a new compiler
> (gcc/Microsoft compilers/AIX compilers) and use Gradle internal classes.
> Configuring the toolchain via DSL might also make it easier to use some
> tools that use wrappers around the compiler, sometimes combined with extra
> options.

Definitely. At the moment, the API to define a new compiler is internal, and
we don't really intend that you use this (yet). We do plan to make something
publicly visible, probably starting with some way to tweak an existing
compiler through the DSL, and later a more complete SPI.
> Having the option to specify a hard path to the compiler would also be
> nice; I somewhat dislike relying on path lookups (paranoia :) ).

Right. The default approach of the plugin is to make a best-effort attempt to
build the binaries based on minimal configuration, so it discovers what it
can from the environment. However, it will certainly be possible to lock this
down, such as declaring exactly which compiler should be used and where to
find it.

> * Is it possible to have a header-files-only dependency? I seemed unable
> to turn off the attempt to download a shared object, or, vice versa,
> shared-object-only dependencies.

Not yet. The dependency management for native binaries is very much an early
work in progress.

> * I would really like a way to specify a dependency that would result in
> -L<somedir> and some -l<sublib1> -l<sublib2> options for the compiler. I
> can somewhat work around this by specifying arguments for the compiler, but
> it would be nice if this was possible out of the box. (This might make my
> migration path easier; I now have a number of do-everything builds that
> result in more libraries.)

Not sure what you mean here. Do you have an example?

> * Will it be possible to include more shared objects/static libraries in a
> dependency? I tried to include Boost for its test framework with a
> simplistic 'boost-all' approach, e.g. header files and multiple shared
> objects, but got hung up on the single-shared-object restriction.

Yes. The plan is to allow a published module to declare zero or more
artifacts of each type (headers, executable, static lib, shared lib, exported
symbol file, etc.), plus a bunch of dependencies. When we resolve a
dependency, we would grab each of the artifacts of the appropriate type, and
do this transitively for its dependencies.

> * Will it be possible to use a more standard approach for library
> generation, e.g.
> libfoo.so, so you can build libraries that can be linked using -L<dir>
> -lfoo from other tools?

This is available in milestone 9, for binaries that are built by Gradle. We
haven't fixed the naming scheme for binaries resolved from dependencies yet.

> * The current convention for shared object names also makes it hard to
> repackage some third-party packages that set the shared object name
> (soname) field in the library/executable. This becomes problematic when
> trying to run the generated executable; e.g. running a test case compiled
> with Boost Test will try to look up libboost_unit_test_framework.so and not
> the shared object name in the repository.

Right. This is something we still need to fix for dependencies.

> * What is the plan for compile dependencies? Manual? Or automatic with a
> gcc -MM or makedepend phase that configures dependencies (or some Gradle
> internal mechanism)?

Not sure on the exact mechanism yet, other than that it won't be manual.
There's already some incremental build support there, but it's
coarse-grained at the moment.

> Then build in the more conventional way: .c -> .o -> .a/.so?

Probably. We'll also need to do this to efficiently support different
variants.

> * Will static libraries be supported?

Yes. You'll be able to ask for a static or a shared variant of a given
library.

> * Will Gradle transparently deal with setting LD_LIBRARY_PATH on assorted
> Unixes (or PATH on Windows for DLLs) for running, for instance, test
> executables generated with a 'compileTest' target?

This is available in milestone 9.

> My apologies for so many questions ;)

This is excellent feedback.

--
Adam Murdoch
Gradle Co-founder
http://www.gradle.org
VP of Engineering, Gradleware Inc. - Gradle Training, Support, Consulting
http://www.gradleware.com
