On Fri, Jan 18, 2019 at 06:59:59PM +0000, JN via Digitalmars-d-announce wrote:
[...]
> The trick with makefiles is that they work well for a single
> developer, or a single project, but become an issue when dealing with
> multiple libraries, each one coming with its own makefile (if you're
> lucky, if you're not, you have multiple CMake/SCons/etc. systems to
> deal with). Makefiles are very tricky to do crossplatform, especially
> on Windows, and usually they aren't enough, I've often seen people use
> bash/python/ruby scripts to drive the building process anyway.
Actually, the problems I had with makefiles come from within single projects. One of the most fundamental problems -- which is also a core design decision -- of Make is that it's timestamp-based. This means: (1) it often rebuilds unnecessarily: `touch source.d` and everything depending on source.d gets rebuilt, even though its contents haven't changed; and (2) it often fails to rebuild necessary targets: if your system clock is out of sync for whatever reason, a newer version of source.d can carry an earlier date than a previously-built object file, and Make will happily leave the stale object in place.

Furthermore, makefiles generally do not have a global view of your workspace, so builds are not reproducible (unless you go out of your way to make them so). Running `make` after editing some source files does not guarantee you'll end up with the same executables as if you checked in your changes, did a fresh checkout, and ran `make`. I've had horrible all-nighters looking for heisenbugs that have no representation in the source code, but are caused by make picking up stale object files from who knows how many builds ago. You end up having to `make clean; make` every other build "just to be sure", which is really stupid in this day and age. (And even `make clean` does not guarantee a clean workspace -- more projects than I care to count exhibit this problem.) Then there's parallel building, which again requires explicit effort, the macro hell typical of tools from that era, etc. I've already ranted about this at great length before, so I'm not going to repeat it all here. But make is currently near (if not at) the bottom of my list of build tools, for many, many reasons.

Ultimately, as I've already said elsewhere, what is needed is a *standard, tool-independent dependency graph declaration* attached to every project, one that captures the project's dependency graph in a way that any tool which understands the standard format can parse and act on.
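For contrast with timestamp checks, a content-based check sidesteps both failure modes above: a target is stale only if the recorded hashes of its inputs differ from the current ones, regardless of mtimes or clock skew. A minimal sketch in Python (the manifest filename and helper names are invented for illustration):

```python
import hashlib
import json


def file_hash(path):
    """SHA-256 of a file's contents -- immune to touch and clock skew."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def needs_rebuild(sources, manifest_path="build-manifest.json"):
    """Stale iff the current source hashes differ from the recorded ones."""
    try:
        with open(manifest_path) as f:
            recorded = json.load(f)
    except FileNotFoundError:
        return True  # no previous build recorded
    current = {s: file_hash(s) for s in sources}
    return current != recorded


def record_build(sources, manifest_path="build-manifest.json"):
    """Record the input hashes after a successful build."""
    with open(manifest_path, "w") as f:
        json.dump({s: file_hash(s) for s in sources}, f)
```

With this scheme, `touch source.d` changes nothing (same hash, no rebuild), while a genuinely edited file always triggers a rebuild, whatever date it carries.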
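And a tool-independent graph declaration really is all a build tool needs: once the DAG is stated, walking it in dependency order is a solved problem. A minimal sketch using Python's standard `graphlib` (the target names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each target maps to the set of
# prerequisites that must exist before it can be built.
graph = {
    "app":    {"main.o", "util.o"},
    "main.o": {"main.d"},
    "util.o": {"util.d"},
    "main.d": set(),
    "util.d": set(),
}


def build_order(graph):
    """Return an order that builds every prerequisite before its dependents."""
    return list(TopologicalSorter(graph).static_order())
```

`static_order()` also raises `graphlib.CycleError` on a dependency cycle, which is exactly the acyclicity check a build tool needs anyway.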
At the core of it, every build system out there is essentially just an implementation of a directed acyclic graph walk: a standard problem with standard algorithms to solve it. But everybody rolls their own implementation, gratuitously incompatible with everything else, and so we find ourselves today with multiple, incompatible build systems that, in large-scale software, often have to somehow co-exist within the same project.

> The big thing dub provides is package management. Having a package
> manager is an important thing for a language nowadays. Gone are the
> days of hunting for library source, figuring out where to put
> includes. Just add a line in your dub.json file and you have the
> library. Need to upgrade to newer version? Just change the version in
> dub.json file. Need to download the problem from scratch? No problem,
> dub can use the json file to download all the dependencies in proper
> versions.

Actually, I have the opposite problem. All too often, my projects that depend on some external library become uncompilable because said library has upgraded from version X to version Z and version X doesn't exist anymore (the oldest available version is now Y), or upstream made an incompatible change, or the network is down and dub can't download the right version, etc.

These days, I'm very inclined to just download the exact version of the source code that I need and include it as part of my source tree, just so there will be no gratuitous breakage due to upstream changes, old versions being no longer supported, or OS changes that break pre-shipped .so files, and all of that nonsense. Just compile the damn thing from scratch, from the exact version of the sources that you KNOW works -- sources that you have in hand RIGHT HERE, instead of somewhere out there in the nebulous "cloud" which happens to be unreachable right now, because your network is down, and in order to fix the network you need to compile this tool that depends on said missing sources.
I understand it's convenient for the package manager to "automatically" install dependencies for you, refresh to the latest version, and what-not. But frankly, I find that the amount of effort it takes to download the source code of some library and set up the include paths manually is minuscule compared to the dependency hell I have to deal with in a system like dub.

These days I almost automatically write off 3rd party libraries that have too many dependencies. The best kind of 3rd party code is the standalone kind, like the kind Adam Ruppe provides: just copy the lousy source file into your source tree and import it. 10 years later it will still compile and work as before, and won't suddenly die from missing .so files (because they've been replaced by newer, ABI-incompatible ones), from new upstream versions that broke the old API, or from source files that vanished from the dub cache when you migrated to a new hard drive -- which you can't re-download because your network is down -- or, worst of all, because one of the dependencies of the dependencies of the library you depend on has vanished into the ether and/or become gratuitously incompatible, so now you have to spend 5 hours upgrading the entire codebase and 5 days rewriting your code to work around the missing functionality, etc.

I'm not against fetching new versions of dependencies and what-not, but I'd like to do that as a conscious action, instead of some tool deciding to "upgrade" my project and leaving me with something uncompilable in the middle of a code-build-debug cycle. I don't *want* stuff upgraded behind my back when I'm trying to debug something! With Adam-style libraries, you just copy the damn file into your source tree and that's all there is to it. When you need to upgrade, just download the new file, copy it into your source tree, and recompile. If that breaks, roll back your code repo and off you go.
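For what it's worth, dub does at least let you pin an exact version with `==` rather than a range, which turns an upgrade back into a conscious act. A hypothetical dub.json (the package name and version number are invented for illustration):

```json
{
    "name": "myapp",
    "dependencies": {
        "somelib": "==1.2.3"
    }
}
```

Of course, pinning still doesn't help when the pinned version disappears from the registry -- which is the stronger argument for vendoring the source outright.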
I'm sick of the baroque nonsense that is the dependency hell of "modern" package managers, or worse, the *versioned* dependency hell of having to explicitly specify which version of the library you depend on. The source file you copied into the source tree *is* the version you're depending on, period. No fuss, no muss.

(And don't get me started on dub as a *build* tool. It's actually not bad as a *package manager*, but as a build tool I find it essentially unusable, because it does not support many things I need, like codegen and non-code build tasks such as data generation. Until dub is able to support those things, it's not even an option for many of my projects.)


T

-- 
All problems are easy in retrospect.
