On Saturday, 14 June 2014 at 16:34:35 UTC, Dmitry Olshansky wrote:
Consider something like the REST API generator I described during DConf. Different code is generated in different contexts from the same declarative description - both for the server and the client. Right now the simple fact that you import the very same module from both gives a solid 100% guarantee that API usage between those two programs stays in sync.
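To make the point concrete, a minimal sketch of such a shared module - the generator names (generateServer/generateClient) are hypothetical, only the pattern matters:

```d
module api;

// One declarative description of the protocol.
struct GetUser  { int id; }        // request
struct UserInfo { string name; }   // response

// Both programs import this very same module:
//   server.d:  mixin(generateServer!(GetUser, UserInfo));
//   client.d:  mixin(generateClient!(GetUser, UserInfo));
//
// Change the description and both sides regenerate together;
// a mismatch is a compile error, never a silent protocol drift.
```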

But let's face it - it's a one-time job to get it right in your favorite build tool. Then you have a fast, cached (re)build. By comparison, the costs of CTFE generation are paid in full during _each_ build.

There is no such thing as a one-time job in programming, unless you work alone and abandon any long-term maintenance. As time goes on, any mistake that can possibly happen will inevitably happen.

In your proposed scenario there will be two different generated files, imported by the server and the client respectively. A tiny typo in your build script will result in a hard-to-detect run-time bug while the code itself still happily compiles.

Or a link error, if we go a hybrid path where the imported module emits declarations/hooks via CTFE to be linked against by the properly generated code. This is something I think could be a practical solution.


What is the benefit of this approach over simply keeping all ctRegex bodies in a separate package, compiling it as a static library, and referring to it from the actual app by its own unique symbol? This does not need any changes in the compiler or Phobos; it is just a matter of project layout.
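A sketch of that layout, assuming dmd and illustrative file names:

```d
// patterns.d - compiled once into a static library:
//     dmd -lib -ofpatterns.a patterns.d
// app.d - links against it, paying the CTFE cost only when
// patterns.d itself changes:
//     dmd app.d patterns.a
module patterns;

import std.regex;

// The costly ctRegex instantiation lives behind one unique symbol.
auto identRe() { return ctRegex!`[a-zA-Z_]\w*`; }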

It does not work for more complicated cases where you actually need access to the generated sources (generated templates, for example).
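That is, a precompiled library only ships object code; when the generator emits a template, the importing module needs the generated source text itself in order to instantiate it. A minimal illustration:

```d
// The generated artifact is source text, not a symbol to link against.
enum generated = q{
    struct Message(T) { T payload; }
};

// Instantiable only where the source text is visible and mixed in -
// no static library can provide this.
mixin(generated);

auto m = Message!int(42);
```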

You may keep the convenience, but losing the guarantees hurts a lot. To verify the static correctness of your program (or group of programs), the type system needs to be aware of how the generated code relates to the original source.

The build system does that. We have this problem with all external dependencies anyway (i.e. who verifies that the right version of libXYZ is linked, and not some other?).

It is somewhat worse here, because you don't routinely change external libraries, as opposed to local sources.

A huge mess to maintain. In my experience, all build systems are incredibly fragile beasts; trusting them with something that impacts program correctness and won't be detected at compile time is just too dangerous.

Could be, but we have dub, which should be simple and nice. I have had a very positive experience with SCons and half-generated sources.

dub is terrible at defining any complicated build model. Pretty much anything that is not a single-step compile-them-all approach can only be done by calling an external shell script. If using external generators is necessary, I will take make over anything else :)
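For reference, dub's escape hatch for such multi-step builds is exactly that - shelling out via preGenerateCommands in dub.json (the script name here is hypothetical):

```json
{
    "name": "myapp",
    "preGenerateCommands": ["./gen_api.sh server > source/server_api.d"]
}
```

Everything inside the command string is opaque to dub, so none of the correctness guarantees discussed above apply to it.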


tl;dr: I believe we should improve compiler technology to achieve the same results, instead of promoting temporary hacks as the true way to do things. Relying on the build system is likely the most practical solution today, but it is not a solution I am satisfied with, and hardly one I can accept as an accomplished target.

An imaginary compiler that runs continuously as a daemon/service, is capable of JIT-ing, and provides basic dependency tracking as part of the compilation step should behave as well as any external solution, with much better correctness guarantees and overall user experience out of the box.
