On Tuesday, 18 February 2014 at 07:45:10 UTC, Paulo Pinto wrote:
It is like traveling back in time when parametric polymorphism was debated in university papers and everyone was inventing their own code generation tool.

Back to mid-90's compiler technology, when only Ada supported generics, C++ was starting to adopt some form of genericity at ISO meetings, and everything else was left to academic languages like Eiffel, Modula-3 or Standard ML.

We are in 2014, not in the early 90's. Ignoring what has happened in mainstream language design over the last 20 years is nothing more than an opinionated political decision against generics.

I understand the sentiment that it is 'backwards', but what exactly, on a practical level, is harmful about people writing their own code generation tools?

On Monday, 17 February 2014 at 22:53:47 UTC, Asman01 wrote:
I don't think so. Did you know that some of them are the same guys from Bell Labs who created C, UNIX, Plan 9, UTF-8, etc.?

I'm aware of that, but I'm also aware that there are few things in the world more agonisingly complex than writing a C++ compiler. I think I read Walter say somewhere that it took him ten years! That's ten years of domain-specific experience with generics in the language that has the most complex implementation of generics in existence. My impression is that no amount of experience in other aspects of language design would substitute for this specific experience. Hence I think it makes more sense to attribute the effectiveness of D's generics implementation to Walter's extensive experience implementing generics than to generics being easy to implement well. If generics were easy to implement, then why isn't there another language with the compile-time power of D that isn't a monster like C++ or a Lisp?
