On Thursday, 13 November 2014 at 13:46:20 UTC, Manu via Digitalmars-d wrote:
On 13 November 2014 22:01, via Digitalmars-d
<[email protected]> wrote:
On Thursday, 13 November 2014 at 11:44:31 UTC, Manu via Digitalmars-d wrote:
D has attribute inference, that's like, a thing now.

Yes, these days D arguments go like this:

A: "I am saying no because it would go against separate compilation units."

B: "I am saying yes because we have attribute inference."

A: "But when will it be implemented?"

B: "After we have resolved all issues in the bugtracker."

A: "But C++17 will be out by then!"

B: "Please don't compare D to C++; it is a unique language."

A: "And Rust will be out too!"

B: "Hey, that's a low blow. And unfair! Besides, linear types suck."

A: "But 'scope' is a linear type qualifier, kinda?"

B: "Ok, we will only do it as a library type then."

A: "How does that improve anything?"

B: "It changes a lot: it means Walter can focus on ironing out bugs, and Andrei will implement it after he has fixed the GC."

A: "When will that happen?"

B: "After he is finished adding ref counters to Phobos."

A: "I thought that was done?"

B: "Don't be unreasonable, Phobos is huge, it takes at least 6 months! Besides, it is obvious that we need to figure out how to do scope before completing ref counting anyway."

A: "I agree… Where were we?"

B: "I'm not sure. I'll try to find time to write a DIP."


I don't see anything in C++11/14/17 that looks like it'll salvage the language from the sea of barely decipherable template mess and
endless boilerplate. It seems they're getting deeper into that
madness, not out of it.

Stuff like auto on return types etc makes it easier and less verbose when dealing with templated libraries.

Unfortunately, I guess I can't use it on my next project anyway, since I need to support iOS 5.1, which probably means Xcode… 4? Sigh…

That's one of the things that annoy me with C++, the long tail for being able to use the new features.

I spent the last 2 days doing some string processing in C++...
possibly the least fun I've ever had programming. Somehow I used to
find it tolerable!

Ack… I try to stick to binary formats. Strings are only fun in languages like Python (and possibly Haskell).
