On Tuesday, 21 May 2013 at 10:53:04 UTC, Timon Gehr wrote:
On 05/21/2013 10:40 AM, Maxim Fomin wrote:
On Monday, 20 May 2013 at 23:10:48 UTC, Timon Gehr wrote:
...
In some notorious cases such artificiality created
problems (for local convenience at the expense of language consistency)

I don't see how this is relevant. The compiler behaviour is at the
expense of language consistency, as I have argued above.

are not easy to fix (ref), or are unfixable by design (@disable), or are broken and have been left untouched for ages (delegates).


I disagree with those assertions.
What is the conceived issue with delegates?

The conceived issue is that they are holes in the type system (unlike casts and unions, they are not supposed to be), not to mention the numerous bugs related to their implementation. See Bugzilla for particular examples.


I see. I reported a few of those recently. Some of them got fixed.

Clearly, bugs should not stop us from implementing a good feature, but here I see low benefits and some problems.
...

This is not about implementing a new feature. This is about fixing an implementation bug. Otherwise the compiler behaviour would need to be
carried over to the spec. Doing nothing about it is not valid.

You *think* that it is a bug, and this claim is not necessarily true.

A compiler bug is a discrepancy between spec and compiler. It is a bug given the current spec.

According to the spec, you are promised that a.foo(b) rewrites to foo(a,b), but you are not promised anything beyond that
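For example, a minimal sketch of that UFCS guarantee (identifiers invented for illustration):

```d
import std.stdio;

// A free function; UFCS lets it be called with member syntax.
int addTo(int a, int b) { return a + b; }

void main()
{
    int x = 1;
    writeln(addTo(x, 2)); // ordinary free-function call
    writeln(x.addTo(2));  // a.foo(b) rewritten to foo(a, b)
}
```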

The fundamental problem is obviously that the "spec" is not fully unambiguous, but consider:

https://en.wikipedia.org/wiki/Abstract_rewriting_system

You are free to disagree by using fundamental notions different from the usual ones, but if those are to be used by the language, they need to be defined by the spec.

E.g. we'd need to introduce hidden state into the AST: rewritten code would no longer be D code, which removes most of the benefit of using rewrite rules for specification in the first place. (But given that the rewriting system formed by the various rules from the spec does not appear to be confluent, this is a moot point anyway.)

(including several stages of rewrite for operator overloading).

Given the assumption that the spec implies a limit on the number of such "stages", why would the limit be 1 and not 2 or 3? Why not even other numbers?

Please stop presenting a POV as absolute truth.

Mathematical truth. To disagree with it you'd either:

[1] Question the axioms or definitions.
=> In this case, my argument is valid, but this is not relevant for you. I guess most of the arguing about language features goes on in this section. To settle disputes here, concrete examples showing the merits of your own viewpoint are usually useful.

[2] Show a flaw in reasoning.

[3] Be wrong.

According to the spec (operator overloading page and TDPL), operators are rewritten to specifically named *member* functions, and calling a *member* function is not the same thing as the syntax for calling free functions like member functions, because:
- non-member (UFCS) functions cannot be virtual or overridden
- they do not appear in the member-inspection traits (e.g. allMembers)
- they cannot access all fields in general.
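A small sketch of this member/UFCS distinction (identifiers here are invented for illustration):

```d
import std.stdio;

struct S
{
    int x;
    int twice() { return 2 * x; } // member: visible via __traits(allMembers, S)
}

// UFCS free function: callable as s.triple(), but it is not a member of S,
// so it cannot be virtual/overridden and is invisible to member traits.
int triple(S s) { return 3 * s.x; }

void main()
{
    auto s = S(7);
    writeln(s.twice());  // member call
    writeln(s.triple()); // UFCS call: rewritten to triple(s)
    // "twice" appears in the member list; "triple" does not:
    writeln([__traits(allMembers, S)]);
}
```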

Using your rewriting notation, operator overloading is defined as (a: expression, b: some member function other than opDispatch, c: opDispatch): if (a && b) then a -> b, else error. opDispatch is defined as: if (!b && c) then a -> c, else error.

So, the compiler works according to the spec.

By the way, your logic of a -> b -> c leads to the following flaw:

class A
{
   void opSlice(){}
   void opDispatch(string name)(){}
}
..
a[] // unconditionally calls opDispatch since '[]'->'opSlice'->'opDispatch'
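For reference, a self-contained sketch of that scenario (the behavior comment reflects what the current compiler does as far as I can tell, not a spec guarantee; opDispatch is given the compile-time name parameter the real hook requires):

```d
class A
{
    void opSlice() {}
    // opDispatch takes a compile-time string naming the missing member
    void opDispatch(string name)() {}
}

void main()
{
    auto a = new A;
    a[]; // under the a -> b -> c chain this would end up in opDispatch;
         // in practice the rewrite stops at opSlice
}
```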
