On Thursday, 29 September 2016 at 19:39:35 UTC, Jonathan M Davis wrote:
The language can't stop you from doing at least some arbitrary stuff with them (like making + do subtraction), but the whole goal was for user-defined types to be able to act like the built-in types, and as such, it would make no sense to alter them towards being treated like symbols that you can do whatever you want with.

Having `+` do subtraction isn't something you'd normally see; it's not a realistic use case.

Having `+` perform addition, but in a database layer, is a use case that may actually exist. The operator still behaves like the built-in one; it may perform the addition as part of a SQL query, for example.

Whether the expression `a + b` is translated into machine code or into SQL, it still performs addition: the value represented by `a` is added to the value represented by `b`. Whether `a` and `b` are variables in D or columns in a database table is irrelevant.
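
As a minimal sketch of what I mean (the `Column` type and its SQL-building scheme are made up for illustration, not taken from any real D database library):

struct Column
{
    string name;

    // `a + b` is lowered to `a.opBinary!"+"(b)`, so the overload can emit a
    // SQL fragment instead of doing machine arithmetic -- it still means
    // "add these two values", just in the database rather than in registers.
    string opBinary(string op : "+")(Column rhs) const
    {
        return name ~ " + " ~ rhs.name;
    }
}

void main()
{
    import std.stdio : writeln;
    auto price = Column("price");
    auto tax   = Column("tax");
    writeln("SELECT ", price + tax, " FROM orders"); // SELECT price + tax FROM orders
}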

And as it stands, D can already do this. The problem is the inability to do the equivalent for the expression `a > b`.
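
For contrast, a sketch of why `>` can't get the same treatment: the comparison operators don't go through opBinary at all, and whatever opCmp returns has to work in an integer comparison against 0.

struct Column
{
    string name;

    // `a > b` is rewritten by the compiler as `a.opCmp(b) > 0`, so the return
    // value must be comparable with 0 -- there is no hook that would let the
    // expression produce a SQL fragment such as "price > tax".
    int opCmp(const Column rhs) const
    {
        // Orders columns by name; purely illustrative.
        return name < rhs.name ? -1 : (name > rhs.name ? 1 : 0);
    }
}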
