On Saturday, 4 August 2018 at 01:45:44 UTC, Laeeth Isharc wrote:
On Friday, 3 August 2018 at 22:55:51 UTC, Rubn wrote:

The difference is they would have to rework their existing code. If you are writing D source code bindings for your code, then you are essentially writing new code. You don't have to worry about backwards compatibility.

Why would you write bindings if the computer can do it for you, better, faster and consistently?

With the current tools, the ones that generate D files for direct use aren't very good. They evaluate macros based on the system running the tool: if a header has a #define for MACHINE_X86 or MACHINE_X64, those macros and #if's are resolved at generation time instead of being translated into equivalent version() statements.
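A minimal sketch of the difference (the header guard and the `uptr` typedef are hypothetical, just to illustrate the point):

```d
// A C header guarded like this:
//
//   #if defined(MACHINE_X64)
//   typedef unsigned long long uptr;
//   #else
//   typedef unsigned int uptr;
//   #endif
//
// run through a naive generator on a 64-bit machine bakes in one branch:
//
//   alias uptr = ulong;   // wrong when the D code is later built for 32-bit
//
// A better translation defers the decision to the D compile itself,
// using the compiler's predefined version identifiers:
version (X86_64)
    alias uptr = ulong;
else
    alias uptr = uint;
```

The second form stays correct no matter which architecture the generated bindings are eventually compiled for.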

That's the idea behind dpp. You can just #include C headers and they are translated at build time, before the D compiler sees them. This is very helpful with projects like HDF5 that consider it acceptable to replace a function with a macro in a minor release.
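A minimal sketch of what that looks like in practice, assuming the HDF5 development headers are installed (`H5open` is a real HDF5 C function, but treat the details here as illustrative):

```d
// app.dpp — dpp lets a D source file #include C headers directly.
// Running `d++ app.dpp` expands the preprocessor and translates the
// declarations before handing the result to the D compiler.
#include <hdf5.h>

void main()
{
    // Called like any C function. Because dpp re-runs the
    // preprocessor at every build, it keeps working even if a
    // minor HDF5 release turns H5open into a macro.
    auto status = H5open();
}
```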

Wrappers are a bit different.

In time C++ will follow.

I wouldn't call that the same thing as what generally defines a wrapper. It's a different concept that does the work at compile time. If I remember correctly Clang has something similar, where two languages can call each other once they've been compiled to Clang's intermediate representation. Or at least it was being worked on a while ago; I never used it though.

Not only that, who do you think even writes bindings for libraries? Most bindings to other languages are done by the community. How many companies do you know that provide D bindings for their C/C++ libraries and maintain them?

We do for a few things - for other people's libraries, not our own. But it would be better not to have to write bindings at all.

It would be, but I don't think it'll ever be 100% and will require manual intervention.

Damn hell no. That's what modules are for. So why are you trying to implement namespaces in D under the guise of C++ name mangling?

I don't think either Manu or Atila wants to sneak in namespaces by the back door. They just want to be able to easily control the mangling of C++-linkage symbols.

Never said they did, but that's what Walter and the current implementation seem to indicate. I'd rather just have extern(C++) do what extern(C) and extern(D) do - just change the name mangling, not try to emulate namespace features like it currently does.

What extern(C++) should be used for is calling C++ code from D, not mirroring C++ code layout in D. The only problem with breaking code up into multiple files is that the result isn't 1:1 with the C++ sources. That's not a technical problem; it's a conflict of personal opinion. If it's not 1:1, who cares? If for some odd reason you have two namespaces in one C++ file, odds are they're separated within that file anyway. And if the code really does have multiple namespace definitions smashed together in one file, flip-flopping between namespaces, then it's not D's job to fix that project's poor organization.
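To make the objection concrete, here is a sketch of the current behaviour being argued about (`mylib` and `frobnicate` are made-up names):

```d
// extern(C++, identifier) does two things at once: it sets the C++
// mangling AND introduces `mylib` as a symbol scope on the D side —
// the namespace emulation being objected to above.
extern (C++, mylib)
{
    // Mangled as mylib::frobnicate(int) for the C++ linker.
    void frobnicate(int x);
}

void usage()
{
    frobnicate(1);       // found by ordinary unqualified lookup
    mylib.frobnicate(1); // but `mylib` also exists as a named scope in D
}
```

For what it's worth, later compilers added a string form, extern(C++, "mylib"), that affects only the mangling without creating a D-side scope, which is closer to what's being asked for here.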

For automatically translated bindings I think the request is not unreasonable, because there's considerable value in being able to just #include C++ headers, as you already can with C headers, and call the C++ code from D. D doesn't have libraries? Well, it's got 1500 on code.dlang.org, plus C and C++ libraries. What is it you think is missing? That's a good retort!

I understand from Atila that the present choice just makes it a lot more complicated, not impossible.

The only person I've seen who wants this is Walter. I haven't seen anyone else show interest in a 1:1 correlation. It's unreasonable; D isn't C++, nor should it strive to be.

Where are const pointers? Where are default constructors for structs - which make structs non-POD and thus change the calling convention used by C++ (on Windows, at least)?
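The const-pointer gap can be sketched in a few lines. This is an illustration only; the point is that D's transitive const cannot express C++'s "const pointer to mutable data":

```d
// C++ has three flavours of pointer constness:
//   const int*       — pointer to const       → D: const(int)*
//   int* const       — const pointer, mutable pointee — no D equivalent
//   const int* const — both const             → D: const(int*)
void takesPtrToConst(const(int)* p) {}

void main()
{
    int x = 42;

    const(int)* pc = &x;   // fine: pointer to const data

    // D's const is transitive: const(int*) freezes the pointee as
    // well as the pointer, so C++'s `int* const` (fixed pointer,
    // writable target) simply cannot be declared.
    const(int*) both = &x;
    takesPtrToConst(both);
}
```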
