ndslice v2 is coming soon

2018-07-28 Thread 9il via Digitalmars-d

Hi,

PR: https://github.com/libmir/mir-algorithm/pull/143

Features
===

 * Slice and Series are C++ ABI compatible without additional 
wrappers.
  See example: 
https://github.com/libmir/mir-algorithm/tree/devel/cpp_example


 * Intuitive API with default params and without explicit 
dimension packs

 ```
   alias Slice = mir_slice;
   struct mir_slice(Iterator, size_t N = 1, SliceKind kind = Contiguous)
 ```
 For example, the double[] analog is just Slice!(double*) / mir_slice.
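
 A quick usage sketch under those v2 defaults (an illustration only, 
assuming mir-algorithm is available as a dub dependency; `sliced` and 
`iota` are existing mir.ndslice helpers):

 ```
 import mir.ndslice;

 void main()
 {
     // the 1D contiguous case: typeof(v) is Slice!(double*)
     auto v = [1.0, 2.0, 3.0].sliced;
     assert(v[2] == 3.0);

     // no explicit dimension packs for higher ranks either
     auto m = iota(2, 3);      // lazy 2x3 matrix of 0 .. 5
     assert(m[1, 2] == 5);
 }
 ```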


Best,
Ilya Yaroshenko



Re: Is there any good reason why C++ namespaces are "closed" in D?

2018-07-28 Thread Nicholas Wilson via Digitalmars-d

On Sunday, 29 July 2018 at 03:20:29 UTC, Walter Bright wrote:

On 7/28/2018 11:18 AM, Manu wrote:
Make a PR that implements namespace as a string... I will use 
that fork of D forever.


1. Look how it is mangled on the C++ side. (Use "grep" on the 
object file.)


2. Use:

   pragma(mangle, "the mangled name")


That a) doesn't scale in a real dynamic codebase (think 
templates), and b) is platform dependent and so isn't a proper 
solution.


Honestly the only problem with Manu's suggestion is you can't 
expose `a::foo` and `b::foo` with the same signature within the 
same module due to (D) name collisions, although I don't see any 
reason why we can't do both.
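
A minimal sketch of that collision under the *proposed* string form 
(the extern(C++, "...") syntax below is the proposal being discussed, 
not the D of today):

```
extern(C++, "a") void foo(int);  // would mangle as a::foo(int)
extern(C++, "b") void foo(int);  // same D-level name `foo` in the same
                                 // module scope -> declaration conflict
```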


Then again I don't see any (non philosophical/compiler front end 
internal) issue why you can't reopen a namespace. D is supposed 
to be pragmatic, after all.


Trailing comma in variable declaration

2018-07-28 Thread Ky-Anh Huynh via Digitalmars-d

Hi,

wouldn't it be nice to have a trailing comma in variable declarations:

[code]
  bool
verbose = false,
download_only = false,
no_confirm = false,
show_help = false,
show_version = false,
list_ops = false,
;
[/code]

As a trailing comma is possible (and it's great) for arrays, enums, 
etc., I wonder why we don't have this fancy thing for variable 
declarations.
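
For comparison, here is where D already accepts the trailing comma 
(a small illustrative snippet):

[code]
  // already legal today:
  int[] flags = [
      1,
      2,
      3,      // trailing comma accepted in array literals
  ];

  enum Option {
      verbose,
      download_only,   // ... and in enum member lists
  }
[/code]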


Thanks for reading.


Re: Is there any good reason why C++ namespaces are "closed" in D?

2018-07-28 Thread Manu via Digitalmars-d
On Sat., 28 Jul. 2018, 8:25 pm Walter Bright via Digitalmars-d, <
digitalmars-d@puremagic.com> wrote:

> On 7/28/2018 11:18 AM, Manu wrote:
> > Make a PR that implements namespace as a string... I will use that fork
> of D
> > forever.
>
> 1. Look how it is mangled on the C++ side. (Use "grep" on the object file.)
>
> 2. Use:
>
> pragma(mangle, "the mangled name")


Don't troll me on this one, this is a very sensitive topic!
I could have a legit mental breakdown ;)

>


Re: Is there any good reason why C++ namespaces are "closed" in D?

2018-07-28 Thread Walter Bright via Digitalmars-d

On 7/28/2018 11:18 AM, Manu wrote:
Make a PR that implements namespace as a string... I will use that fork of D 
forever.


1. Look how it is mangled on the C++ side. (Use "grep" on the object file.)

2. Use:

   pragma(mangle, "the mangled name")



Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Ali Çehreli via Digitalmars-d

On 07/28/2018 05:43 AM, Laeeth Isharc wrote:

> It's not that bad calling D from Java.

Using D's GC from a thread that is started by an external runtime (like 
Java's) can be problematic. If a D function on another D-runtime thread 
needs to run a collection, the GC will not know about this Java thread 
and won't stop it. One outcome is a crash if this thread continues to 
allocate while the other one is collecting.


The solution is to call thread_attachThis() upon entry to the D 
function and thread_detachThis() upon exit. However, there are bugs with 
these functions, for which I posted a pull request (and abandoned it 
because of 32-bit OS X test failures).
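
A minimal sketch of that pattern (assuming a plain extern(C) entry 
point called from Java via JNI; the function name is made up for 
illustration):

```
import core.thread : thread_attachThis, thread_detachThis;

extern(C) int d_entry_called_from_java(int x)
{
    thread_attachThis();              // make this foreign thread known to D's GC
    scope(exit) thread_detachThis();  // detach again before returning to Java

    // ... code that may allocate from the GC ...
    return x * 2;
}
```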


I think a better option would be to forget about all that and not do any 
GC work in the D function that is called from Java. This simple function 
should just send a message to a D-runtime thread and return to Java.


Ali



Re: Faster printing of floats; is this something that D could benefit from?

2018-07-28 Thread PaperBot via Digitalmars-d

On Saturday, 28 July 2018 at 16:52:08 UTC, Eugene Wissner wrote:
On Saturday, 28 July 2018 at 16:02:22 UTC, Gary Willoughby 
wrote:

I just saw this on hacker news:


[...]


Links:

https://pldi18.sigplan.org/event/pldi-2018-papers-ry-fast-float-to-string-conversion
https://www.youtube.com/watch?v=kw-U6smcLzk
https://github.com/ulfjack/ryu


Thanks, it looks very interesting (and hey, it is a bit too 
late; I just finished implementing errol in D today :)).


After a quick look at the texts, I can't find information about 
optimality. They compare Ryū mostly to Grisu, and Grisu always 
produces correct, but not always optimal, results, so it falls 
back to a slower algorithm for some values.
It would also be nice to have a paper; I see only videos. 
Maybe a paper will come soon.


Paper: https://dl.acm.org/ft_gateway.cfm?id=3192369&type=pdf


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Walter Bright via Digitalmars-d

On 7/28/2018 7:09 AM, Laeeth Isharc wrote:
Opportunities are 
abundant where people aren't looking because they don't want to.


My father told me I wasn't at all afraid of hard work. I could lie down right 
next to it and go to sleep.


Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Walter Bright via Digitalmars-d

On 7/25/2018 4:27 PM, Laeeth Isharc wrote:
I think it's more interesting to be the change you 
wish to see in the world.


Haha, the whole point of me starting D. I was tired of trying to convince 
entrenched interests (and I wasn't very good at that).





Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Dibyendu Majumdar via Digitalmars-d

https://sqlite.org/whyc.html

Personally I think the D team should try to convince some well-known 
project to switch from C to D. Not many projects are written in C 
these days, though... but SQLite is among the few.




Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 21:27:12 UTC, bpr wrote:


I hear you. You're looking (roughly) for a better 
Java/Go/Scala, and I'm looking for a better C/C++/Rust, at 
least for what I work on now. I don't think D can be both right 
now, and that the language which can satisfy both of us doesn't 
exist yet, though D is close.


Yes, this. In the light of D's experience, is it even possible to 
have a language that satisfies both?





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 20:34:37 UTC, Abdulhaq wrote:

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)



Sure, I plucked it out of thin air. But I do think of the 
software development world as an inverted pyramid in terms of 
performance demands and headcount. At the bottom of my inverted 
pyramid I have Linux and Windows. This code needs to be as 
performant as possible and bug free as possible. C/C++/D shine 
at this stuff. However, I number those particular developers in 
the thousands.


The developers at Mozilla working on the browser internals, for 
example, are unaccounted for in your analysis. As are the 
developers where I work.


I think a great bulk of developers, though, sit at the 
application development layer. They are pumping out great 
swathes of Java etc. Users of Spring and dozens of other 
frameworks. C++ is usually the wrong choice for this type of 
work, but can be adopted in a mistaken bid for performance.


I don't know that the great bulk of developers work in Java.


And how many are churning out all that JavaScript and PHP code?

Hence I think that the number of developers who really need top 
performance is much smaller than the number who don't.


I'd be willing to accept that, but I have no idea what the actual 
numbers are.


If I had to write CFD code, and I'd love to have a crack, then 
I'd really be wanting to use D for its expressiveness and 
performance. But because of the domain that I do work in, I 
feel that I am no longer in D's target demographic.


If I had to write CFD code, and I wanted to scratch an itch to 
use a new language,
I'd probably pick Julia, because that community is made up of 
scientific computing
experts. D might be high on my list, but not likely the first 
choice. C++ would be in there too :-(.




I remember the subject of write barriers coming up in order (I 
think?) to improve the GC. Around that time Walter said he 
would not change D in any way that would reduce performance by 
even 1%.


Here we kind of agree. If D is going to support a GC, I want a 
state of the art precise GC like Go has. That may rule out some D 
features, or incur some cost that
high performance programmers don't like, or even suggest two 
kinds of pointer (a la Modula-3/Nim), which Walter also dislikes.


Hence I feel that D is ruling itself out of the application 
developer market.


At this stage in its life, I don't think D should try to be all 
things to all programmers, but rather focus on doing a few things 
way better than the competition.


That's totally cool with me, but it took me a long time to realise 
that this was the case and that it was therefore less promising 
to me than it had seemed before.


I hear you. You're looking (roughly) for a better Java/Go/Scala, 
and I'm looking for a better C/C++/Rust, at least for what I work 
on now. I don't think D can be both right now, and that the 
language which can satisfy both of us doesn't exist yet, though D 
is close.







Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 19:55:56 UTC, bpr wrote:

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)



Sure, I plucked it out of thin air. But I do think of the 
software development world as an inverted pyramid in terms of 
performance demands and headcount. At the bottom of my inverted 
pyramid I have Linux and Windows. This code needs to be as 
performant as possible and bug free as possible. C/C++/D shine at 
this stuff. However, I number those particular developers in the 
thousands.


Then we have driver writers. Performance is important here, but as 
a user I wish they would concentrate on the 'bug-free' part a bit 
more. Especially those cowboys who develop printer and bluetooth 
drivers. Of course, according to them it's the hardware that stinks. 
These guys and gals number in the tens of thousands. Yes, I made 
that up.


Then, a layer up, we have libc developers and co. Then platform 
developers: Unity and Lumberyard for games, Apache.


I think a great bulk of developers, though, sit at the 
application development layer. They are pumping out great swathes 
of Java etc. Users of Spring and dozens of other frameworks. C++ 
is usually the wrong choice for this type of work, but can be 
adopted in a mistaken bid for performance.


And how many are churning out all that JavaScript and PHP code?

Hence I think that the number of developers who really need top 
performance is much smaller than the number who don't.




For you, perhaps. I currently work mostly at a pretty low level 
and I'm pretty sure it's not just self delusion that causes us 
to use C++ at that low level. Perhaps you've noticed the rise 
of Rust lately? Are the Mozilla engineers behind it deluded in 
that they eschew GC and exceptions? I doubt it. I mostly prefer 
higher level languages with GCs, but nothing in life is free, 
and GC has significant costs.


If I had to write CFD code, and I'd love to have a crack, then 
I'd really be wanting to use D for its expressiveness and 
performance. But because of the domain that I do work in, I feel 
that I am no longer in D's target demographic.


I remember the subject of write barriers coming up in order (I 
think?) to improve the GC. Around that time Walter said he would 
not change D in any way that would reduce performance by even 1%. 
Hence I feel that D is ruling itself out of the application 
developer market. That's totally cool with me, but it took me a long 
time to realise that this was the case and that D was therefore 
less promising to me than it had seemed before.





Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread bpr via Digitalmars-d

On Saturday, 28 July 2018 at 15:36:43 UTC, Abdulhaq wrote:
I think that I no longer fall into the category of developer 
that D is after. D is targeting pedal-to-the-metal 
requirements, and I don't need that. TBH I think 99% of 
developers don't need it.


I'm 99% sure you just made that number up ;-)

For those developers who don't need the performance usually 
achieved with C or C++, and can tolerate GC overheads, there are, 
IMO, better languages than D. I'm not saying that here to be 
inflammatory, just that I believe performance is a very big part 
of the attractiveness of D.


If you're mostly working on Android, then Kotlin seems like your 
best option for a non-Java language. It seems OK, there's a 
Kotlin native in the works, the tooling is fine, there's a REPL, 
etc. I like it better than I like Go.


We like to think we do and we love to marvel at the speed of 
improved code, but like prediction, it's overrated ;-)


For you, perhaps. I currently work mostly at a pretty low level 
and I'm pretty sure it's not just self delusion that causes us to 
use C++ at that low level. Perhaps you've noticed the rise of 
Rust lately? Are the Mozilla engineers behind it deluded in that 
they eschew GC and exceptions? I doubt it. I mostly prefer higher 
level languages with GCs, but nothing in life is free, and GC has 
significant costs.





Re: Faster printing of floats; is this something that D could benefit from?

2018-07-28 Thread Andre Pany via Digitalmars-d

On Saturday, 28 July 2018 at 16:02:22 UTC, Gary Willoughby wrote:

I just saw this on hacker news:

We present Ryū, a new routine to convert binary floating point 
numbers to their decimal representations using only fixed-size 
integer operations, and prove its correctness. Ryū is simpler 
and approximately three times faster than the previously 
fastest implementation.


Links:

https://pldi18.sigplan.org/event/pldi-2018-papers-ry-fast-float-to-string-conversion
https://www.youtube.com/watch?v=kw-U6smcLzk
https://github.com/ulfjack/ryu


Added to the list of SAOC 2018 ideas
https://wiki.dlang.org/SAOC_2018_ideas#Implementation_of_RYU_to_convert_floats_to_strings

Kind regards
André


Re: Is there any good reason why C++ namespaces are "closed" in D?

2018-07-28 Thread Manu via Digitalmars-d
On Fri., 27 Jul. 2018, 3:55 pm Walter Bright via Digitalmars-d, <
digitalmars-d@puremagic.com> wrote:

> Namespaces are a botch in C++, and it is understandable that C++ code
> bases
> naturally have grown willy-nilly to utterly ignore any encapsulation
> principles.


Correct. And D has modules. Solved.

Literally nobody has ever wanted to use C++ namespaces as a means of
encapsulation in D. We *just* want to mangle our symbol names. We want to
keep our code organised consistently with all other D code.

Please, please, please... Please, please please please please please PLEASE
support extern(C++, "string_ns") as a mangle-only variant.

Current behaviour can coexist, but let us have a way to express a mangling
request without changing the organisation of our D code.

I suggest accepting string, since that will allow us to also access C++
namespaces that conflict with D keywords.
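
A small sketch of the keyword-clash point (the string form below is
the proposal, not accepted syntax at the time of writing):

```
// C++ side:  namespace version { int get(); }
// extern(C++, version) can't even be parsed, since `version` is a D
// keyword; the proposed string form would sidestep that:
extern(C++, "version") int get();
```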


Re: Is there any good reason why C++ namespaces are "closed" in D?

2018-07-28 Thread Manu via Digitalmars-d
On Fri., 27 Jul. 2018, 10:30 am Atila Neves via Digitalmars-d, <
digitalmars-d@puremagic.com> wrote:

> I understand that being able to "reopen" namespaces in C++ is
> contentious - anybody can add to the `std` namespace in their own
> code. D doesn't have anything like it, and instead has packages
> and modules. So far, so good.
>
> But why does this not compile?
>
> extern(C++, ns) { void foo(); }
> extern(C++, ns) { void bar(); }
>
> I could maybe understand the limitation if those functions had
> bodies since we'd be importing the namespace functionality from
> C++ in a sense (and even then I'm not sure it's a big enough
> deal). But all I'm trying to do here is tell the D compiler how
> to mangle symbols.
>
> Why would this matter? Imagine a project that parses C++ headers
> and translates them to D declarations. Imagine that project is
> trying to parse `#include `. There will be many, many
> instances of `namespace std` in there, but such a
> not-so-hypothetical program can't just go through them and open
> and close `extern(C++, std)` as it goes along. Such a program can
> easily do that to `extern(C)`, but doing that to `extern(C++)` is
> for some reason not allowed.
>
> (is there even any semantic difference? extern(C) for a 2nd time
> is just reopening the global namespace!)
>
> One could simply manually `pragma(mangle)` everything up the
> wazoo, but unfortunately that doesn't work for templates, which,
> as it turns out, is pretty much everything inside the `std`
> namespace.
>
> My only solution is to keep track of all namespaces at all times
> and then sort the declarations by namespace, which:
>
> 1) Is incredibly tedious
> 2) Might cause problems with the order of declarations,
> especially if macros are involved.
>
> I can only assume nobody has tried calling a large C++ library
> from D (Qt doesn't count, no namespaces). Imagine manually
> organising namespaces in one huge bindings file instead of being
> able to copy the file layout of the C++ headers!
>
> Sigh.
>
> Atila
>

+1

You know how many times I've been upset about this, including the same day
that the C++ namespace feature was merged without consulting anyone in the
community that intended to use it.

Make a PR that implements namespace as a string... I will use that fork of
D forever.

>


Re: Faster printing of floats; is this something that D could benefit from?

2018-07-28 Thread Eugene Wissner via Digitalmars-d

On Saturday, 28 July 2018 at 16:02:22 UTC, Gary Willoughby wrote:

I just saw this on hacker news:

We present Ryū, a new routine to convert binary floating point 
numbers to their decimal representations using only fixed-size 
integer operations, and prove its correctness. Ryū is simpler 
and approximately three times faster than the previously 
fastest implementation.


Links:

https://pldi18.sigplan.org/event/pldi-2018-papers-ry-fast-float-to-string-conversion
https://www.youtube.com/watch?v=kw-U6smcLzk
https://github.com/ulfjack/ryu


Thanks, it looks very interesting (and hey, it is a bit too late; 
I just finished implementing errol in D today :)).


After a quick look at the texts, I can't find information about 
optimality. They compare Ryū mostly to Grisu, and Grisu always 
produces correct, but not always optimal, results, so it falls 
back to a slower algorithm for some values.
It would also be nice to have a paper; I see only videos. 
Maybe a paper will come soon.


Faster printing of floats; is this something that D could benefit from?

2018-07-28 Thread Gary Willoughby via Digitalmars-d

I just saw this on hacker news:

We present Ryū, a new routine to convert binary floating point 
numbers to their decimal representations using only fixed-size 
integer operations, and prove its correctness. Ryū is simpler 
and approximately three times faster than the previously 
fastest implementation.


Links:

https://pldi18.sigplan.org/event/pldi-2018-papers-ry-fast-float-to-string-conversion
https://www.youtube.com/watch?v=kw-U6smcLzk
https://github.com/ulfjack/ryu
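
As a rough illustration of the property in question (not Ryū itself): 
the output must be the shortest decimal string that still parses back 
to exactly the same double. A quick D check of the round-trip half of 
that requirement, using std.conv only for demonstration:

```
import std.conv : to;
import std.stdio : writeln;

void main()
{
    double x = 0.1;
    string s = x.to!string;    // "0.1" with the default formatting
    assert(s.to!double == x);  // the round-trip requirement Ryū guarantees
    writeln(s);                // Ryū additionally picks the shortest such digits
}
```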


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:


It's tough when dealing with genuine Knightian uncertainty or 
even more radical versions.  When one doesn't even know the 
structure of the problem, maximising expected utility doesn't 
work.  One can look at capacities (Choquet and the like) but 
then it's harder to say something useful about what you should do.




Sounds interesting, I'll look into it.



But it's a loop and one never takes a final decision to master 
D. Also habits, routines and structures _do_ shape perception.




In truth I avoid discussions that are really just arguing 
about definitions of words, but you made a couple of sweeping 
bumper-stickery comments


That's entertaining.  I've not been accused of that before!  
Bear in mind also I tend to write on my phone.




I think I was just in need of a decent conversation. I didn't 
mean it in an accusatory manner :-). TBH I read those comments as 
coming from a D advocate who was in a motivational mood. They 
triggered a debate in me that has been wanting to come out, but I 
rarely contribute to forums these days.


Yes I read Kahneman et al papers for the first time in 92 in 
the university library.  I speed-read his book, and I thought 
it was a bad book.  I work with a specialist in making 
decisions under uncertainty - she was the only person able to 
articulate to George Soros how he made money because he 
certainly couldn't, and she is mentioned in the preface to the 
revised version of Alchemy.  She has the same view as me - 
behavioural finance is largely a dead end.  One learns much 
more by going straight to the neuroeconomics and incorporating 
also the work of Dr Iain Macgilchrist.


Kahneman makes a mistake in his choice of dimension.  There's 
analytic and intuitive/gestalt and in my experience people 
making high stakes decisions are much less purely analytical 
than a believer in the popular Kahneman might suggest.


What I said about prediction being overrated isn't 
controversial amongst a good number of the best traders and 
business people in finance.  You might read Nassim Taleb also.




You're way ahead of me here, obviously. I didn't read any Taleb 
until he made an appearance at the local bookshop. It was Black 
Swan and it didn't say anything that hadn't independently 
occurred to me already. However, for some reason it seemed to be 
a revelation to a lot of people.




Well it's a pity the D Android ecosystem isn't yet mature.  
Still I remain in awe of the stubborn accomplishment of the man 
(with help) who got LDC to run on Android.


It's not that bad calling D from Java.  Some day I will see if 
I can help automate that - Kai started working on it already I 
think.




D as a programming language has numerous benefits over Java, but 
trying to analyse why I would nevertheless choose Kotlin/Java for 
Android development:


* The Android work I do largely does not need high performance at a 
low level. The important thinking is in the user interface, how 
communication with the servers should look for good performance, 
caching, etc. Designing good algorithms.


* Having done the above, I want a low friction way of getting 
that into code. That requires a decent expressive language with a 
quality build system that can churn out an APK without me having 
to think too hard about it. Kotlin/JDK8 are good enough and 
Android Studio helps a lot.


* Given the above, choosing D to implement some of the code would 
just be a cognitive and time overhead. It's no reflection on D in 
any way, it's just that all the tooling is for Java and the 
platform API/ABI is totally designed to host Java.


* "The man who (with help) got LDC to run on Android". The team, 
with the best will in the world, is too small to answer all the 
questions that the world of pain known as Android can throw up. 
Why doesn't this build for me? Gradle is killing me... Dub 
doesn't seem to be working right after the upgrade to X.Y... it 
works on my LG but not my Samsung... I've upgraded this but now 
that doesn't work anymore...


* Will there be a functioning team in 5 years' time? Will they 
support older versions of Android? Can I develop on Windows? Or 
Linux? Why not?, etc., etc.



Since you already know D you need to answer a different 
question.
 What's the chance the compiler will die on the relevant 
horizon, and how bad will it be for me if that happens.  
Personally I'm not worried.   If D should disappear in a few 
years, it wouldn't be the end of the world to port things.  I 
just don't think that's very likely.




I answered the Android question already. As for engineering/scientific 
work (I design/develop engineering frameworks/tools for wing 
designers), Python has bindings to numpy, Qt, CAD kernels, and data 
visualisation tools. Python is fast enough to string those things 
together and run the overarching algorithms and GUIs, launch trade 
studies, and run scipy optimisations. It has even more expressiv

Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Paolo Invernizzi via Digitalmars-d

On Saturday, 28 July 2018 at 14:09:44 UTC, Laeeth Isharc wrote:

On Saturday, 28 July 2018 at 13:55:31 UTC, Paolo Invernizzi


Perceptions, expectations, prediction...   an easy read I 
suggest on the latest trends [1], if someone is interested...


I forgot the link... here it is:
https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710

Yes - it's a competitive advantage, but opportunity often comes 
dressed in work clothes.


Curiosity is the salt of evolution... for example I'm now 
intrigued by the Master and His Emissary, I've to read it.


And another curiosity: I studied in the '90s in Milano; what were 
your thoughts on Hayek and von Mises in those times? Classical 
economics was so boring...


We're in an era when most people are not used to discomfort and 
have an inordinate distaste for it.  If you're fine with that 
and make decisions as best you can based on objective factors 
(objectivity being something quite different from 
'evidence-based' because of the drunk/lamppost issue) then 
there is treasure everywhere (to steal Andrey's talk title).  
Opportunities are abundant where people aren't looking because 
they don't want to.


Me and my colleague are pretty different, in the approach to that 
kind of stuff...


Maybe I'll post on the Forum a 'Request for D Advocacy', a-la 
PostgreSQL, so the community can try to address some of his 
concerns about modern D, and lower his discomfort!


:-P

/Paolo



Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Laeeth Isharc via Digitalmars-d

On Saturday, 28 July 2018 at 13:55:31 UTC, Paolo Invernizzi wrote:

On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:


each project I
start I give some very hard thought about which development 
environment I'm going to use, and D is often one of those 
options. The likely future of D on the different platforms is 
an important part of that assessment, hence 'predicting' the 
future of D, hard and very unreliable though that is, is an 
important element in some of my less trivial decisions.


Since you already know D you need to answer a different 
question.
 What's the chance the compiler will die on the relevant 
horizon, and how bad will it be for me if that happens.  
Personally I'm not worried.   If D should disappear in a few 
years, it wouldn't be the end of the world to port things.  I 
just don't think that's very likely.


Of course it depends on your context.  The people who use D at 
work seem to be more principals, who have the right to take the 
best decision as they see it, than agents, who must persuade 
others who are the real decision-makers.  That's a recipe for 
quiet adoption that's dispersed across many industries 
initially, and for the early adopters of D being highly 
interesting people.  Since, as the Wharton professor Adam 
Grant observes, we are in an age where positive disruptors can 
achieve a lot within an organisation, that's also rather 
interesting.


A very interesting discussion... really.

Perceptions, expectations, prediction...   an easy read I 
suggest on the latest trends [1], if someone is interested...


BTW, Laeeth is right in the last paragraph, too. I was one of 
the 'principals' who took the decision to use D in production, 
14 years ago, and he described the reasoning of that era very 
well.


Today I'm still convinced that the adoption of D is a 
competitive advantage for a company; I definitely have to work 
on improving my bad temper (eheh) to persuade my current CTO to 
give it another chance.


/Paolo (btw, I'm the CEO...)


Thanks for the colour, Paolo.

Yes - it's a competitive advantage, but opportunity often comes 
dressed in work clothes.  We're in an era when most people are 
not used to discomfort and have an inordinate distaste for it.  
If you're fine with that and make decisions as best you can based 
on objective factors (objectivity being something quite different 
from 'evidence-based' because of the drunk/lamppost issue) then 
there is treasure everywhere (to steal Andrey's talk title).  
Opportunities are abundant where people aren't looking because 
they don't want to.


Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Paolo Invernizzi via Digitalmars-d

On Saturday, 28 July 2018 at 12:43:55 UTC, Laeeth Isharc wrote:


each project I
start I give some very hard thought about which development 
environment I'm going to use, and D is often one of those 
options. The likely future of D on the different platforms is 
an important part of that assessment, hence 'predicting' the 
future of D, hard and very unreliable though that is, is an 
important element in some of my less trivial decisions.


Since you already know D you need to answer a different 
question.
 What's the chance the compiler will die on the relevant 
horizon, and how bad will it be for me if that happens.  
Personally I'm not worried.   If D should disappear in a few 
years, it wouldn't be the end of the world to port things.  I 
just don't think that's very likely.


Of course it depends on your context.  The people who use D at 
work seem to be more principals, who have the right to take the 
best decision as they see it, than agents, who must persuade 
others who are the real decision-makers.  That's a recipe for 
quiet adoption that's dispersed across many industries 
initially, and for the early adopters of D being highly 
interesting people.  Since, as the Wharton professor Adam 
Grant observes, we are in an age where positive disruptors can 
achieve a lot within an organisation, that's also rather 
interesting.


A very interesting discussion... really.

Perceptions, expectations, prediction...   an easy read I suggest 
on the latest trends [1], if someone is interested...


BTW, Laeeth is right in the last paragraph, too. I was one of the 
'principals' who took the decision to use D in production, 14 
years ago, and he described the reasoning of that era very well.


Today I'm still convinced that the adoption of D is a competitive 
advantage for a company; I definitely have to work on improving my 
bad temper (eheh) to persuade my current CTO to give it another 
chance.


/Paolo (btw, I'm the CEO...)




Re: [OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Laeeth Isharc via Digitalmars-d

On Saturday, 28 July 2018 at 11:09:28 UTC, Abdulhaq wrote:

On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:

For me, I think that managing money is about choosing to 
expose your capital intelligently to the market, balancing the 
risk of loss against the prospective gain and considering this 
in a portfolio sense.


Prediction doesn't really come into that



I think this apparent difference of opinion is down to 
different definitions of the word prediction. When I say 
prediction I mean the assessment of what are the possible 
futures for a scenario and how likely each one is. It can be 
conscious or unconscious. I think my understanding of the word 
is not an uncommon one.


By my definition, when you balance the risk of loss (i.e. 
predict how likely you are to lose money) against the 
prospective gain (i.e. multiply the probability of each 
possible outcome by its reward and sum the total to get a 
prospective value) then you are, by my definition and 
therefore, for me, by definition, making predictions.


It's tough when dealing with genuine Knightian uncertainty or 
even more radical versions.  When one doesn't even know the 
structure of the problem, maximising expected utility doesn't 
work.  One can look at capacities (Choquet and the like) but 
then it's harder to say something useful about what you should do.


And I think when dealing with human action and institutions we 
are in a world of uncertainty more often than not.




It's not the prediction that matters but what you do.  It's 
habits, routines, perception, adaptation and actions that 
matter.


I agree they are integral to our behaviour and habits and 
routines do not involve the element of prediction. Perceptions 
come before and actions take place after the decision process 
is made (conscious or not) and so don't factor into this 
discussion for me.


But it's a loop and one never takes a final decision to master D. 
Also habits, routines and structures _do_ shape perception.




In truth I avoid discussions that are really just arguing about 
definitions of words, but you made a couple of sweeping 
bumper-stickery comments


That's entertaining.  I've not been accused of that before!  Bear 
in mind also I tend to write on my phone.



that trying to predict things was
usually a waste of time and as an alternative we should 'be the 
change...'. I wholeheartedly agree we should 'be the change...' 
but it's not an alternative to making predictions, it goes hand 
in hand with it. I'm sure you've read Kahneman's Thinking, Fast 
and Slow. You made a generalisation that applies to the 'fast' 
part. I'm saying your universal rule is wrong because of the 
slow part.


Yes I read Kahneman et al papers for the first time in 92 in the 
university library.  I speed-read his book, and I thought it was 
a bad book.  I work with a specialist in making decisions under 
uncertainty - she was the only person able to articulate to 
George Soros how he made money because he certainly couldn't, and 
she is mentioned in the preface to the revised version of 
Alchemy.  She has the same view as me - behavioural finance is 
largely a dead end.  One learns much more by going straight to 
the neuroeconomics and incorporating also the work of Dr Iain 
Macgilchrist.


Kahneman makes a mistake in his choice of dimension.  There's 
analytic and intuitive/gestalt and in my experience people making 
high stakes decisions are much less purely analytical than a 
believer in the popular Kahneman might suggest.


What I said about prediction being overrated isn't controversial 
amongst a good number of the best traders and business people in 
finance.  You might read Nassim Taleb also.


I learnt D many years ago just after Andrei's book came out. I 
love it but it's on the shelf at the moment for me. I rarely 
get time for side projects these days but when I do I want them 
to run on Android with easy access to all the APIs and without 
too much ado in the build setup. They must continue to work and 
be supported with future versions of Android. At work, on 
Windows, JDK8/JavaFX/Eclipse/maven and 
python/numpy/Qt/OpenCascade/VTK hit the spot.


Well it's a pity the D Android ecosystem isn't yet mature.  Still 
I remain in awe of the stubborn accomplishment of the man (with 
help) who got LDC to run on Android.


It's not that bad calling D from Java.  Some day I will see if I 
can help automate that - Kai started working on it already I 
think.



each project I
start I give some very hard thought about which development 
environment I'm going to use, and D is often one of those 
options. The likely future of D on the different platforms is 
an important part of that assessment, hence 'predicting' the 
future of D, hard and very unreliable though that is, is an 
important element in some of my less trivial decisions.


Since you already know D you need to answer a different question. 
 What's the chance the compiler will die on the relevant h

[OT] Re: C's Biggest Mistake on Hacker News

2018-07-28 Thread Abdulhaq via Digitalmars-d

On Friday, 27 July 2018 at 23:42:47 UTC, Laeeth Isharc wrote:

For me, I think that managing money is about choosing to expose 
your capital intelligently to the market, balancing the risk of 
loss against the prospective gain and considering this in a 
portfolio sense.


Prediction doesn't really come into that



I think this apparent difference of opinion is down to different 
definitions of the word prediction. When I say prediction I mean 
the assessment of what are the possible futures for a scenario 
and how likely each one is. It can be conscious or unconscious. I 
think my understanding of the word is not an uncommon one.


By my definition, when you balance the risk of loss (i.e. predict 
how likely you are to lose money) against the prospective gain 
(i.e. multiply the probability of each possible outcome by its 
reward and sum the total to get a prospective value) then you 
are, by my definition and therefore, for me, by definition, 
making predictions.




It's not the prediction that matters but what you do.  It's 
habits, routines, perception, adaptation and actions that 
matter.


I agree they are integral to our behaviour and habits and 
routines do not involve the element of prediction. Perceptions 
come before and actions take place after the decision process is 
made (conscious or not) and so don't factor into this discussion 
for me.


In truth I avoid discussions that are really just arguing about 
definitions of words, but you made a couple of sweeping 
bumper-stickery comments that trying to predict things was 
usually a waste of time and as an alternative we should 'be the 
change...'. I wholeheartedly agree we should 'be the change...' 
but it's not an alternative to making predictions, it goes hand 
in hand with it. I'm sure you've read Kahneman's Thinking, Fast 
and Slow. You made a generalisation that applies to the 'fast' 
part. I'm saying your universal rule is wrong because of the slow 
part.


I learnt D many years ago just after Andrei's book came out. I 
love it but it's on the shelf at the moment for me. I rarely get 
time for side projects these days but when I do I want them to 
run on Android with easy access to all the APIs and without too 
much ado in the build setup. They must continue to work and be 
supported with future versions of Android. At work, on Windows, 
JDK8/JavaFX/Eclipse/maven and python/numpy/Qt/OpenCascade/VTK hit 
the spot. Each project I start I give some very hard thought 
about which development environment I'm going to use, and D is 
often one of those options. The likely future of D on the 
different platforms is an important part of that assessment, 
hence 'predicting' the future of D, hard and very unreliable 
though that is, is an important element in some of my less 
trivial decisions.












Re: So what is the state of cross-compilation in D?

2018-07-28 Thread Andre Pany via Digitalmars-d

On Saturday, 28 July 2018 at 08:56:30 UTC, Mike Franklin wrote:

On Wednesday, 17 January 2018 at 13:24:37 UTC, Andre Pany wrote:

On Wednesday, 17 January 2018 at 12:06:23 UTC, Rel wrote:

Well, to be completely honest with you the only one
thing I like about the Go programming language is the
ability to easily cross-compile your Go program from
any supported OS to any supported OS.

So I was wondering what is the story of cross-compilation
for different D language compilers? Is it possible to some
extent now? Do you guys have interest in it?

Basically as far as I understood what makes Go suitable
for cross-compilation is their own linker implementation,
and D compilers use current system linker.


Cross compiling from Windows to raspberry pi: 
http://d-land.sepany.de/einstieg-in-die-raspberry-pi-entwicklung-mit-ldc.html


Kind regards
Andre


Andre,

That link appears to be dead.  It was actually quite a nice 
resource.  Any chance you can republish it or something?


Thanks,
Mike


While changing the tool used to generate the site, the URLs changed. 
The new link is 
http://d-land.sepany.de/tutorials/einplatinenrechner/einstieg-in-die-raspberry-pi-entwicklung-mit-ldc/


I need to update the article; I wrote it when LDC 1.5.0 was 
current.
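
The gist of the approach is roughly an invocation like the following 
(a sketch only, not taken from the article; the target triple and the 
cross toolchain used for linking are setup-dependent assumptions):

```
# point LDC at the ARM target and at a cross GCC for linking
ldc2 -mtriple=arm-linux-gnueabihf -gcc=arm-linux-gnueabihf-gcc hello.d
```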


Kind regards
Andre


Re: Constructing a class in-place

2018-07-28 Thread Johan Engelen via Digitalmars-d

On Thursday, 26 July 2018 at 12:53:44 UTC, rikki cattermole wrote:

On 27/07/2018 12:45 AM, Johan Engelen wrote:


In D, we don't have placement new, great! And now, I learn 
that the _standard library_ _does_ have something that looks 
like placement new, but without extra guarantees of the spec 
that C++ has.

For some more info:
https://stackoverflow.com/a/49569305
https://stackoverflow.com/a/48164192

- Johan


Both of those links are related to structs, not classes (and the 
original post is about classes).
Given the content (I could be wrong), I don't think it's 
related to our situation in D.


Uhm, this has everything to do with our situation in D and with 
classes in D too. The links are of course about classes with and 
without vtable.
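
For concreteness, the standard-library facility being referred to is 
presumably std.conv.emplace over caller-provided storage; a minimal 
sketch of using it with a class:

```
import std.conv : emplace;

class Foo
{
    int x;
    this(int x) { this.x = x; }
}

void main()
{
    // caller-provided storage, sized for the class instance
    auto buf = new void[__traits(classInstanceSize, Foo)];
    Foo f = emplace!Foo(buf, 42);   // construct the object in place
    assert(f.x == 42);
}
```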


Classes in D are very "heavy" with their explicit vtable. Given 
that classes in C++ can act as a value and a reference type, 
you have to be pretty careful when comparing them.


I'd appreciate it if you reread it and think more about it. D's 
classes and C++'s structs/classes are the same as far as what is 
discussed here, and the vtable is just one of the issues.


-Johan



Re: So what is the state of cross-compilation in D?

2018-07-28 Thread Mike Franklin via Digitalmars-d

On Wednesday, 17 January 2018 at 13:24:37 UTC, Andre Pany wrote:

On Wednesday, 17 January 2018 at 12:06:23 UTC, Rel wrote:

Well, to be completely honest with you the only one
thing I like about the Go programming language is the
ability to easily cross-compile your Go program from
any supported OS to any supported OS.

So I was wondering what is the story of cross-compilation
for different D language compilers? Is it possible to some
extent now? Do you guys have interest in it?

Basically as far as I understood what makes Go suitable
for cross-compilation is their own linker implementation,
and D compilers use current system linker.


Cross compiling from Windows to raspberry pi: 
http://d-land.sepany.de/einstieg-in-die-raspberry-pi-entwicklung-mit-ldc.html


Kind regards
Andre


Andre,

That link appears to be dead.  It was actually quite a nice 
resource.  Any chance you can republish it or something?


Thanks,
Mike


Re: Constructing a class in-place

2018-07-28 Thread Johan Engelen via Digitalmars-d
On Thursday, 26 July 2018 at 21:22:45 UTC, Petar Kirov 
[ZombineDev] wrote:


Please excuse if my question is too naive, but how does this 
change anything?


The main insight is to reason about things in terms of language 
semantics, not in terms of actual memory addresses and 
instructions as processed by the CPU. Then reread my post. I am 
not talking about disallowing storing different objects in the 
same physical hardware memory location: the language spec says 
nothing about that, and it shouldn't.


Nothing stops the same bytes from being reused for another 
object of a different type.


Here you are talking about physical memory bits, which is none of 
the language's business. So in practice, of course memory will be 
reused. But (most of) that should be transparent to D's language 
semantics.


D on the other hand is (or at least I'm hopeful that it is) 
moving away from giving magical powers to its runtime or standard 
library and is embracing the spirit of bare-bones systems 
programming, where the programmer is allowed or even encouraged 
to implement everything from scratch (cf. -betterC) when that is 
the most sensible option.


This is a matter of opinion I guess. But why wouldn't you just 
program in assembly? For example, things like 
`__traits(isReturnOnStack)` don't make sense in a high level 
language like D. Some machines don't have a stack. In other 
cases, the decision whether to return something on the stack can 
be delayed until optimization for better performance. I see you 
mention LTO; forget about _any_ optimization and high-level 
language features, if you care about controlling what the machine 
is doing.


While C and C++ approach portability by abstracting the 
machine, D approaches portability by laying all the cards on 
the table and defining things, rather than leaving them 
unspecified, or at least by documenting the implementation's 
definition.


The kind of low-level control that you want is not what D should 
give (and doesn't). With "laying cards on the table" you mean 
specifying language semantics in hardware behavior? Because the 
strength of most languages is in _not_ doing that. (some of the 
strengths that'd be lost: cross platform, cross architecture, 
performance)


Note that this is not only about optimization. It's about being 
able to reason sensibly about code. You are advocating this?

```
class A { void foo(); }
class B : A { ... }
class C : A { ... }

void bar(A a) {
   a.foo(); // type of a is B, but turns it into C
   a.foo(); // type is now C, call different foo
}
```

- Johan



High-level vision for 2018 H2?

2018-07-28 Thread Peter Alexander via Digitalmars-d
The wiki still links to the high-level vision for 2018 H1. We're now 
nearly one month into H2. Is an H2 document in progress?