Re: DConf 2019: Shepherd's Pie Edition

2018-12-27 Thread Joakim via Digitalmars-d-announce
On Thursday, 27 December 2018 at 08:25:23 UTC, Russel Winder 
wrote:
On Thu, 2018-12-27 at 02:13 +, Joakim via 
Digitalmars-d-announce wrote:

[…]

Wow, you've really gone off the deep end now. First you lie 
that I presented no data, then when called out, start claiming 
defamation and talk about bringing lawyers into it.


You seem to be a beginner at gaslighting. Your initial data was 
simply two articles expressing an opinion. There was no data 
about conferences generally, just a perception of a failure of 
conferences in the iOS arena.



Good luck with that. :)


I will have good luck. The lawyer is a person for whom I have 
done expert witness work on libel and email usage in the High 
Court in the past. I am not a beginner at this sort of thing.


You will treat this email as a formal cease and desist letter 
requiring that you stop defaming my character in public written 
statements. If you continue to defame me in public emails, I 
will escalate and apply for a cease and desist order in the 
High Court.


Heh, nobody cares about you and your blatant L I E S.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-26 Thread Joakim via Digitalmars-d-announce
On Wednesday, 26 December 2018 at 09:34:48 UTC, Russel Winder 
wrote:
On Wed, 2018-12-26 at 05:07 +, Joakim via 
Digitalmars-d-announce wrote:

[…]
I wrote. I have never called anyone any name or insult in 
anything I wrote. I have used a pejorative for a type of 
argument that was being used, or characterized certain actions 
negatively.

[…]

I beg to differ. I have the emails with you hurling personal 
abuse.


Your continuous hectoring, data-free itself yet combative 
toward others for providing no data, is beginning to annoy 
people who are trying to be constructive within the D 
community. You have made your point, one that you believe and 
that no one else here now gives a shit about.


I don't know who's going around deleting forum posts, which I'm 
against normally though in this case concede that our OT 
squabbling added nothing to this thread, but you missed this 
post, where he first started lying about what I wrote.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-26 Thread Joakim via Digitalmars-d-announce
On Wednesday, 26 December 2018 at 16:56:17 UTC, Russel Winder 
wrote:
On Wed, 2018-12-26 at 09:45 +, Joakim via 
Digitalmars-d-announce wrote:



[…]
Wtf are you talking about? I've never emailed you in my life. 
If you mean in this forum thread, quote what you think is 
"personal abuse," I see none.


We can get to that later, no need for now given your statement 
below.


I see, so you have nothing, as always.


[…]
I have not made my point, since most responding seem to think 
I'm trying to lessen in-person interaction, like even Walter 
above with his concert example, when that's the opposite of 
what I'm saying! I chalk that up to people like you, who lie 
about my not presenting any data when that's clearly linked in 
my first post, either because you don't know how to read or 
choose to lie anyway.


So we do not need any history of this thread to see that you 
have no compunction about libelling people. Yes, there is free 
speech, but there is also defamation via the written word, 
which is a civil offence in UK law. This paragraph is almost 
certainly libellous. I am tempted to take legal advice from a 
UK libel solicitor of my acquaintance.


Wow, you've really gone off the deep end now. First you lie that 
I presented no data, then when called out, start claiming 
defamation and talk about bringing lawyers into it.


Good luck with that. :)

For someone who claims not to give a shit, you certainly keep 
replying a lot to me and lying about what I wrote.


You claimed you were going to stop engaging with me, email on 
this list 2018-12-23T0808+00:00, but it seems you are failing 
to keep your promises.


I have stopped engaging with you on the thread topic, DConf, 
since then, but once you started lying about the tone and content 
of my posts, I've addressed that.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-26 Thread Joakim via Digitalmars-d-announce
On Wednesday, 26 December 2018 at 09:34:48 UTC, Russel Winder 
wrote:
On Wed, 2018-12-26 at 05:07 +, Joakim via 
Digitalmars-d-announce wrote:

[…]
I wrote. I have never called anyone any name or insult in 
anything I wrote. I have used a pejorative for a type of 
argument that was being used, or characterized certain actions 
negatively.

[…]

I beg to differ. I have the emails with you hurling personal 
abuse.


Wtf are you talking about? I've never emailed you in my life. If 
you mean in this forum thread, quote what you think is "personal 
abuse," I see none.


Your continuous hectoring, data-free itself yet combative 
toward others for providing no data, is beginning to annoy 
people who are trying to be constructive within the D 
community. You have made your point, one that you believe and 
that no one else here now gives a shit about.


I have not made my point, since most responding seem to think I'm 
trying to lessen in-person interaction, like even Walter above 
with his concert example, when that's the opposite of what I'm 
saying! I chalk that up to people like you, who lie about my not 
presenting any data when that's clearly linked in my first post, 
either because you don't know how to read or choose to lie anyway.


For someone who claims not to give a shit, you certainly keep 
replying a lot to me and lying about what I wrote.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-25 Thread Joakim via Digitalmars-d-announce

On Tuesday, 25 December 2018 at 23:09:40 UTC, Walter Bright wrote:

On 12/25/2018 10:54 AM, Joakim wrote:

[...]


It's fine that you disagree with others, and it's ok when you 
insult me, but when you insult others it's time to stop.


It's not clear what you're referring to, since you quote nothing 
I wrote. I have never called anyone any name or insult in 
anything I wrote. I have used a pejorative for a type of argument 
that was being used, or characterized certain actions negatively. 
I stand by those negative descriptions, and if you consider it an 
"insult" that I said "you just had to be there" is a stupid 
argument, I don't know what to tell you. Almost anyone who thinks 
about such matters believes that.


I could sit here and say it was "insulting" how people repeatedly 
characterized me as wanting to stop all in-person interaction, 
when that is _the exact opposite_ of what I wrote! But I don't go 
around making up grievances like that, I suggest you don't either.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-25 Thread Joakim via Digitalmars-d-announce
On Tuesday, 25 December 2018 at 11:27:29 UTC, Nicholas Wilson 
wrote:

On Tuesday, 25 December 2018 at 05:01:43 UTC, Joakim wrote:
On Monday, 24 December 2018 at 22:22:08 UTC, Steven 
Schveighoffer wrote:
The 0.1% of the community that attend seem to like it, the 
vast majority don't, or at least don't care.


You think we have 200k users? More to the point, you neglect 
that the benefit of development and progress is shared by all 
users.


I, for one, will not be donating to the foundation as long 
as they continue to waste money this way, just as others 
have said they won't donate as long as it doesn't put out a 
Vision document anymore or otherwise communicate what it's 
doing with their money.


I agree this does need to happen; the foundation will be having 
another meeting in Feb to set the vision, which I hope will 
be a little more planned and productive than the last one.


Nobody is asking for your money for this conference (unless 
you want to attend), and if you feel this way, that's totally 
your choice.


I'm not talking about the registration fee, I'm talking about 
contributing anything to the foundation, which Walter 
indicates above covers some of the expenses for DConf.


Some additional transparency would help, Mike?


I like the results that come from the conferences. I've
been to all of them since 2013, on my own dime for 3 and with 
assistance for 3. I felt it was 100% worth it for all.


Yet you cannot give a single reason _why_ you felt it was 
worth it, or why my suggestions wouldn't make it better.


I'll give my reasons:
I got a job out of it.
I got useful insight into various bits of the compiler.
I got connections for collaboration with stuff that I'm 
interested in.



If you're making a bad decision, it _should_ be questioned.


Indeed, but none of us think DConf is a bad idea or that the 
format doesn't work for us.


Almost nothing that has been decided so far would stop most of 
my three suggestions from still being implemented.


You haven't managed to convince us that that would be an 
improvement.


As for how they feel about it, I don't care. The reason most 
projects and companies fail is because the decision-making 
process stops being about putting out a good product but about 
"feelings" and various people "saving face," especially when 
higher up the hierarchy, ie politics. And don't make up some 
nonsense that I'm saying that it's okay if everybody starts 
cursing each other out like Linus did: we're talking about 
_questioning a decision_. That is the whole point of having a 
community.


The day this community starts being more about saving face is 
the day I leave it, as that's the beginning of the end, and I 
don't want to be around for that end.


I totally agree, but again, you haven't convinced us that it is 
an improvement.


Not at all, the whole reason I'm willing to debate is that 
other worthwhile perspectives may be out there. I think the 
evidence and arguments strongly favor the suggestions I'm 
putting forward, but I'm perfectly willing to consider other 
arguments.


That is the same stance they should have, but don't appear to. 
My problem with this "debate" is that nobody was able to 
defend the current DConf format at all.


That reasoning is backwards: in our experience DConf, as done 
in the past, works, and it works well. The onus is on you to 
convince us that it would work better the way you describe.


Simply repeating over and over again that you're not "convinced" 
is not an argument, nor do your own personal reasons above argue 
for one format over another.


I asked for a rationale above and got none from Mike and a very 
weak, confused one from Walter. It's fairly obvious that there 
was never any real deliberation on the DConf format, and that you 
guys have dug in and decided to cut off your nose to spite your 
face. Fine with me, your loss.


Consider some of Walter's silly arguments above: at one point 
he says he wants "successful instantiations of your theories," 
implying that these are all things I'm just talking about and 
nobody's doing them, though it's not clear which aspects he 
thinks that of since I've presented evidence for much of it.


But at another point, he says that other D meetups are already 
doing something I suggest (I pointed out that he's wrong about 
that one, but let's assume he believes it), so there's no 
reason for DConf to do it. First of all, 95+% of D meetups 
appear to follow the DConf format of having a single speaker 
lecture to a room, so why isn't that an argument against doing 
that yet again at DConf?


What works at one scale doesn't necessarily work at another.


I see, so you're arguing that DConf shouldn't be doing in-person 
talks because it's larger than most D meetups? Don't answer 
that: scale as a reason makes no sense, and there's no way you 
can make that argument.


To do something very different from a "traditional" conference 
would be a significant risk when what we have works well.


I see no 

Re: DConf 2019: Shepherd's Pie Edition

2018-12-25 Thread Joakim via Digitalmars-d-announce
On Tuesday, 25 December 2018 at 07:10:46 UTC, rikki cattermole 
wrote:

On 25/12/2018 6:01 PM, Joakim wrote:
See my responses to Nicholas above, I don't think the Android 
port merits a talk. By the same standards I apply to others' 
talks above, I don't think my work merits a talk either. ;)


A talk covering ARM and Android development in general would be 
very well received in the context of D. If you want to be 
convinced, we could do a poll on who would want to see it (but 
I expect quite a large number of people would be in support of 
it).


I don't see how it could be worthwhile: nobody has ever given 
such a DConf talk about a port to a specific platform because it 
doesn't really make sense. The whole point of a port is to 
abstract away the platform, so you can simply recompile most of 
your D source for it, as H. S. Teoh has indicated he's been able 
to do with the Android app he's been developing in D recently.
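As a rough, hedged sketch of what "simply recompile most of your 
D source" means in practice: assuming the LLVM D compiler ldc2 is 
installed, targeting 64-bit Android is roughly a matter of passing 
a different target triple (the triple below is the one LDC's 
Android docs use; treat the exact invocation as an assumption, and 
note that linking a full Android binary would additionally need 
the NDK):

```shell
#!/bin/sh
# Sketch: the same D source can be compiled for the host or
# cross-compiled for Android just by switching the target triple.
# Assumes ldc2 is on PATH; this only produces an object file,
# so no Android NDK linker is needed for the demonstration.

cat > hello.d <<'EOF'
import std.stdio;

void main()
{
    writeln("hello from D");
}
EOF

if command -v ldc2 >/dev/null 2>&1; then
    # Compile to an object file for 64-bit Android; no source changes.
    if ldc2 -mtriple=aarch64--linux-android -c hello.d -of=hello.o; then
        echo "compiled hello.o for aarch64 Android"
    else
        echo "ldc2 present, but this build could not target Android"
    fi
else
    echo "ldc2 not found; skipping the cross-compile step"
fi
```

The same hello.d builds for the host with a plain `ldc2 hello.d`; 
only the triple changes for the Android build, which is the sense 
in which the port abstracts the platform away.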


The way to do that talk is to abstract multiple ports into a 
general porting guide, which is the talk Kai already gave, or 
maybe talk about the details of a port to a very obscure or 
different platform, as Igor did this year:


https://dconf.org/2018/talks/cesi.html

While it was fascinating to hear how much work he put into it, 
much more than I did, my interest was squelched somewhat because 
he couldn't reveal the platform and it's likely I would never 
use it anyway (not a game programmer). I mean, who really 
develops for non-Windows, non-Posix OS platforms? I haven't 
since college. For those few who do, maybe the talk was great. 
But the Android port wasn't that obscure: it's basically a 
linux/ARM distro with a different libc, Bionic.


If you really mean "ARM and Android development in general" and 
not the details of the port, I can't claim much knowledge of 
that, as I don't have a large Android codebase that I've 
developed and deployed. Hopefully, even if I did, there would 
be nothing to say, as it should be pretty similar to writing D 
code for a desktop platform.


My phone- on whose 5.5" screen I'm viewing the text of this forum 
response as I type it out on a separate, full-sized bluetooth 
keyboard paired with it- has 6 GB of RAM and 128 GB of storage 
(of which I have 8 GB free right now). That's about what midrange 
desktops and laptops come with these days (though with much 
larger screens ;) ), so you can't say mobile presents much of a 
constraint in terms of hardware. I've pointed out before that I 
compile code on my phone about as fast as a Macbook Air from a 
couple years ago:


https://forum.dlang.org/thread/sqbtgmbtrorgthspl...@forum.dlang.org

If you see some other angle on an Android talk that I'm missing, 
I'd be happy to hear it, but I don't see it. Maybe someday when I 
have a huge, successful Android app in D, I'll write about or put 
up a talk online about the architecture I used, but hopefully 
there won't be much specific to Android there. :)


Re: DConf 2019: Shepherd's Pie Edition

2018-12-24 Thread Joakim via Digitalmars-d-announce
On Monday, 24 December 2018 at 22:22:08 UTC, Steven Schveighoffer 
wrote:

On 12/24/18 2:44 AM, Joakim wrote:
On Sunday, 23 December 2018 at 22:36:05 UTC, Steven 
Schveighoffer wrote:


Huh? It's their decision, not yours. Even if the decision has 
no reason at all, it's still theirs. What is the problem? 
Start your own D "conference competitor" if you think you can 
do better.


They are accountable to the community, so the decision and its 
reasons matter.


My impression is that the community likes and benefits from 
these conferences, so everything's cool there.


The 0.1% of the community that attend seem to like it, the vast 
majority don't, or at least don't care.


I, for one, will not be donating to the foundation as long as 
they continue to waste money this way, just as others have 
said they won't donate as long as it doesn't put out a Vision 
document anymore or otherwise communicate what it's doing with 
their money.


Nobody is asking for your money for this conference (unless you 
want to attend), and if you feel this way, that's totally your 
choice.


I'm not talking about the registration fee, I'm talking about 
contributing anything to the foundation, which Walter indicates 
above covers some of the expenses for DConf.



I like the results that come from the conferences. I've
been to all of them since 2013, on my own dime for 3 and with 
assistance for 3. I felt it was 100% worth it for all.


Yet you cannot give a single reason _why_ you felt it was worth 
it, or why my suggestions wouldn't make it better.


Nobody cares to debate something that has already been 
scheduled and planned; the time to bring up concerns was 
earlier, when you brought it up before. But that failed to 
convince. Now it's decided; time to move on.


So you agree with me that there's no point in "debating" it 
again; perhaps you should have addressed this comment to Mike 
then?


Mike didn't start the debate in this thread, you did.


I did no such thing: I asked for the reasons _why_ the decision 
was made, considering the previous debate. That is not restarting 
the debate, simply asking for the rationale. Others then tried to 
debate me again, and while I did respect them enough to engage 
with their arguments, I repeatedly pointed out that I wasn't 
looking to debate it again.


Consider how one feels when careful deliberation has been done 
and a final decision, combined with an announcement, has been 
made. Would you like to have people question your decisions 
AFTER they are made, and commitments have already been 
established? The time to question them is before they are made, 
not after. Questioning after is simply viewed (rightly) as sour 
grapes. You didn't get your way; move on.


If you're making a bad decision, it _should_ be questioned. 
Almost nothing that has been decided so far would stop most of my 
three suggestions from still being implemented.


As for how they feel about it, I don't care. The reason most 
projects and companies fail is because the decision-making 
process stops being about putting out a good product but about 
"feelings" and various people "saving face," especially when 
higher up the hierarchy, ie politics. And don't make up some 
nonsense that I'm saying that it's okay if everybody starts 
cursing each other out like Linus did: we're talking about 
_questioning a decision_. That is the whole point of having a 
community.


The day this community starts being more about saving face is the 
day I leave it, as that's the beginning of the end, and I don't 
want to be around for that end.


If it's such a great idea, that should be an easy case to 
make, compared to the alternatives given. Yet all I get is a 
bunch of stone-walling, suggesting no reasoning was actually 
involved, just blindly aping others and the past.


It is easy, for those who have attended conferences and like 
them -- they work well. All past dconfs are shining examples. 
Just drop it and move on to something else. You lost the 
battle for this one, it's no longer up for discussion.


Heh, there was no "battle," as most of those responding didn't 
even understand what I wrote, like Iain above, or gave no 
arguments (we "like them -- they work well"), and, as is 
finally clear from Mike and Walter's responses here, there was 
no real deliberation on the matter.


You think they just flipped a coin one day, and didn't think 
about any past experience at all? No real thinking must have 
gone into it because only intelligent people can come to the 
conclusion you reached, right? This kind of "debate" where the 
assumption is that only my way is correct is common out there 
these days, it's tiring.


Not at all, the whole reason I'm willing to debate is that other 
worthwhile perspectives may be out there. I think the evidence 
and arguments strongly favor the suggestions I'm putting forward, 
but I'm perfectly willing to consider other arguments.


That is the same stance they should have, but don't appear to. My 
problem with this "debate" is that nobody was able to defend the 
current DConf format at all.

Re: DConf 2019: Shepherd's Pie Edition

2018-12-23 Thread Joakim via Digitalmars-d-announce
As I keep repeating, this is not about me. I'm pointing out 
trends for _most_ devs,


DConf has been growing in size every year it has been held, as 
have IWOCL and the LLVM conferences.


Has it? I don't see any official numbers, but this year's DConf 
looked smaller to me, eyeballing the videos.


I'm sure some topics for some conferences are declining; it may 
well even be an industry-wide trend. But I'd bet good money 
that the new equilibrium will have conferences as a staple.


Perhaps, but not with the outdated format D currently follows.


my own preferences are irrelevant.


I certainly hope not.


Of course it is. Just as Walter shouldn't be making decisions 
based on what he "enjoys," I shouldn't either. Significant 
attention should be paid to what the majority of the audience 
wants, which is why it is important to pay attention to data 
like what I presented, showing conference attendance and events 
significantly declining.


But consider that the foundation reimburses speakers, and I 
personally would be very interested to hear what you have 
been doing with Android/ARM, and I'm sure many others would as 
well; the question becomes: is it worth your time?


I don't understand what's so special about "speakers" that it 
couldn't simply reimburse non-speakers that the foundation 
wants at one of the decentralized locations instead. It seems 
like the talk is a made-up excuse to pay for some members of 
the core team to come, when the real reason is to collaborate 
with them. Why not dispense with that subterfuge?


The talks, together with the topic of the conference, are what 
draw people to the conference and make it economically viable. 
It is a perfectly rational decision. If I were running a 
conference trying to turn a profit, I'd probably get more 
applications for the available speaker slots => better quality 
speakers => more attendees => $$$.


This is a giant assumption, one that those blog posts explicitly 
call out as no longer holding, now that most of those speakers 
already get their message out easily online.


DCompute would not exist were it not for that reimbursement; as 
a poor student, it made the difference between DConf being 
something I could work towards, afford to go to, and get good 
value out of, versus not. Perhaps we could run general travel 
grants like LLVM does, but I don't think we're large enough for 
that; Mike Parker would be the person to talk to about that. 
But if, like me, they are students and want to have something 
to talk about to aid in networking, then giving a talk will 
help with that.


Then have them do a pre-recorded talk like every other speaker, 
pick some strong contributors to attend every year as you're 
already doing, but don't have them talk, and spend all that 
valuable in-person time actually networking, doing BoF, getting 
things done.


I see little value in a full talk about a port to a new 
platform like Android, which is basically another linux distro 
with a different libc. It's not a matter of my time; I don't 
think it's worth the audience's time. I wish those organizing 
DConf would focus on that more.


You can choose the length of the talk you think would fit the 
topic.


It _might_ make sense for a 5-15 min. lightning talk.

You could cover the basics of using the port for developing 
Android apps


Trivial and available on the wiki, no need.


the difficulties you experienced doing the port


A port is all about fixing a ton of one-off incompatibilities; 
that is the recipe for a bad talk. It could be used as a 
launching point for a much larger exploration of the platform 
itself- say, Walter using his DWARF port as a launching point to 
talk about the DWARF format and such debug formats generally- but 
I don't know enough about Android to do that, nor would it really 
make sense at DConf.



and the troubles others might have in doing their own,


Kai gave an excellent, general version of this talk already; 
there's nothing substantive I could add to it other than a bit 
more technical detail of how it applied to my port:


https://dconf.org/2016/talks/nacke.html

I wish it had been available when I started my port three years 
earlier!


... as they say, the stage is yours. It would also present an 
opportunity to convince others of the direction you think we 
should be going in, e.g. w.r.t. mobile/ARM/AArch64.


I thought about submitting that as a topic last year, but it's 
better done on the forum, as I've been doing.


On Sunday, 23 December 2018 at 15:32:41 UTC, Iain Buclaw wrote:
On Sun, 23 Dec 2018 at 16:05, Joakim via Digitalmars-d-announce 
 wrote:


I'm not sure how a talk is supposed to inspire anything 
substantive _before_ you've heard it, and pre-recorded talks 
watched at home would fill the same purpose after.




No one is interested in watching pre-recorded talks.


Let's look at the numbers. There were around 100 people at DConf 
this year? YouTube reports 875 views for Andrei's keynote after 
being recorded and put online

Re: DConf 2019: Shepherd's Pie Edition

2018-12-23 Thread Joakim via Digitalmars-d-announce

On Sunday, 23 December 2018 at 10:07:40 UTC, Walter Bright wrote:

On 12/22/2018 10:20 PM, Joakim wrote:
Honestly, yours are routinely the worst presentations at 
DConf. Your strength as a presenter is when you dig deeply 
into a bunch of technical detail or present some new technical 
paradigm, similar to Andrei. Yet, your DConf keynotes usually 
go the exact opposite route and go very lightly over not very 
much at all.


Eh, I went pretty far into the DIP 1000 material.


That one had more technical examples, but I didn't think it was 
very well-motivated and could probably have had more detail.


My feeling is that you save your best stuff for your NWCPP talks 
and present the baby versions at DConf.


1) Ditch in-person presentations for pre-recorded talks that 
people watch on their own time. Getting everybody in the same 
room in London to silently watch talks together is a horrible 
waste that only made sense before we all had high-speed 
internet-connected TVs and smartphones with good cameras. Do a 
four-day hackathon instead, ie mostly collaboration, not 
passive viewing.


It's very different listening to a presentation live rather 
than pre-recorded. There are the before and after interactions 
they inspire.


I'm not sure how a talk is supposed to inspire anything 
substantive _before_ you've heard it, and pre-recorded talks 
watched at home would fill the same purpose after.


Perhaps this is a generation gap, as I see that you and Russel 
are a couple decades older than me, so let me give my 
perspective. I've probably watched a week or two of recorded tech 
talks online over the last year, and maybe a couple hours in 
person. Invariably, I find myself wishing for a skip-ahead button 
on those in-person talks, like I have for the online videos. ;)


I suspect there are many more like me these days than you two.

2) Rather than doing a central DConf that most cannot justify 
attending, do several locations, eg in the cities the core 
team already lives in, like Boston, Seattle, San Jose, Hong 
Kong, etc. This makes it cost-effective for many more people 
to attend, and since you'll have ditched the in-person tech 
talks, spend the time introducing the many more attendees to 
the language or have those who already know it work on the 
language/libraries, ie something like the current DConf 
hackathon.


London is the most cost-effective destination for most D team 
members. For distributed meetings, there have been several D 
meetups that do what you suggest. While fun and valuable, 
they're not a replacement for DConf.


I have never heard of a meetup doing what I suggest, ie an 
all-day D event with almost no in-person talks, possibly 
co-ordinated with other cities. I think this would be _much 
better_ for D than DConf.


3) Get the core team together as a separate event, either as 
an offline retreat or online video conference or both. I know 
you guys need to meet once in a while, but it makes no sense to 
spend most of that in-person time at DConf staring at talks 
that could be viewed online later.


If you ever came to one, you might see it differently.


I'm not a member of the core team, so I'm not sure how that's 
relevant. If you just mean that I could observe how the core team 
is getting a lot of value out of in-person talks, I call BS.


While I find it questionable to say that they couldn't easily 
find and recruit those people online, given that D is primarily 
an online project where most everything and everyone is easily 
available online, I see no reason why any of the changes above 
would stop that.


There's a very clear connection between DConf and successful 
collaborations with industry and D developers. Why mess with 
success?


For the chance of much more success? I'm sure there have been 
some fruitful collaborations and hiring at DConf. I'm saying 
there would likely be _even more_ with my suggestions.


It seems clear to me that you, at the very least, have not 
engaged with the links and ideas I've been providing about why 
the current DConf format is broken.


Your opinions would have more weight if (1) you've ever 
attended a DConf


Perhaps, but since I haven't been, you could presumably 
articulate what you find so great about DConf that contradicts 
my opinions; yet you mention nothing here, and your reasons 
elsewhere aren't too worthwhile.



and (2) can point to successful instantiations of your theories.


What do you consider a "theory" above: that you could have better 
outreach at several locations or that pre-recorded talks watched 
at home are a better use of valuable in-person time? I don't 
think that's theorizing, it's well-accepted by most everyone who 
knows these subjects.


I started off by pointing to documented evidence of conferences 
going down, and popular bloggers and people who track this stuff 
talking about how online talks have replaced them, so it is 
well-known that this trend away from the old conference format is 
underway.


I 

Re: DConf 2019: Shepherd's Pie Edition

2018-12-23 Thread Joakim via Digitalmars-d-announce
On Sunday, 23 December 2018 at 09:51:58 UTC, Nicholas Wilson 
wrote:

On Sunday, 23 December 2018 at 08:08:59 UTC, Joakim wrote:
On Sunday, 23 December 2018 at 06:54:26 UTC, Russel Winder 
wrote:
Others have cited Rust and Go. I shall cite Python, Ruby, 
Groovy, Java, Kotlin, Clojure, Haskell, all of which have 
thriving programming-language-oriented conferences all over 
the world. Then there are the Linux conferences, GStreamer 
conferences, conferences all about specific technologies 
rather than programming languages. And of course there is 
ACCU. There is much more evidence that the more or less 
traditional conference format serves a purpose for people 
and remains very successful. Many of these conferences 
make good profits, so are commercially viable.


That's all well and good, but none of this addresses the key 
points of whether there are fewer tech conferences being held 
and whether they make sense in this day and age. There are 
still people riding in horse and carriage; that doesn't mean 
it's still a good idea. :)


You say that like some superior technology exists to replace 
the conference.


It does, read the first link I gave in my first post above.

Yes, DConf may benefit from tutorials, workshops, BoFs, 
whatever, but the value it brings to the community is very real.


It may bring some value, but that's not the question: the 
question is whether we could get more value out of the 
alternatives, particularly at a cheaper cost. The fact that you 
and others keep avoiding this question suggests you know the 
answer.


Thus I reject the fundamental premise of your position that 
the conference format is dying off. It isn't. The proof is 
there.


Yes, the proof is there: the conference is dying.


Hardly. IME there are two kinds of conferences (or maybe they 
form a spectrum, whatever): academic and industrial. Academic is 
going nowhere; research needs presenting, and organisation of 
collaboration needs to happen.


Research conferences are irrelevant. I don't pay attention to 
them and the fact that the Haskell link Atila gave above says 
their conferences are for presenting research is one big reason 
why almost nobody uses that PL in industry.


Industrial: there is project coordination, employment 
prospects, business opportunities. Why do you think companies 
sponsor conferences? They get their money's worth out of it.


Clearly not in the iOS community, nor, according to a commenter 
in my second link above, in the JavaScript community in his 
country, as the number of tech conferences is going down a lot. 
It is my impression that this is true across the board for 
pretty much every tech community, but I presented that iOS link 
because he actually tallies the evidence. That is a canary in 
the coal mine for the conference format: the largest burgeoning 
dev market on the planet has a dying conference scene.


Perhaps you as an individual believe that they are not cost 
effective for you, fine.


As I keep repeating, this is not about me. I'm pointing out 
trends for _most_ devs, my own preferences are irrelevant.


But consider that the foundation reimburses speakers, and I 
personally would be very interested to hear what you have been 
doing with Android/ARM (I'm sure many others would as well), 
so the question becomes: is it worth your time?


I don't understand what's so special about "speakers" that it 
couldn't simply reimburse non-speakers that the foundation wants 
at one of the decentralized locations instead. It seems like the 
talk is a made-up excuse to pay for some members of the core team 
to come, when the real reason is to collaborate with them. Why 
not dispense with that subterfuge?


I see little value in a full talk about a port to a new platform 
like Android, which is basically another Linux distro with a 
different libc. It's not a matter of my time; I don't think it's 
worth the audience's time. I wish those organizing DConf would 
focus on that more.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-23 Thread Joakim via Digitalmars-d-announce

On Sunday, 23 December 2018 at 09:36:19 UTC, Russel Winder wrote:
On Sun, 2018-12-23 at 08:08 +, Joakim via 
Digitalmars-d-announce wrote: […]


This questioning of iOS is so removed from reality that it 
makes me question if you are qualified to comment on this 
matter at all. iOS is the largest consumer software platform 
that is still growing, as it's estimated to bring in twice the 
revenue of Google's Play Store (that doesn't count other 
Android app stores, but they wouldn't make up the gap):


Fair enough, I have no interest in iOS at all. But you must 
agree that you are clearly so far removed from the reality of 
putting on technical conferences generally that you are not 
qualified to make assertions such as "conferences are a dead 
form".


You could make various arguments for why fewer and fewer 
conferences are being held, as my second link above listing 
them does. But to argue that iOS is not doing well is so 
ludicrous that it suggests you don't know much about these 
tech markets.


Ludicrous is a good description of the entire situation in this 
thread. You are making assertions as though they are facts, 
working on the principle that if you shout long enough and loud 
enough, people will stop disagreeing. A classic technique.


[…]

Yes, the proof is there: the conference is dying. You simply 
don't want to admit it.


This is just assertions with no data and thus is a religious 
position. And I know conferences are thriving; you just do not 
want to admit that.


This seems to be a religious issue for you, with your bizarre 
assertions above, so I'll stop engaging with you now.


No, it is you who has faith in the death of conferences; I am 
involved in the reality of conferences being a relevant thing 
that people want to attend. Just because you do not want to go 
to conferences doesn't give you the right to try and stop 
others from doing so.


If you are going to stop ranting on this, I think that will 
make a lot of people very happy. The idea of this email list is 
to announce things, not debate things. Also, on the debating 
lists the idea is to have a collaborative, not combative, debate 
about things. That includes that if some people want to do 
something, they should be allowed to do it and not be harangued 
from the wings. If people want to have a DConf, it is not your 
position to tell them they cannot.


Your statements above are so ridiculous that they refute 
themselves, no need for me to do so. :)


As for your final ridiculous characterization that I'm 
"ranting/haranguing" people on this matter, I have only ever 
presented evidence and reasons for why the DConf format doesn't 
make sense. If that's "ranting" to you, it's clear you don't 
understand reasoned debate.


In this thread, all I've asked is why all those reasons were 
ignored, as Mike never gave any arguments for why those reasons 
aren't worth heeding. Walter's response suggests he never read my 
suggestions or reasons in the first place.


Nobody is telling "anyone they cannot," as though any of us have 
that power. Rather, I'm trying to figure out how this decision 
was made, in the face of all the reasons given and almost none 
given for maintaining the status quo.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-23 Thread Joakim via Digitalmars-d-announce

On Sunday, 23 December 2018 at 06:54:26 UTC, Russel Winder wrote:
On Sat, 2018-12-22 at 13:46 +, Joakim via 
Digitalmars-d-announce wrote:



[…]
Given that this conference format is dying off, is there any 
explanation for why the D team wants to continue this 
antiquated ritual?


https://marco.org/2018/01/17/end-of-conference-era 
http://subfurther.com/blog/2018/01/15/the-final-conf-down/ 
https://forum.dlang.org/thread/ogrdeyojqzosvjnth...@forum.dlang.org


[…]

So iOS conferences are a dying form. Maybe because iOS is a 
dying form?


This questioning of iOS is so removed from reality that it makes 
me question if you are qualified to comment on this matter at 
all. iOS is the largest consumer software platform that is still 
growing, as it's estimated to bring in twice the revenue of 
Google's Play Store (that doesn't count other Android app stores, 
but they wouldn't make up the gap):


https://techcrunch.com/2018/07/16/apples-app-store-revenue-nearly-double-that-of-google-play-in-first-half-of-2018/

You could make various arguments for why fewer and fewer 
conferences are being held, as my second link above listing them 
does. But to argue that iOS is not doing well is so ludicrous 
that it suggests you don't know much about these tech markets.


Your evidence of the failure of the iOS community to confer is 
not evidence of the failure of the conference in other 
communities.


I never said they fail to confer, I said they're doing it much 
less, because the format is not relevant anymore.


Others have cited Rust and Go. I shall cite Python, Ruby, 
Groovy, Java, Kotlin, Clojure, Haskell, all of which have 
thriving programming-language-oriented conferences all over the 
world. Then there are the Linux conferences, GStreamer 
conferences, conferences all about specific technologies rather 
than programming languages. And of course there is ACCU. There 
is much more evidence that the more or less traditional 
conference format serves a purpose for people and remains 
very successful. Many of these conferences make good 
profits, so are commercially viable.


That's all well and good, but none of this addresses the key 
points of whether fewer tech conferences are being held and 
whether they make sense in this day and age. There are still 
people riding in a horse and carriage; that doesn't mean it's 
still a good idea. :)


Thus I reject the fundamental premise of your position that the 
conference format is dying off. It isn't. The proof is there.


Yes, the proof is there: the conference is dying. You simply 
don't want to admit it.


This seems to be a religious issue for you, with your bizarre 
assertions above, so I'll stop engaging with you now.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-22 Thread Joakim via Digitalmars-d-announce
On Saturday, 22 December 2018 at 22:13:44 UTC, Walter Bright 
wrote:

On 12/22/2018 6:26 AM, Atila Neves wrote:
If you don't like conferences you don't have to go. I for one 
am excited about being in London in May. Please don't sour it 
for others who think/feel like I do.


That's right. And hefting a pint with Atila is guaranteed to be 
a highlight of the conference! I recommend it for those who 
haven't had the pleasure.


I'm sure he's fun to be around; the question is whether it's 
worth the cost of flying to London.


That said, I think we've probably tried to cram too many 
presentations into the schedule. We should probably have fewer 
and put gaps between them for people to digest and talk about 
them.


The question is whether it's worth doing in-person presentations 
at all.


Also, I try to make my presentations less "I lecture and you 
listen silently" and much more interactive and engaging with 
you guys. I suggest others planning a presentation also 
think along those lines.


Honestly, yours are routinely the worst presentations at DConf. 
Your strength as a presenter is when you dig deeply into a bunch 
of technical detail or present some new technical paradigm, 
similar to Andrei. Yet, your DConf keynotes usually go the exact 
opposite route and go very lightly over not very much at all.


Reading through your listed benefits of DConf below tells me you 
didn't read anything I wrote in the linked forum thread above 
from months ago, as nowhere did I say not to get people together 
in person at all, which is where most of your benefits come from.


Rather, I made three primary suggestions for how to get people 
together instead:


1) Ditch in-person presentations for pre-recorded talks that 
people watch on their own time. Getting everybody in the same 
room in London to silently watch talks together is a horrible 
waste that only made sense before we all had high-speed 
internet-connected TVs and smartphones with good cameras. Do a 
four-day hackathon instead, ie mostly collaboration, not passive 
viewing.


2) Rather than doing a central DConf that most cannot justify 
attending, do several locations, eg in the cities the core team 
already lives in, like Boston, Seattle, San Jose, Hong Kong, etc. 
This makes it cost-effective for many more people to attend, and 
since you'll have ditched the in-person tech talks, spend the 
time introducing the many more attendees to the language or have 
those who already know it work on the language/libraries, ie 
something like the current DConf hackathon.


3) Get the core team together as a separate event, either as an 
offline retreat or an online video conference or both. I know 
you guys need to meet once in a while, but it makes no sense to 
spend most of that in-person time at DConf staring at talks that 
could be viewed online later.


Some other advantages of DConf off the top of my head, in no 
particular order:


1. putting a face and name to the person greatly helps working 
with people remotely the rest of the year


Maybe, but only 2) above mitigates it somewhat, and is it worth 
the cost?


2. it's amazing how intractable, obstinate online positions 
just melt away when discussed in person over a beer


1) and 3) enable that more, 2) sacrifices that for greater 
outreach.


3. it's fun to see what other people are doing, as it's easy to 
miss what's important by just monitoring the n.g.


1) and 3) enable that more, 2) sacrifices it somewhat.

4. I regard all you folks as my friends, and it's fun to be 
with y'all


Is that more important than outreach and getting things done?

5. many, many collaborations have spawned from meeting like 
minded individuals at DConf


They still would with the suggestions above, just differently.

6. employers come to DConf looking for D developers, and many D 
developers have gotten jobs from them. If that isn't a win-win, 
I don't know what is!


While I find it questionable to say that they couldn't easily 
find and recruit those people online, given that D is primarily 
an online project where almost everything and everyone is easily 
available online, I see no reason why any of the changes above 
would stop that.


It seems clear to me that you, at the very least, have not 
engaged with the links and ideas I've been providing about why 
the current DConf format is broken.


My fundamental point is that the current DConf conference format 
is an outdated relic, that made sense decades ago when getting 
everybody together in a room in Berlin was a fantastic way to get 
everybody connected. With the ready availability of high-speed 
internet and video displays to everybody who can afford to pay 
the registration fee and go to London, that hoary conference 
format needs to be rethought for the internet age.


I have no problem with anybody disagreeing with my suggestions or 
the reasoning behind them, but I find it flabbergasting for 
anyone to suggest, as Mike has above, that the old conference 
format 

Re: DConf 2019: Shepherd's Pie Edition

2018-12-22 Thread Joakim via Digitalmars-d-announce
On Saturday, 22 December 2018 at 17:36:08 UTC, Bastiaan Veelo 
wrote:

On Saturday, 22 December 2018 at 16:57:10 UTC, Joakim wrote:
On Saturday, 22 December 2018 at 16:35:27 UTC, Johannes Loher 
wrote:
Also I don't think this is the right place for this 
discussion. If you feel that we indeed need to rediscuss this 
issue, I think it should be done in a separate thread.


I'm not trying to discuss it with you or the community. I'm 
asking the D team [...]


Then why post in the announce thread? If you don’t feel your 
previous thread got your message through, you know how to reach 
the foundation.


Why wouldn't I post in here? There's currently an 84-post thread 
in this Announce forum discussing Atila's blog post about what D 
got wrong.


Similarly, this is the thread where the topic is the next DConf. 
I almost never send private emails over community matters, which 
should be discussed publicly.


I don’t understand how you can argue against technical 
conferences so much if you never attended one, much less DConf.


I didn't say I never attended one, I probably sat through 
something back in my college days. I watch some conf videos now 
and then, but like most techies these days, don't find any value 
in going.



I know the odds are slim, but I hope to meet you there someday.


I'd like to meet you too, but I think if it happens, it won't 
ever be at an outdated format like the current DConf. :P





Re: DConf 2019: Shepherd's Pie Edition

2018-12-22 Thread Joakim via Digitalmars-d-announce

On Saturday, 22 December 2018 at 17:13:06 UTC, Mike Parker wrote:

On Saturday, 22 December 2018 at 16:57:10 UTC, Joakim wrote:

I'm not trying to discuss it with you or the community. I'm 
asking the D team who're making this decision why it's being 
made, despite all the reasoning in that thread, and 
reiterating that it's a bad move. I suspect they're not 
thinking this through, but they can speak for themselves.


The decision was made because your reasoning failed to convince 
anyone involved in the planning that maintaining the current 
format of DConf is a mistake. Nor do they agree with you that 
it's a bad move. We like the current format and see no need to 
change it at this time.


I see, so you admit no reasoning was involved on your part? 
Because you present none, either there or here.


If you would like to carry on another debate about this, please 
open another thread in the General forum. This one isn't the 
place for it. Thanks!


As I just noted, I don't care to "debate" it with people who make 
no arguments. Instead, I'm asking you or whoever made this 
horrible decision why it's being made.


If it's such a great idea, that should be an easy case to make, 
compared to the alternatives given. Yet all I get is a bunch of 
stone-walling, suggesting no reasoning was actually involved, 
just blindly aping others and the past.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-22 Thread Joakim via Digitalmars-d-announce
On Saturday, 22 December 2018 at 16:35:27 UTC, Johannes Loher 
wrote:

On Saturday, 22 December 2018 at 15:11:10 UTC, Joakim wrote:
On Saturday, 22 December 2018 at 14:26:29 UTC, Atila Neves 
wrote:

On Saturday, 22 December 2018 at 13:46:39 UTC, Joakim wrote:
On Saturday, 22 December 2018 at 12:18:25 UTC, Mike Parker 
wrote:


The egregious waste of time and resources of this DConf 
format strongly signals that D is not a serious effort to 
build a used language,


It's the same signal being emitted by all of these "failures" 
as well:


Go: https://twitter.com/dgryski/status/1034939523736600576
Rust: https://rustconf.com/
Clojure: https://clojure.org/community/events
Haskell: https://wiki.haskell.org/Conferences
C++: https://cppcon.org/ https://cpponsea.uk/ 
http://cppnow.org/ https://meetingcpp.com/


etc.

To me it's obvious from that short list, which took me less 
than 5 min to come up with, that conferences aren't a dying 
format. I gave up on C++ conferences after the 4th link; 
there are just too many.


The fact that a short list of conferences still exists at all 
somehow makes it "obvious" to you that they're not dying? Did 
you even look at my second link that actually tallies some 
numbers for a particular tech market?


It is true that a few conferences are still being held; even 
my second link above never said they're _all_ gone. But simply 
saying some are still following this outdated ritual is not an 
argument for continuing it, nor does it contradict anything I 
said about the number of conferences going down.



If you don't like conferences you don't have to go.


This has nothing to do with me: I've never been to DConf or most 
any other tech conference and likely never will. This is about 
whether the D team should be wasting time with this dying 
format.


I for one am excited about being in London in May. Please 
don't sour it for others who think/feel like I do.


Heh, so those are your two big arguments for why the conference 
format should continue: other languages are doing it and you 
want to visit London in May? You are exemplifying the mindset 
that I'm pointing out with these flimsy arguments, everything 
that is wrong with D and DConf.


We talked a great deal about this in your thread 
(https://forum.dlang.org/thread/ogrdeyojqzosvjnth...@forum.dlang.org). I believe the main takeaway from that discussion was that many of us disagree with your opinion to at least some degree.


As I recall, you largely agreed with me:

"I totally agree with you on your first point, i.e. making DConf 
more interactive."


"I disagree with your second point, i.e. decentralising DConf... 
On the other hand, I have to admit that decentralising the event 
would open it up for a much bigger audience, which definitely is 
a good idea."

https://forum.dlang.org/post/omsxuayxkaqbxeobe...@forum.dlang.org

I know that you are very convinced that your idea of how we 
should do DConf is superior, and that is OK. Maybe you are 
just ahead of your time in this case, I don't know. But it is 
also a fact that many people stated that they actually enjoy 
the current DConf format very much and believe it is not a 
waste of time and money at all. So to me, it is no surprise at 
all that it was decided to stick with the current format.


I really don't care how many people agree or disagree. All I care 
about is the reasoning presented. As I see it, I gave lots of 
good reasons, and like Atila here, they gave none: only "I 
enjoyed myself." That's not a worthwhile reason, if the goal is 
to further the D language and community.


Also I don't think this is the right place for this discussion. 
If you feel that we indeed need to rediscuss this issue, I 
think it should be done in a separate thread.


I'm not trying to discuss it with you or the community. I'm 
asking the D team who're making this decision why it's being 
made, despite all the reasoning in that thread, and reiterating 
that it's a bad move. I suspect they're not thinking this 
through, but they can speak for themselves.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-22 Thread Joakim via Digitalmars-d-announce

On Saturday, 22 December 2018 at 14:26:29 UTC, Atila Neves wrote:

On Saturday, 22 December 2018 at 13:46:39 UTC, Joakim wrote:
On Saturday, 22 December 2018 at 12:18:25 UTC, Mike Parker 
wrote:


The egregious waste of time and resources of this DConf format 
strongly signals that D is not a serious effort to build a 
used language,


It's the same signal being emitted by all of these "failures" 
as well:


Go: https://twitter.com/dgryski/status/1034939523736600576
Rust: https://rustconf.com/
Clojure: https://clojure.org/community/events
Haskell: https://wiki.haskell.org/Conferences
C++: https://cppcon.org/ https://cpponsea.uk/ 
http://cppnow.org/ https://meetingcpp.com/


etc.

To me it's obvious from that short list, which took me less than 
5 min to come up with, that conferences aren't a dying format. I 
gave up on C++ conferences after the 4th link; there are just 
too many.


The fact that a short list of conferences still exists at all 
somehow makes it "obvious" to you that they're not dying? Did you 
even look at my second link that actually tallies some numbers 
for a particular tech market?


It is true that a few conferences are still being held; even my 
second link above never said they're _all_ gone. But simply 
saying some are still following this outdated ritual is not an 
argument for continuing it, nor does it contradict anything I 
said about the number of conferences going down.



If you don't like conferences you don't have to go.


This has nothing to do with me: I've never been to DConf or most 
any other tech conference and likely never will. This is about 
whether the D team should be wasting time with this dying format.


I for one am excited about being in London in May. Please don't 
sour it for others who think/feel like I do.


Heh, so those are your two big arguments for why the conference 
format should continue: other languages are doing it and you want 
to visit London in May? You are exemplifying the mindset that I'm 
pointing out with these flimsy arguments, everything that is 
wrong with D and DConf.


Re: DConf 2019: Shepherd's Pie Edition

2018-12-22 Thread Joakim via Digitalmars-d-announce

On Saturday, 22 December 2018 at 12:18:25 UTC, Mike Parker wrote:
Thanks to Symmetry Investments, DConf is heading to London! 
We're still ironing out the details, but I've been sitting on 
this for weeks and, now that we have a venue, I just can't keep 
quiet about it any longer.


I've updated the DConf site and published a blog post, but I 
ask that you please don't share this to reddit just yet. I want 
to wait until after Christmas to share it there. We're still 
ironing out some details (deadlines, prices, hotels) and I'll 
update the DConf site in the coming days with info as I get it.


Happy Holidays!

http://dconf.org/2019/index.html

https://dlang.org/blog/2018/12/22/dconf-2019-shepherds-pie-edition/


Given that this conference format is dying off, is there any 
explanation for why the D team wants to continue this antiquated 
ritual?


https://marco.org/2018/01/17/end-of-conference-era
http://subfurther.com/blog/2018/01/15/the-final-conf-down/
https://forum.dlang.org/thread/ogrdeyojqzosvjnth...@forum.dlang.org

It costs $3k to hire a pull request manager, something D 
desperately needed, yet here you are having the average 
conference participant spend that mostly on flights and hotels to 
go to London, only to stare silently at presentations most of the 
time while surrounded by a room full of people. What priorities 
could make this seem like a good idea?


The egregious waste of time and resources of this DConf format 
strongly signals that D is not a serious effort to build a used 
language, but a hobby project by two tech retirees, W, who just 
want to prototype some different ideas, show it off to a bunch of 
fellow hobbyists, and then have some beers and go sight-seeing.


If this is the core team's goal, please just stop stating 
otherwise and broadcast this on the front page of the website, as 
you're essentially doing by the way this blog post was written. 
Giant companies like Google or Microsoft can afford these 
antiquated, giant wastes of time known as conferences, and even 
they are cutting back. The fact that the D team is moving forward 
with this given how tech is moving is a horrible sign, suggesting 
it is completely out of touch and unable to prioritize well.


Re: LDC 1.13.0

2018-12-19 Thread Joakim via Digitalmars-d-announce

On Sunday, 16 December 2018 at 15:57:25 UTC, kinke wrote:

Glad to announce LDC 1.13:

* Based on D 2.083.1.
* The Windows packages are now fully self-sufficient, i.e., a 
Visual Studio/C++ Build Tools installation isn't required 
anymore.

* Substantial debug info improvements.
* New command-line option `-fvisibility=hidden` to hide 
functions/globals not marked as export, to reduce the size of 
shared libraries.
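As a quick illustration of the new visibility flag, here's a minimal 
sketch (the file and symbol names are made up for illustration, and 
this assumes an LDC 1.13+ `ldc2` on your PATH):

```d
// mylib.d -- sketch: when built with `-fvisibility=hidden`, only
// symbols marked `export` stay visible in the resulting shared library.
export extern(C) int addOne(int x)
{
    return x + 1; // remains exported from the .so
}

int internalHelper(int x)
{
    return x * 2; // hidden from the shared library's dynamic symbol table
}
```

Building with something like `ldc2 -shared -fvisibility=hidden mylib.d` 
should then leave `addOne` in the dynamic symbol table (check with 
`nm -D`) while `internalHelper` is hidden, shrinking the exported surface.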


Full release log and downloads: 
https://github.com/ldc-developers/ldc/releases/tag/v1.13.0


New Wiki page highlighting cross-compilation: 
https://wiki.dlang.org/Cross-compiling_with_LDC


Thanks to all contributors!


Native Android packages for the Termux app have been updated, 
including an Android/x64 package for the first time (with the 
std.variant issue from the last beta now fixed). While no Android 
device uses x64, many x64 and AArch64 Chromebooks support 
installing Android apps like Termux, so if you have a Chromebook, 
you can now start writing and compiling D code on there too. :)


https://medium.com/@clumsycontraria/learning-to-code-on-a-bone-stock-chromebook-a7d0e75303bb

An Alpine build of ldc for Docker containers and microservices is 
also up now.


Re: Liran Zvibel of WekaIO on using D to Create the World’s Fastest File System

2018-12-09 Thread Joakim via Digitalmars-d-announce

On Wednesday, 5 December 2018 at 19:59:46 UTC, Joakim wrote:
On Wednesday, 5 December 2018 at 09:04:49 UTC, Walter Bright 
wrote:

#4 on HackerNews front page!

https://news.ycombinator.com/

33 points at the moment!


Now one of the top-voted links on the front page of HN.

I'd just like to point out that Andrei put Liran and me together 
to do this interview in summer '17 (though I had emailed Liran 
about doing one in '15 and never got a response) and he finally 
got some time to respond this summer.


Now I just need Andrei to finally do an interview, which he's 
been putting off for even longer. :)


Got to fifth highest-voted link all-time from dlang.org on HN:

https://hn.algolia.com/?sort=byPopularity&prefix=false&page=0&dateRange=all&type=story&storyText=false&query=dlang.org


Re: I've just released Vasaro

2018-12-07 Thread Joakim via Digitalmars-d-announce
On Thursday, 6 December 2018 at 20:45:07 UTC, Andrea Fontana 
wrote:

Hi!

I've just released the first version of vasaro.
It's a simple program I wrote to create 3d printable vases.

It's written in D (of course). It uses derelict-gl, 
derelict-sdl and gtkd.


It should work on linux, macOS and Windows.

A special thanks to Adam Ruppe and his SimpleDisplay library I 
used on earlier versions :)


A demo video:
https://www.youtube.com/watch?v=HkYo8WCW9jM

On reddit: 
https://www.reddit.com/r/3Dprinting/comments/a3qykj/ive_just_released_vasaro_the_easytouse_software/


On github:
https://github.com/trikko/vasaro/


Feel free to test it!

Andrea


Nice, when does the version with genetic algorithms come out? ;)

https://www.economist.com/technology-quarterly/2015/09/03/wonderful-widgets

JK, of course, demo looks good.


Re: Liran Zvibel of WekaIO on using D to Create the World’s Fastest File System

2018-12-05 Thread Joakim via Digitalmars-d-announce
On Wednesday, 5 December 2018 at 09:04:49 UTC, Walter Bright 
wrote:

#4 on HackerNews front page!

https://news.ycombinator.com/

33 points at the moment!


Now one of the top-voted links on the front page of HN.

I'd just like to point out that Andrei put Liran and me together 
to do this interview in summer '17 (though I had emailed Liran 
about doing one in '15 and never got a response) and he finally 
got some time to respond this summer.


Now I just need Andrei to finally do an interview, which he's 
been putting off for even longer. :)


D is in GCC 9 proggit thread

2018-12-05 Thread Joakim via Digitalmars-d-announce

For those who missed it:

https://www.reddit.com/r/programming/comments/a30hg9/gcc_9_adds_frontend_support_for_the_d_programming/


Re: Interview with Liran Zvibel of WekaIO

2018-12-05 Thread Joakim via Digitalmars-d-announce

On Wednesday, 5 December 2018 at 13:30:21 UTC, Joakim wrote:

On Wednesday, 5 December 2018 at 08:02:21 UTC, M.M. wrote:

On Tuesday, 4 December 2018 at 14:21:02 UTC, Mike Parker wrote:

[...]


Interesting read. I am new to dlang, and after reading the 
post, I asked myself: the company liked the language, but 
tweaked the compiler. Could the company now switch to one of 
the official compilers? If not, why?


All three compilers listed on the official download page use 
the same frontend, written in D:


https://dlang.org/download

The LDC and GDC teams take that DMD frontend and attach it to 
the LLVM and GCC code-generation backends.


As for Weka's tweaks, github shows these different commits from 
their last 1.11 release to the official tag:


https://github.com/ldc-developers/ldc/compare/v1.11.0...weka-io:weka-2.071


Sorry, I compared the wrong Weka branch. Here's the right tag, 
which shows fewer differing commits:

https://github.com/ldc-developers/ldc/compare/v1.11.0...weka-io:weka-v1.11


Re: Interview with Liran Zvibel of WekaIO

2018-12-05 Thread Joakim via Digitalmars-d-announce

On Wednesday, 5 December 2018 at 08:02:21 UTC, M.M. wrote:

On Tuesday, 4 December 2018 at 14:21:02 UTC, Mike Parker wrote:
Joakim interviewed Liran for the D Blog about their file 
system, Matrix, and their use of D. Thanks to Joakim for 
putting it together, and to Liran for taking the time to 
participate!


Blog:
https://dlang.org/blog/2018/12/04/interview-liran-zvibel-of-wekaio/

Reddit:
https://www.reddit.com/r/programming/comments/a3106x/interview_liran_zvibel_of_wekaio/


Interesting read. I am new to dlang, and after reading the 
post, I asked myself: the company liked the language, but 
tweaked the compiler. Could the company now switch to one of 
the official compilers? If not, why?


All three compilers listed on the official download page use the 
same frontend, written in D:


https://dlang.org/download

The LDC and GDC teams take that DMD frontend and attach it to the 
LLVM and GCC code-generation backends.


As for Weka's tweaks, github shows these different commits from 
their last 1.11 release to the official tag:


https://github.com/ldc-developers/ldc/compare/v1.11.0...weka-io:weka-2.071

I get the sense that's mostly patches backported from newer LDC 
releases, as they understandably go slower than official LDC for 
stability, and some git cruft from maintaining their own branch. 
Their tweaks don't appear to be substantial on a skim, which 
makes sense since Johan is a committer on the LDC team.


Since LDC is an OSS project, they're free to tweak it for their 
own use and use it as they like. Johan has done much work for 
them which they've contributed back upstream to LDC. See Johan's 
blog posts for more info:


http://johanengelen.github.io


Re: Liran Zvibel of WekaIO on using D to Create the World’s Fastest File System

2018-12-05 Thread Joakim via Digitalmars-d-announce
On Wednesday, 5 December 2018 at 09:04:49 UTC, Walter Bright 
wrote:

#4 on HackerNews front page!

https://news.ycombinator.com/

33 points at the moment!


It's on lobste.rs now too:

https://lobste.rs/t/d

Thanks, Atila!


Re: Liran Zvibel of WekaIO on using D to Create the World’s Fastest File System

2018-12-05 Thread Joakim via Digitalmars-d-announce
On Wednesday, 5 December 2018 at 09:04:49 UTC, Walter Bright 
wrote:

#4 on HackerNews front page!

https://news.ycombinator.com/

33 points at the moment!


Fantastic, I want to get more commercial uses like this 
highlighted on the blog. I've started another interview with a 
financial/ML firm using D, so nobody can say D isn't being used.


BTW, the top HN comment asks for more detail: tell him to click 
on the DConf links in the post for Liran's slides and videos. 
There's a surfeit of tech detail there; this is just an overview 
to get people started on learning more.


Re: Interview with Liran Zvibel of WekaIO

2018-12-04 Thread Joakim via Digitalmars-d-announce

On Wednesday, 5 December 2018 at 06:50:13 UTC, Mike Parker wrote:

On Tuesday, 4 December 2018 at 17:15:44 UTC, Joakim wrote:

On Tuesday, 4 December 2018 at 14:21:02 UTC, Mike Parker wrote:




Great to see this finally up! I agree with the only proggit 
comment though: the title is not descriptive enough for 
reddit/HN, as I doubt most have heard of Liran or Weka.


It's on HN now under a better title.


Thanks, let's see if it does any better. I asked Atila to submit 
it to lobste.rs too, his mutex post did well on there last month:


https://lobste.rs/t/d


Re: Interview with Liran Zvibel of WekaIO

2018-12-04 Thread Joakim via Digitalmars-d-announce

On Tuesday, 4 December 2018 at 14:21:02 UTC, Mike Parker wrote:
Joakim interviewed Liran for the D Blog about their file 
system, Matrix, and their use of D. Thanks to Joakim for 
putting it together, and to Liran for taking the time to 
participate!


Blog:
https://dlang.org/blog/2018/12/04/interview-liran-zvibel-of-wekaio/

Reddit:
https://www.reddit.com/r/programming/comments/a3106x/interview_liran_zvibel_of_wekaio/


Great to see this finally up! I agree with the only proggit 
comment though: the title is not descriptive enough for 
reddit/HN, as I doubt most have heard of Liran or Weka.


Re: LDC 1.13.0-beta2

2018-11-29 Thread Joakim via Digitalmars-d-announce

On Thursday, 22 November 2018 at 16:54:55 UTC, Joakim wrote:

On Thursday, 22 November 2018 at 16:36:22 UTC, H. S. Teoh wrote:
On Thu, Nov 22, 2018 at 01:25:53PM +, Joakim via 
Digitalmars-d-announce wrote:

On Wednesday, 21 November 2018 at 10:43:55 UTC, kinke wrote:
> Glad to announce the second beta for LDC 1.13:
> 
> * Based on D 2.083.0+ (yesterday's DMD stable).

[...]
I've added native builds for Android, including 
Android/x86_64 for the first time. Several tests for 
std.variant segfault, likely because of the 128-bit real 
causing x64 codegen issues, but most everything else passes.

[...]

What's the status of cross-compiling to 64-bit ARM?  On the 
wiki you wrote that it doesn't fully work yet.  Does it work 
with this new release?


It's been mostly working since 1.11. That note on the wiki 
links to this tracker issue that lists the few remaining holes, 
mostly just extending Phobos support for 80-bit precision out 
to full 128-bit Quadruple precision in a few spots and 
finishing off the C/C++ compatibility:


https://github.com/ldc-developers/ldc/issues/2153


Btw, if you ever want to check the current status of the AArch64 
port, all you have to do is look at the logs for the latest run 
of the ldc AArch64 CI, which kinke set up and which runs for 
every ldc PR, on this dashboard:


https://app.shippable.com/github/ldc-developers/ldc/dashboard

Clicking on the last job on the master branch, expanding the 
build_ci output in the log, then doing the same for the stdlib 
tests, I see only five Phobos modules with failing tests. Three 
are mentioned in the tracker issue above, while std.complex has a 
single assert that trips, because it's a few bits off at 113-bit 
precision, which is still much more accurate than the 64-bit 
precision (or less) it's normally run at on x86/x64.


Also, a single assert in std.algorithm.sorting trips for the same 
reason as a handful of tests in std.math: -real.nan at 
compile-time is output as real.nan by ldc running natively on 
AArch64, though not when cross-compiling. 
std.internal.math.gammafunction works fine at 64-bit precision on 
AArch64, but only a couple of the 100 or so constant reals it 
uses are specified to full 113-bit precision, so several asserts 
trip that allow only a couple of bits to be off from full real 
precision. Obviously, that only matters if you need full 113-bit 
precision from that module.
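
The precision mismatch is easy to see from D's built-in 
floating-point properties; a minimal sketch, where the exact 
values printed depend on the target:

```d
import std.stdio;

void main()
{
    // On x86/x64, `real` is typically the 80-bit x87 type
    // (64-bit mantissa); on AArch64 it is IEEE quadruple
    // precision (113-bit mantissa), which is why tests
    // tolerate a couple of bits of difference between targets.
    writeln("real.sizeof:   ", real.sizeof);
    writeln("real.mant_dig: ", real.mant_dig); // 64 on x86/x64, 113 on AArch64
    writeln("real.epsilon:  ", real.epsilon);
}
```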


kinke recently disabled the tests for core.thread on the CI 
because they're super-flaky on linux/glibc/AArch64, while I 
haven't had that problem with Bionic/AArch64. You will see more 
tests failing if you cross-compile from x64, because of the 
mismatch between 64-bit precision for compile-time reals and 
113-bit precision for runtime reals on AArch64. Also, you can see 
the 10-12 modules that assert in the dmd compiler testsuite 
earlier in that log, most because of missing 
core.stdc.stdarg.va_arg support to call C varargs on AArch64.


That's about it: help is appreciated on tightening those last few 
screws.


Re: D compilation is too slow and I am forking the compiler

2018-11-27 Thread Joakim via Digitalmars-d-announce

On Monday, 26 November 2018 at 16:42:40 UTC, bachmeier wrote:

On Monday, 26 November 2018 at 16:21:39 UTC, Joakim wrote:

I agree that it was a risky title, as many who don't know D 
will simply see it and go, "Yet another slow compiler, eh, 
I'll pass" and not click on the link. Whereas others who have 
heard something of D will be intrigued, as they know it's 
already supposed to compile fast. And yet more others will 
click on it purely for the controversy, just to gawk at some 
technical bickering.


I don't actually think it was risky. What are the odds that 
someone was going to start using D for a major project but then 
changed her mind upon seeing a title on HN or Reddit? Probably 
very small that even one person did that.


Yes, but you're ignoring the much larger group I mentioned, those 
who have only vaguely heard of D, if at all, and the negative 
title gives them a reason not to look into it further.


And then there is always the fact that there was a story on 
HN/Reddit about D. It's hard for publicity for a language like 
D to be bad when so few people use it.


The quote that "there's no such thing as bad publicity" comes 
from art and show business, though; I don't think it's true for 
tech and other markets. When your audience is looking for a tool 
and not entertainment, there are lots of ways for bad publicity 
to sink it.


Anyway, I noted in this case that the provocative title may 
actually have gotten more people to read a positive post, so the 
pros likely outweighed the cons. We can just never know how large 
the unclicked-on downside was: you can never measure how many 
people heard of but _didn't_ buy your book, because they didn't 
like the title or something else about its exterior.


On Monday, 26 November 2018 at 16:53:59 UTC, Guillaume Piolat 
wrote:

On Monday, 26 November 2018 at 16:21:39 UTC, Joakim wrote:
In my opinion, language adoption is a seduction/sales process 
very much like business-to-consumer sales; the way I see it, it's 
strikingly similar to marketing B2C apps, except there is no 
"impulse buy".


I find that hard to believe: we are talking about a technical 
tool here.


How many times have you been in this conversation:

--

- What language are you using?
- D.
- I know next to nothing about D.
- Oh, it's very good, I even built a business on it! <list of arguments and features>.
- Oh no thanks. I should try Rust, it's secure, fast, modern 
blah blah; facts don't matter to me. But in reality I won't 
even learn a new language, I'm happy with a language without 
multi-threading.


--

It happens to me ALL THE TIME.
This pattern is so predictable it's becoming boring so now I 
just keep silent.


Never, I don't go around trying to convince people one-on-one to 
use D. I have given talks to groups introducing the language, 
that's how I go about it.



What happens? Rust / Go have outmarketed us with words.

The battle (of marketing) is over words, not technical features. 
Rust happens to own "programming language" + "safety"; what do 
we own? D is good in all kinds of directions, and the marketing 
message is less simple.


The leaders chose to own the word "fast" (see our new motto 
"fast code, fast", which is very accurate), and it's important to 
get aligned.


I'll note that in your example they haven't actually learnt Rust 
either. I don't think marketing is that relevant for D at this 
stage, nor for Rust/Go either.


The way anything- tech, fashion, TV shows- becomes popular is 
that some early tastemaker decides that it's worth using or 
backing. Eventually, enough early adopters find value that it 
spreads out to the majority, who simply follow their lead.


Most people aren't early adopters of most things. They like to 
think they are, so they'll give you all kinds of 
rational-sounding reasons for why they don't like some new tech, 
but the real underlying thought process goes something like this, 
"I have no idea if this new tech will do well or not. I could 
take a risk on it but it's safer not to, so I will just wait and 
see if it gets popular, then follow the herd."


Very few will admit this though, hence the list of 
plausible-sounding reasons that don't actually make sense! ;)


As Laeeth always says, you're best off looking for people who're 
actually capable and empowered to make such risky decisions, 
rather than aiming for the majority too early, because they only 
jump on board once the bandwagon is stuffed and rolling downhill.


Also, regardless of how languages are chosen as they get into 
the majority, D is very much still in the 
innovators/early-adopters stage:


But the current state of D would very much accommodate the 
middle-of-the-curve adopters. The language rarely breaks stuff. 
People are making money with it and making long-term bets.


Hell, I could make a laundry list of things that are better in 
D versus any alternatives! That doesn't bring users.


I'm not talking about the quality of 

Re: D compilation is too slow and I am forking the compiler

2018-11-26 Thread Joakim via Digitalmars-d-announce
On Monday, 26 November 2018 at 16:00:36 UTC, Guillaume Piolat 
wrote:
On Thursday, 22 November 2018 at 04:48:09 UTC, Vladimir 
Panteleev wrote:
On Wednesday, 21 November 2018 at 20:51:17 UTC, Walter Bright 
wrote:
Unfortunately, you're right. The title will leave the 
impression "D is slow at compiling". You have to carefully 
read the article to see otherwise, and few will do that.


Sorry about that. I'll have to think of two titles next time, 
one for the D community and one for everyone else.


If it's of any consolation, the top comments in both 
discussion threads point out that the title is inaccurate on 
purpose.


Please don't get me wrong, it's an excellent article, a 
provocative title, and fantastic work going on. I didn't meant 
to hurt!


In my opinion, language adoption is a seduction/sales process 
very much like business-to-consumer sales; the way I see it, it's 
strikingly similar to marketing B2C apps, except there is no 
"impulse buy".


I find that hard to believe: we are talking about a technical 
tool here.


Also, regardless of how languages are chosen as they get into the 
majority, D is very much still in the innovators/early-adopters 
stage:


https://en.m.wikipedia.org/wiki/Technology_adoption_life_cycle

That is a very different type of sales process, much more geared 
towards what the new tech can actually do.


Actually, no less than 3 programmer friends came to me (I'm the 
weirdo-using-D, and people are _always_ in disbelief and invent 
all sorts of reasons not to try it) saying they saw an article on 
D on HN, with "D compilation is slow", and on further 
examination they hadn't read it, or at best the first paragraph. 
But they did remember the title. They may rationally think 
their opinion of D hasn't changed: aren't we highly capable 
people?


With people like that, it's almost impossible to get them in the 
early adopter stage. They will only jump on the bandwagon once 
it's full, ie as part of the late majority.



I'm not making that up! So why is it a problem?

HN may be the only time they hear about D. The words of the 
title may be their only contact with it. The first 3 words of 
the title may be the only thing associated with the "D 
language" chunk in their brain.


The associative mind doesn't know _negation_ so even a title 
like "D compilation wasn't fast so I forked the compiler" is 
better from a marketing point of view since it contains the 
word "fast" in it! That's why marketing people have the 
annoying habit of using positive words, you may think this 
stuff is unimportant but this is actually the important meat.


Reasonable people may think marketing and biases don't apply to 
them, but they do; it works without your consent.


I agree that it was a risky title, as many who don't know D will 
simply see it and go, "Yet another slow compiler, eh, I'll pass" 
and not click on the link. Whereas others who have heard 
something of D will be intrigued, as they know it's already 
supposed to compile fast. And yet more others will click on it 
purely for the controversy, just to gawk at some technical 
bickering.


Given how well it did on HN/reddit/lobste.rs, I think Vlad's 
gamble probably paid off. We can't run the counterfactual of 
choosing a safer title to see if it would have done even better, 
let's just say it did well enough. ;)


Re: LDC 1.13.0-beta2

2018-11-22 Thread Joakim via Digitalmars-d-announce

On Thursday, 22 November 2018 at 16:36:22 UTC, H. S. Teoh wrote:
On Thu, Nov 22, 2018 at 01:25:53PM +, Joakim via 
Digitalmars-d-announce wrote:

On Wednesday, 21 November 2018 at 10:43:55 UTC, kinke wrote:
> Glad to announce the second beta for LDC 1.13:
> 
> * Based on D 2.083.0+ (yesterday's DMD stable).

[...]
I've added native builds for Android, including Android/x86_64 
for the first time. Several tests for std.variant segfault, 
likely because of the 128-bit real causing x64 codegen issues, 
but most everything else passes.

[...]

What's the status of cross-compiling to 64-bit ARM?  On the 
wiki you wrote that it doesn't fully work yet.  Does it work 
with this new release?


It's been mostly working since 1.11. That note on the wiki links 
to this tracker issue that lists the few remaining holes, mostly 
just extending Phobos support for 80-bit precision out to full 
128-bit Quadruple precision in a few spots and finishing off the 
C/C++ compatibility:


https://github.com/ldc-developers/ldc/issues/2153






Re: LDC 1.13.0-beta2

2018-11-22 Thread Joakim via Digitalmars-d-announce

On Wednesday, 21 November 2018 at 10:43:55 UTC, kinke wrote:

Glad to announce the second beta for LDC 1.13:

* Based on D 2.083.0+ (yesterday's DMD stable).
* The Windows packages are now fully self-sufficient, i.e., a 
Visual Studio/C++ Build Tools installation isn't required 
anymore.

* Substantial debug info improvements.
* New command-line option `-fvisibility=hidden` to hide 
functions/globals not marked as export, to reduce the size of 
shared libraries.


Full release log and downloads: 
https://github.com/ldc-developers/ldc/releases/tag/v1.13.0-beta2


Thanks to all contributors!


I've added native builds for Android, including Android/x86_64 
for the first time. Several tests for std.variant segfault, 
likely because of the 128-bit real causing x64 codegen issues, 
but most everything else passes.


This means that if you have an x86 or x64 Chromebook that 
supports running Android apps, you can install the Termux app and 
compile D code on there:


https://nosarthur.github.io/coding/2018/01/15/termux.html


Re: DMD backend now in D

2018-11-13 Thread Joakim via Digitalmars-d-announce

On Tuesday, 13 November 2018 at 20:42:00 UTC, Temtaime wrote:
On Monday, 12 November 2018 at 02:37:54 UTC, Walter Bright 
wrote:

On 11/11/2018 3:58 PM, Mike Franklin wrote:

This is a significant milestone.  Congratulations, Walter!


Many people helped out with this, too.

There are still a few .c files in 
https://github.com/dlang/dmd/tree/master/src/dmd/backend, so 
what's the significance of those?


tk.c
fp.c
os.c
strtold.c
tk/mem.c

These could be converted too, but are independent from 
everything else and hardly seem worth the bother. Sebastian 
has a PR for os.c.



Will there ever be a day when we no longer need a C/C++ 
compiler to build DMD?


Sure.



No, as phobos is dependent on C libraries such as a zlib for 
example.


DMD doesn't use Phobos.


Also D is dependent on libc.


It's possible to reimplement the subset of libc functions that 
DMD depends on in D.
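
As a sketch of what that could look like (a hypothetical 
illustration, not code from DMD or druntime), a C-ABI routine 
like strlen takes only a few lines of D:

```d
// Hypothetical D replacement for one libc routine; extern(C)
// preserves the C name and ABI so existing call sites keep working.
extern(C) size_t strlen(scope const(char)* s) @nogc nothrow pure
{
    size_t n;                 // defaults to 0
    while (s[n] != '\0')
        ++n;
    return n;
}

unittest
{
    // D string literals are zero-terminated, so .ptr is a valid C string.
    assert(strlen("dlang".ptr) == 5);
}
```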


Re: DIP 1015--Deprecation of Implicit Conversion of Int. & Char. Literals to bool--Formal Assement

2018-11-12 Thread Joakim via Digitalmars-d-announce

On Monday, 12 November 2018 at 17:25:15 UTC, Johannes Loher wrote:

On Monday, 12 November 2018 at 16:39:47 UTC, Mike Parker wrote:
Walter and Andrei take the position that this is the wrong way 
to view a bool.


Unfortunately you did not include their justification for this 
position (if any). To me it would be interesting to know about 
the reasoning that is behind this position.


Maybe you didn't read the link to their reasoning in the DIP, but 
it's quite simple: they view a bool as an integral type with two 
possible values, a `bit` if you like. As such, they prefer to fit 
it into the existing scheme for integral types rather than 
special-casing booleans as Mike proposed.
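
A small sketch of the behavior under discussion, along the lines 
of DIP 1015's motivating case (the comments state the overload 
resolution the DIP describes):

```d
import std.stdio;

void f(bool b) { writeln("bool overload"); }
void f(long l) { writeln("long overload"); }

void main()
{
    // With bool viewed as a two-valued integral type, the
    // literals 0 and 1 convert to bool exactly, and partial
    // ordering makes the bool overload the better match.
    f(0);
    f(1);
    // 2 does not fit in a bool, so the long overload is chosen.
    f(2);
}
```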


Re: The New Fundraising Campaign

2018-11-10 Thread Joakim via Digitalmars-d-announce

On Saturday, 10 November 2018 at 16:09:12 UTC, Mike Parker wrote:
I've just published a new blog post describing our new 
fundraising campaign. TL;DR: We want to pay a Pull Request 
Manager to thin out the pull request queues and coordinate 
between relevant parties on newer pull requests so they don't 
go stale. We've launched a three-month campaign, and Nicholas 
Wilson has agreed to do the work.


We have high hopes that this will help reduce frustration for 
current and future contributors. And we will be grateful for 
your support in making it happen.


Please read the blog post for more details:

https://dlang.org/blog/2018/11/10/the-new-fundraising-campaign/

For the impatient:
https://www.flipcause.com/secure/cause_pdetails/NDUwNTY=


"Walter and Andrei both" -> both Walter and Andrei
"Pull requests were" -> Pull Requests (PRs) were
"the list. The one linked above, for example." -> the list, for 
example, the one linked above.


Nice work setting this up, looking forward to many more targeted 
campaigns like this.


Re: Profiling DMD's Compilation Time with dmdprof

2018-11-08 Thread Joakim via Digitalmars-d-announce

On Thursday, 8 November 2018 at 08:29:28 UTC, Manu wrote:
On Thu, Nov 8, 2018 at 12:10 AM Joakim via 
Digitalmars-d-announce  
wrote:


On Thursday, 8 November 2018 at 07:54:56 UTC, Manu wrote:
> On Wed, Nov 7, 2018 at 10:30 PM Vladimir Panteleev via 
> Digitalmars-d-announce 
>  wrote:

>>
>> On Thursday, 8 November 2018 at 06:08:20 UTC, Vladimir 
>> Panteleev wrote:
>> > It was definitely about 4 seconds not too long ago, a few 
>> > years at most.

>>
>> No, it's still 4 seconds.
>>
>> digger --offline --config-file=/dev/null -j auto -c 
>> local.cache=none build 7.31s user 1.51s system 203% cpu 
>> 4.340 total

>>
>> > It does seem to take more time now; I wonder why.
>>
>> If it takes longer, then it's probably because it's being 
>> built in one CPU core, or in the release build.

>
> https://youtu.be/msWuRlD3zy0

Lol, I saw that link and figured it was either some comedy 
video, like the Python ones Walter sometimes posts, or you 
were actually showing us how long it takes. Pretty funny to 
see the latter.


It's not so funny when every one-line tweak burns 2 minutes of 
my life away.


I was laughing that you actually proved your point with direct 
video evidence, obviously it's sad that it takes so long.



> DMD only builds with one core, since it builds altogether.

Yes, but your build time is unusually long even with one core. 
Are the D backend and frontend at least built in parallel to 
each other?


That doesn't matter, you can clearly see the backend built in 
less than 2 seconds.


The C/C++ files in the beginning are built very fast, but the D 
files in the backend appear to take much longer, kicking in at 
1:18 of your video and then the next compilation step starts at 
1:40.


I suspect part of the problem is that your build is being done 
completely serially, even for separate compilation. I have no 
experience with VS, so I don't know why that is.


It doesn't seem to be even doing that, though they're separate 
invocations of DMD.


I didn't configure the build infrastructure!


Maybe you can? I have no experience with VS, but surely it has 
some equivalent of ninja -j5?



> And all builds are release builds... what good is a debug
> build? DMD
> is unbelievably slow in debug. If it wasn't already slow
> enough... if
> I try and build with a debug build, it takes closer to 5
> minutes.
>
> I suspect one part of the problem is that DMD used to be 
> built with a C compiler, and now it's built with DMD... it 
> really should be built with LDC at least?


Could be part of the problem on Windows, dunno.


Well... ffs... people need to care about this! >_<


I agree that the official release of DMD for Windows should be 
faster, and we should be building it with ldc... if that's the 
problem.


Re: Backend nearly entirely converted to D

2018-11-08 Thread Joakim via Digitalmars-d-announce

On Wednesday, 7 November 2018 at 21:40:58 UTC, welkam wrote:

On Wednesday, 7 November 2018 at 14:39:55 UTC, Joakim wrote:


I don't know why you think that would matter: I'm using the 
same compilers to build each DMD version and comparing the 
build times as the backend was translated to D


What you compared is whether clang or DMD compiles code faster, 
not whether D code compiles faster than C++. To check that, you 
should compile both the C++ and the D with the same backend.


I'm not making any general statements about whether C++ or D 
compiles faster, only pointing out that in a common setup of 
building dmd with clang and dmd on linux/x64, I didn't see much 
of a speed gain. However, I did mention that the frontend should 
be removed to really measure the backend conversion, so that's 
what I just did.


I built the backends for DMD 2.080.1 through master in the same 
single-core VPS by slightly modifying src/posix.mak, only 
replacing the line "all: $G/dmd" with "all: $G/backend.a". Here 
are the results I got and how many D files were built in each 
backend:


2.080.1 - 1D  8.0s
2.081.2 - 4D  7.2s
2.082.1 - 27D 6.9s
2.083.0 - 45D 5.6s
master d398d8c - 50D 4.3s

So the frontend might have been obscuring things, as we see a 
clear win from moving the backend to D, with only about 10 C/C++ 
files left in the backend now and compilation time cut almost in 
half. I think we'll see even more of a gain if the D files in the 
backend are built all at once.


Re: Profiling DMD's Compilation Time with dmdprof

2018-11-08 Thread Joakim via Digitalmars-d-announce

On Thursday, 8 November 2018 at 07:54:56 UTC, Manu wrote:
On Wed, Nov 7, 2018 at 10:30 PM Vladimir Panteleev via 
Digitalmars-d-announce  
wrote:


On Thursday, 8 November 2018 at 06:08:20 UTC, Vladimir 
Panteleev wrote:
> It was definitely about 4 seconds not too long ago, a few 
> years at most.


No, it's still 4 seconds.

digger --offline --config-file=/dev/null -j auto -c 
local.cache=none build 7.31s user 1.51s system 203% cpu 
4.340 total


> It does seem to take more time now; I wonder why.

If it takes longer, then it's probably because it's being 
built in one CPU core, or in the release build.


https://youtu.be/msWuRlD3zy0


Lol, I saw that link and figured it was either some comedy video, 
like the Python ones Walter sometimes posts, or you were actually 
showing us how long it takes. Pretty funny to see the latter.



DMD only builds with one core, since it builds altogether.


Yes, but your build time is unusually long even with one core. 
Are the D backend and frontend at least built in parallel to each 
other? It doesn't seem to be even doing that, though they're 
separate invocations of DMD.


And all builds are release builds... what good is a debug 
build? DMD
is unbelievably slow in debug. If it wasn't already slow 
enough... if
I try and build with a debug build, it takes closer to 5 
minutes.


I suspect one part of the problem is that DMD used to be built 
with a C compiler, and now it's built with DMD... it really 
should be built with LDC at least?


Could be part of the problem on Windows, dunno.


Re: Profiling DMD's Compilation Time with dmdprof

2018-11-07 Thread Joakim via Digitalmars-d-announce

On Thursday, 8 November 2018 at 07:41:58 UTC, Manu wrote:
On Wed, Nov 7, 2018 at 10:30 PM Joakim via 
Digitalmars-d-announce  
wrote:


On Thursday, 8 November 2018 at 04:16:44 UTC, Manu wrote:
> On Tue, Nov 6, 2018 at 10:05 AM Vladimir Panteleev via 
> Digitalmars-d-announce 
>  wrote:

>> [...]
>
> "Indeed, a clean build of DMD itself (about 170’000 lines of 
> D and 120’000 lines of C/C++) takes no longer than 4 seconds 
> to build on a rather average developer machine."

>
> ...what!? DMD takes me... (compiling) ... 1 minute 40 
> seconds to build! And because DMD does all-files-at-once 
> compilation, rather than separate compilation for each 
> source file, whenever you change just one line in one file, 
> you incur that entire build time, every time, because it 
> can't just rebuild the one source file that changed. You 
> also can't do multi-processor builds with all-in-one build 
> strategies.

>
> 4 seconds? That's just untrue. D is actually kinda slow 
> these days... In my experience it's slower than modern C++ 
> compilers by quite a lot.


It sounds like you're not using "a rather average developer 
machine" then, as there's no way DMD should be that slow to 
build on a core i5 or better:


https://forum.dlang.org/post/rqukhkpxcvgiefrdc...@forum.dlang.org


I'm on an i7 with 8 threads and plenty of ram... although 
threads are useless, since DMD only uses one ;)


Running Windows XP? ;) That does sound like Windows though, as I 
do remember being surprised at how long dmd took to build on Win7 
when I tried it 8-9 years back. I still don't think the toolchain 
should be _that_ much slower than on linux, though.


Btw, the extra cores are _not_ useless for the DMD backend, which 
has always used separate compilation, whether written in C++ or D.


Re: Profiling DMD's Compilation Time with dmdprof

2018-11-07 Thread Joakim via Digitalmars-d-announce

On Thursday, 8 November 2018 at 04:16:44 UTC, Manu wrote:
On Tue, Nov 6, 2018 at 10:05 AM Vladimir Panteleev via 
Digitalmars-d-announce  
wrote:

[...]


"Indeed, a clean build of DMD itself (about 170’000 lines of D 
and 120’000 lines of C/C++) takes no longer than 4 seconds to 
build on a rather average developer machine."


...what!? DMD takes me... (compiling) ... 1 minute 40 seconds 
to build! And because DMD does all-files-at-once compilation, 
rather than separate compilation for each source file, whenever 
you change just one line in one file, you incur that entire 
build time, every time, because it can't just rebuild the one 
source file that changed. You also can't do multi-processor 
builds with all-in-one build strategies.


4 seconds? That's just untrue. D is actually kinda slow these 
days... In my experience it's slower than modern C++ compilers 
by quite a lot.


It sounds like you're not using "a rather average developer 
machine" then, as there's no way DMD should be that slow to build 
on a core i5 or better:


https://forum.dlang.org/post/rqukhkpxcvgiefrdc...@forum.dlang.org


Re: Backend nearly entirely converted to D

2018-11-07 Thread Joakim via Digitalmars-d-announce

On Wednesday, 7 November 2018 at 15:12:13 UTC, Dukc wrote:

On Wednesday, 7 November 2018 at 14:39:55 UTC, Joakim wrote:
I don't know why you think that would matter: I'm using the 
same compilers to build each DMD version and comparing the 
build times as the backend was translated to D.


Because generally, LLVM compilers provide faster code, but 
compile slower than Digital Mars compilers AFAIK. So if you 
compile the D code with DMD but C code with LDC, the program 
will likely compile faster but execute slower as increasing 
portions are written in D, compared to using the same backend 
for both languages.


I'm not sure if you benchmarked the time used to build DMD, or 
the time used by generated DMD to compile some other program. 
If it was the former, the "real" result is probably worse than 
your results. But if it was the latter, it is likely better.


The former, if it wasn't clear. It's also possible something 
slowed down building the frontend in successive DMD versions, so 
ideally I'd only time building the backend for each DMD version, 
but I haven't looked into that.


Re: Backend nearly entirely converted to D

2018-11-07 Thread Joakim via Digitalmars-d-announce

On Wednesday, 7 November 2018 at 11:22:13 UTC, Dukc wrote:

On Wednesday, 7 November 2018 at 08:31:21 UTC, Joakim wrote:
I just benchmarked building the last couple versions of DMD, 
when most of the backend was converted to D, by building them 
with the latest DMD 2.083.0 official release and clang 6.0 in 
a single-core linux/x64 VPS. Here are the times I got, best of 
3 runs for each:


2.081.2 - 11.5s
2.082.1 - 10.5s
2.083.0 - 9.9s
master  - 10.8s

Not quite the gains hoped for, particularly with those last 
large files you just converted to D seemingly slowing 
compilation down


Could this be because you used a LLVM compiler for the C code 
but a Mars compiler for D code? If one either uses DMC for C or 
LDC for D, perhaps the results will be better.


I don't know why you think that would matter: I'm using the same 
compilers to build each DMD version and comparing the build times 
as the backend was translated to D. Maybe I'd get different 
results by using different compilers, but these are two fairly 
fast and commonly used compilers so they're worth checking with.


Re: Backend nearly entirely converted to D

2018-11-07 Thread Joakim via Digitalmars-d-announce

On Tuesday, 6 November 2018 at 22:12:02 UTC, Walter Bright wrote:

With the recent merging of the last of the big files machobj.d:

https://github.com/dlang/dmd/pull/8911

I'm happy to say we're over the hump in converting the backend 
to D!


Great! Although I wish it didn't have to be you mostly doing this 
grunt work.


Remaining files are minor: tk.c, cgen.c, dt.c, fp.c, os.c, 
outbuf.c, sizecheck.c, strtold.c and mem.c. I'll probably leave 
a couple in C anyway - os.c and strtold.c. sizecheck.c will 
just go away upon completion.


Thanks to everyone who helped out with this!

Of course, the code remains as ugly as it was in C. It'll take 
time to bit by bit refactor it into idiomatic D.


I just benchmarked building the last couple versions of DMD, when 
most of the backend was converted to D, by building them with the 
latest DMD 2.083.0 official release and clang 6.0 in a 
single-core linux/x64 VPS. Here are the times I got, best of 3 
runs for each:


2.081.2 - 11.5s
2.082.1 - 10.5s
2.083.0 - 9.9s
master  - 10.8s

Not quite the gains hoped for, particularly with those last large 
files you just converted to D seemingly slowing compilation down, 
but maybe it will get better with refactoring, and when the 
entire backend is compiled at once rather than with the separate 
compilation DMD uses now.


The more immediate benefit is to get rid of all the parallel .h 
files, which were a constant source of bugs when they didn't 
match the .d versions.


I was going to ask why you wouldn't need those headers for your 
C/C++ compiler, DMC, but it looks like you've translated that to 
mostly D already:


https://github.com/DigitalMars/Compiler/tree/master/dm/src/dmc


Re: Lost in Translation: Encapsulation

2018-11-06 Thread Joakim via Digitalmars-d-announce

On Tuesday, 6 November 2018 at 15:14:55 UTC, Mike Parker wrote:
Last week, inspired by another discussion in these forums about 
D's private-to-the-module form of encapsulation, I spent a few 
hours putting a new article together for the blog. Ali, Joakim, 
Nicholas helped me get it in shape.


The blog:
https://dlang.org/blog/2018/11/06/lost-in-translation-encapsulation/

Reddit:
https://www.reddit.com/r/programming/comments/9up2yo/lost_in_translation_encapsulation_in_d/


Nicely done; I think this could do well on proggit/HN/lobste.rs.


Re: LDC 1.13.0-beta1

2018-11-04 Thread Joakim via Digitalmars-d-announce

On Friday, 2 November 2018 at 21:04:13 UTC, kinke wrote:

Glad to announce the first beta for LDC 1.13:

* Based on D 2.083.0.
* The Windows packages are now fully self-sufficient, i.e., a 
Visual Studio/C++ Build Tools installation isn't required 
anymore.

* Substantial debug info improvements for GDB.

Full release log and downloads: 
https://github.com/ldc-developers/ldc/releases/tag/v1.13.0-beta1


Thanks to all contributors!


I've added native Termux builds for Android, including x86 for 
the first time.


Cross-compiling to Android/x64 mostly works, but LDC itself 
segfaults when cross-compiled
and run on Android/x64, likely because it uses a 128-bit real 
just like AArch64. I'll see if I can get that fixed before the 
final 1.13 release.


Re: smile.amazon.com Promotion

2018-10-31 Thread Joakim via Digitalmars-d-announce

On Thursday, 1 November 2018 at 03:18:44 UTC, SealabJaster wrote:

On Monday, 29 October 2018 at 16:40:20 UTC, FooledDonor wrote:

On Monday, 29 October 2018 at 16:01:38 UTC, Mike Parker wrote:
One of the easiest ways to support the D Language Foundation 
is using smile.amazon.com when you make a purchase. Until Nov 
2, they're running a special where they're donating 5% (10 
times the usual amount) of what you buy through AmazonSmile.


smile.amazon.com/ch/47-5352856


Perhaps a fundamental principle is not clear enough at the 
foundation: transparency.


Where is the vision of the third and fourth quarter? Where are 
the deliveries of things in the pipeline? What is the progress 
of the various jobs started?


Which people is it funding, with how much money, and for what 
expected results?
Where is the newCTFE? Was the work on this point financed by 
the foundation?


I've never seen a report on the state of affairs, neither from 
the president, nor from Andrei, nor from Walter.


How do you hope to obtain trust and funding, if NO one even 
deigns to give the least development plan or feedback on past 
developments?


It seems that everyone has locked themselves up in their ivory 
tower ...


It's kind of discouraging to see that your post, as well as 
another thread asking something similar regarding the vision 
document[1] have gone unanswered...


Maybe the people who could answer these things just don't see 
them, or maybe they're purposefully being quiet. It would be 
nice to know what's going on at the very least ;(


[1] 
https://forum.dlang.org/thread/qmwovarkjgvxyibsl...@forum.dlang.org


My guess, and this is purely a guess, is that they got 
discouraged by how few people paid attention to the Vision 
document or donated to the foundation on Opencollective and 
haven't bothered with this stuff since.


I think that's a mistake, as you may need to do this stuff for 
a while before it picks up. In any case, I don't care that it 
isn't happening, as I always said that it's better to have 
decentralized bounties, like we had on bountysource, rather than 
centralized funding through the D Foundation.


Maybe the upcoming targeted campaigns will be a good middle 
ground, in that you will be able to directly contribute to 
specific targets:


https://dlang.org/blog/2018/07/13/funding-code-d/


Re: Add D front-end, libphobos library, and D2 testsuite... to GCC

2018-10-30 Thread Joakim via Digitalmars-d-announce

On Monday, 29 October 2018 at 09:57:46 UTC, Walter Bright wrote:

On 10/28/2018 8:43 PM, Mike Parker wrote:
Congratulations are in order for Iain Buclaw. His efforts have 
been rewarded in a big way. Last Friday, he got the greenlight 
to move forward with submitting his changes into GCC:


Reddit: 
https://www.reddit.com/r/programming/comments/9sb74k/the_d_language_frontend_finally_merged_into_gcc_9/


HackerNews (at #12 on the front page):
https://news.ycombinator.com/news


On Lobsters too:

https://lobste.rs/s/9ziils/d_language_front_end_finally_merged_into


Re: New Initiative for Donations

2018-10-27 Thread Joakim via Digitalmars-d-announce

On Friday, 26 October 2018 at 17:20:08 UTC, Neia Neutuladh wrote:

On Fri, 26 Oct 2018 06:19:29 +, Joakim wrote:

On Friday, 26 October 2018 at 05:47:05 UTC, Neia Neutuladh 
wrote:

On Fri, 26 Oct 2018 02:38:08 +, Joakim wrote:
As with D, sometimes the new _is_ better, so perhaps you 
shouldn't assume old is better either.


There's no assuming going on. Cryptocurrencies are worse than 
credit cards for everything that normal people care about,


Such as? I already noted that they're easier and cheaper, you 
simply flatly state that "normal people" find them worse.


In most countries where people are going to donate to D, the 
vast majority of people have access to a credit card.


That's not really true, and that's not actually something "worse" 
about cryptocurrencies. If you really mean having one lying 
around, it is true that more people are using credit cards. If 
you actually mean access, crypto-currencies are pretty easy to 
buy these days.


If for some reason cryptocurrencies become popular and 
sufficiently stable to be used as currency, I have no doubt 
that existing credit card companies will start offering 
automatic currency exchange, so you can have an account in 
USD and pay a vendor who accepts only Ethereum, or vice 
versa. As such, accepting credit card payments is good enough.


I don't know what we'd be waiting for, the tokens I mentioned 
are all worth billions and widely used, particularly by 
techies:


Very few merchants accept any sort of cryptocurrency. I think 
I've found three. One was through a cryptocurrency forum, and 
one was Valve announcing that they would stop accepting it.


You must not have looked very hard, there are online retailers 
accepting crypto-tokens and websites that will make payments for 
you on Amazon or other sites through Bitcoin:


https://www.overstock.com/blockchain
https://purse.io/shop

Why would I wait for antiquated credit-card companies to 
accept these tokens? The whole point of these new tokens is to 
obsolete the credit card companies.


You wouldn't wait. You haven't waited. For you, the benefits 
are large enough and the downsides small enough that it doesn't 
make sense to wait. But I'm not you.


No, I'm not much of a cryptocurrency user or online shopper even. 
I mostly buy locally with cash.


I would wait because I've lost access to important credentials 
before and had to send a copy of my government-issued ID to a 
company to get them to deactivate two-factor authentication. 
I've had to use password reset mechanisms frequently. I don't 
trust myself not to lose access to a cryptocurrency private 
key. And that would destroy currency and lose me my life 
savings.


I don't blame you for being careful if you've had these problems, 
most of which I've never had, but you wildly exaggerate with your 
last sentence. Crypto-tokens are a replacement for cash and 
credit cards, which you should never be carrying around more than 
a couple hundred or thousand dollars worth of. If you're carrying 
around your life savings in cash or credit cards and are worried 
about moving them to bitcoin, you have much bigger problems. ;)


I would wait because I want a mechanism to dispute 
transactions. Maybe I authorized that transaction, but the 
merchant didn't deliver.


I don't think the payment provider is the right mechanism for 
that. The seller wants to protect their reputation and your 
payment is publicly verifiable through the blockchain. There are 
much better ways to build trust through those building blocks 
than the currently broken credit card chargeback process:


https://www.shopify.com/retail/what-is-a-chargeback

I would wait because I want an environmentally-friendly system 
instead of one that uses as much electricity as Afghanistan to 
process fifteen transactions per second.


Yes, I noted the Bitcoin "Proof of work" problem in this forum 
almost five years ago, so I'm well aware:


https://forum.dlang.org/post/xzuzvykrqouqlsjmk...@forum.dlang.org

There are "Proof of stake" crypto-tokens out there that purport 
to avoid that issue:


https://blockgeeks.com/guides/proof-of-work-vs-proof-of-stake/

Ether, one of the tokens I mentioned originally, is moving to 
this scheme.


I would wait because cryptocurrencies have extremely volatile 
exchange rates, which makes it difficult to set prices or store 
value in them.


If you're buying online, which is what we're talking about, it's 
trivially simple to track the exchange rates and instantaneously 
set store prices accordingly. It may be a bit different for 
consumers, but by the time they're all using some payments tech 
like this, the exchange rates will likely have settled down.
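As a sketch of that claim (the price and exchange rate below are invented for illustration, and a real shop would fetch the rate from an exchange API rather than hardcode it): recomputing a token price at checkout is one division.

```shell
#!/bin/sh
# Hypothetical price conversion: a fixed USD sticker price divided by
# a USD-per-token exchange rate gives the token amount to charge.
usd_price="25.00"
usd_per_token="6400"    # assumed exchange rate at checkout time
token_price=$(awk -v p="$usd_price" -v r="$usd_per_token" \
    'BEGIN { printf "%.8f", p / r }')
echo "charge: $token_price tokens for \$$usd_price"
```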


I would wait because I can't use cryptocurrency to do anything 
useful, so I would incur a fee to transfer money into it and 
another to transfer money out of it.


Not necessarily- it depends on who you're buying your tokens 
from- and crypto-tokens usually work out cheaper once you include 
other 

Re: New Initiative for Donations

2018-10-26 Thread Joakim via Digitalmars-d-announce

On Friday, 26 October 2018 at 05:47:05 UTC, Neia Neutuladh wrote:

On Fri, 26 Oct 2018 02:38:08 +, Joakim wrote:
As with D, sometimes the new _is_ better, so perhaps you 
shouldn't assume old is better either.


There's no assuming going on. Cryptocurrencies are worse than 
credit cards for everything that normal people care about,


Such as? I already noted that they're easier and cheaper, you 
simply flatly state that "normal people" find them worse.



and they're better than credit cards for illegal transactions.


Yes, just like cash, and have other benefits that come with cash 
too.



This might eventually change, and we can re-evaluate then.

If for some reason cryptocurrencies become popular and 
sufficiently stable to be used as currency, I have no doubt 
that existing credit card companies will start offering 
automatic currency exchange, so you can have an account in USD 
and pay a vendor who accepts only Ethereum, or vice versa. As 
such, accepting credit card payments is good enough.


I don't know what we'd be waiting for, the tokens I mentioned are 
all worth billions and widely used, particularly by techies:


https://coinmarketcap.com

Why would I wait for antiquated credit-card companies to accept 
these tokens? The whole point of these new tokens is to obsolete 
the credit card companies.


Re: New Initiative for Donations

2018-10-25 Thread Joakim via Digitalmars-d-announce
On Thursday, 25 October 2018 at 22:35:40 UTC, Nick Sabalausky 
wrote:

On Wednesday, 24 October 2018 at 10:25:17 UTC, Joakim wrote:
On Wednesday, 24 October 2018 at 10:18:51 UTC, Mike Parker 
wrote:

On Wednesday, 24 October 2018 at 10:12:50 UTC, Joakim wrote:



Any effort underway to take Bitcoin Cash, Ether, or Ripple 
as donations? The current payment options seem fairly 
antiquated: credit cards, wire transfers, and the like.


Not that I'm aware of. I'd hardly call credit cards 
antiquated, though :-)


60-year old tech seems pretty old to me:

https://en.m.wikipedia.org/wiki/Credit_card#BankAmericard_and_Master_Charge



And yet it's still by far the most common payment method. So 
what if it isn't trendy. Deal with it.


In the US maybe, not in most of the world, where they're still 
using cash. ;) I almost never use my cards, and I like that 
crypto-currencies have more in common with cash.


On Thursday, 25 October 2018 at 23:10:50 UTC, H. S. Teoh wrote:
On Thu, Oct 25, 2018 at 10:35:40PM +, Nick Sabalausky via 
Digitalmars-d-announce wrote:

On Wednesday, 24 October 2018 at 10:25:17 UTC, Joakim wrote:
> On Wednesday, 24 October 2018 at 10:18:51 UTC, Mike Parker 
> wrote:
> > On Wednesday, 24 October 2018 at 10:12:50 UTC, Joakim 
> > wrote:

[...]

> > > [...]
> > 
> > Not that I'm aware of. I'd hardly call credit cards 
> > antiquated, though :-)
> 
> 60-year old tech seems pretty old to me:
> 
> https://en.m.wikipedia.org/wiki/Credit_card#BankAmericard_and_Master_Charge
> 

And yet it's still by far the most common payment method. So 
what if it isn't trendy. Deal with it.


Common fallacy: new == better.


As with D, sometimes the new _is_ better, so perhaps you 
shouldn't assume old is better either.


Re: shared - i need it to be useful

2018-10-21 Thread Joakim via Digitalmars-d

On Monday, 22 October 2018 at 00:22:19 UTC, Manu wrote:
On Sun, Oct 21, 2018 at 2:35 PM Walter Bright via Digitalmars-d 
 wrote:


On 10/21/2018 2:08 PM, Walter Bright wrote:
> On 10/21/2018 12:20 PM, Nicholas Wilson wrote:
>> Yes, but the problem you describe is arises from implicit 
>> conversion in the other direction, which is not part of the 
>> proposal.

>
> It's Manu's example.

Then I don't know what the proposal is. Pieces of it appear to 
be scattered over numerous posts, mixed in with other text,


No no, they're repeated, not scattered, because I seem to have 
to keep repeating it over and over, because nobody is reading 
the text, or perhaps imagining there is a lot more text than 
there is.


I told you this is what happens with forum posts 4 days ago, yet 
you didn't listen:


https://forum.dlang.org/post/fokdcnzircoiuhrhz...@forum.dlang.org


opinions, and handwavy stuff.


You mean like every post in opposition which disregards the 
rules and baselessly asserts it's a terrible idea? :/



There's nothing to point to that is "the proposal".


You can go back to the OP, not a single detail is changed at 
any point, but I've repeated it a whole bunch of times 
(including in direct response to your last post) and the 
summary has become more concise, but not different.


1. Shared has no read or write access to data
2. Functions with shared arguments are threadsafe with respect 
to those arguments
  a. This is a commitment that must be true in _absolute terms_ 
(there exists discussion about the ways that neighbours must not 
undermine this promise)
  b. There are numerous examples demonstrating how to configure 
this (TL;DR: use encapsulation, and @trusted at the bottom of 
the stack)


If you can find a legitimate example where those rules don't 
hold, I want to see it.
But every example so far has been based on a faulty premise 
where those 2 simple rules were not actually applied.


Put it all together in a 2-3 page proposal elsewhere, so he 
doesn't have to hunt everything out in a blizzard of forum posts.


I responded to your faulty program directly with the correct 
program, and you haven't acknowledged it. Did you see it?


I suggest you and Manu write up a proper proposal. Something 
that is complete, has nothing else in it, has a rationale, 
illuminating examples, and explains why alternatives are 
inferior.


I have written this program a couple of times, including in 
direct response to your last sample program.
You seem to have dismissed it... where is your response to that
program, or my last entire post?


For examples of how to do it:

https://github.com/dlang/DIPs/tree/master/DIPs

Trying to rewrite the semantics of shared is not a simple 
task, doing multithreading correctly is a minefield of "OOPS! 
I didn't think of that!" and if anything cries out for a DIP, 
your and Manu's proposal does.


Yes, I agree it's DIP worthy. But given the almost nothing but 
overt hostility I've received here, why on earth would I waste 
my time writing a DIP?
I put months into my other DIP which sits gathering dust... if 
this thread inspired any confidence that it would be 
well-received I would make the effort, but the critical 
reception we've seen here is... a bit strange.
It's a 2-point proposal, the rules are **SO SIMPLE**, which is 
why I love it. How it can be misunderstood is something I'm 
having trouble understanding, and I don't know how to make it 
any clearer than I already have; numerous times over, including 
in my last reply to you, which you have ignored and dismissed 
it seems.

Please go back and read my response to your last program.


He did not say to write a full DIP, just a proposal, so he knows 
exactly what you mean, just as I said. It will require a DIP 
eventually, but he didn't ask you to write one now.


Re: Interesting Observation from JAXLondon

2018-10-20 Thread Joakim via Digitalmars-d
On Sunday, 21 October 2018 at 01:12:44 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 10/12/18 4:05 AM, Vijay Nayar wrote:
But the D community has also been very receptive of changes to 
the language




The community is. I don't feel like it's been true of the 
leadership for some years now (and I don't mean just W)


One thing that does concern me, is the avenues in which people 
can discover D.  For me personally, after a particularly nasty 
C++ project, I just googled for "alternatives to C++" and 
that's how I found D back in 2009 or so.  But the same search 
today turns up nothing about D.  I'm not sure sure how people 
are supposed to find D.


This is a VERY important thing, and it's true for many of us 
(myself included). This is why it was a HUGE mistake when the 
community decided it should become taboo to promote D as a 
redesigned C++. That was ALWAYS D's core strength, we all know 
it, that's why many (if not most) of us are here, and hell, 
that's literally what D was *intentionally designed* to be.


But then political correctness came and threw that angle out 
the window, in favor of this awkward "fast code fast" nonsense, 
and we've been fighting the uphill "I don't understand the 
point of D" image battle ever since.


Simple, C++ is increasingly seen as irrelevant by those choosing 
a new language, so D's real competition is now Go, Rust, Swift, 
Nim, Zig, etc. These are people who want to write "fast code 
fast," well except for Rust users, who value ownership more.


Also, D can pitch itself to Java/C# users who need more 
performance with that softer pitch, because many of them have 
been burned by C++ and would recoil if you made the explicit C++ 
comparison. It is well-known that Rust and Go are attracting 
users from the Java and scripting communities, D needs to attract 
them too, as the Tilix dev noted to me last year:


"[M]y background is in Java. I found it quite interesting at 
DConf when I asked how many people came from a non C/C++ 
background that only one other fellow raised his hand...


I tend to get more annoyed about the negativity in the forums 
with regards to GC. I do feel that sometimes people get so 
wrapped up in what D needs for it to be a perfect systems 
language (i.e. no GC, memory safety, etc.), it gets overlooked 
that it is a very good language for building native applications 
as it is now. While D is often compared to Rust, in some ways the 
comparison to Go is more interesting to me. Both are GC-based 
languages and both started as systems languages, however Go 
pivoted and doubled down on the GC and has seen success. One of 
the Red Hat products I support, OpenShift, leverages Kubernetes 
(a Google project) for container orchestration and it’s written 
in Go.


I think D as a language is far superior to Go, and I wish we 
would toot our horn a little more in this regard instead of the 
constant negative discussion around systems programming."

https://dlang.org/blog/2017/08/11/on-tilix-and-d-an-interview-with-gerald-nunn/


Re: Need help with setting up LDC to cross-compile to Android/ARM

2018-10-20 Thread Joakim via Digitalmars-d

On Friday, 19 October 2018 at 22:19:31 UTC, H. S. Teoh wrote:
On Fri, Oct 19, 2018 at 02:41:48PM -0700, H. S. Teoh via 
Digitalmars-d wrote: [...]
In the meantime, is there a particular version of the NDK that 
I should use?  Currently I have 
android-ndk-r13b-linux-x86_64.zip installed.  Will it work?

[...]

Haha, I feel so silly now.  NDK r13b does not seem to have the 
sysroot subdir required by the clang build command, that's why 
it couldn't find the system headers.  So I ditched r13b and 
installed r17b instead, and now I can build the runtime 
successfully!


Ah, that makes sense, that NDK is ancient, ;) it came out two 
years ago:


https://developer.android.com/ndk/downloads/revision_history

Official D support for Android was added to ldc 1.4 last 
September, which was after NDK r15c came out, when they switched 
to that sysroot directory with unified headers for all Android 
versions, so that's what ldc uses. Before that, each Android 
version had its headers in a separate directory, which isn't 
supported by LDC.



I tried ldc-build-runtime with --ninja and it
came back with a bunch of errors about "cortex-a8"
being an unsupported target, and then segfaulted.


That's likely because you left off the double-quotes around the 
list of semicolon-separated flags passed to ldc-build-runtime 
--dFlags: the double quotes are required, as shown in the docs.
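To make the quoting issue concrete, here is a small shell demo (the flag values are illustrative, not the exact ones from the failing command): without the quotes the shell treats each `;` as a command separator, so only the first flag ever reaches the tool.

```shell
#!/bin/sh
# Minimal demo of why --dFlags="a;b;c" needs the quotes: the shell
# otherwise splits the command line at each unquoted semicolon.
count_args() {
    echo "$#"
}

# Quoted: the whole semicolon-separated flag list arrives as ONE argument.
quoted=$(count_args "--dFlags=-mtriple=armv7a--linux-android;-mcpu=cortex-a8")
echo "quoted form passes $quoted argument"

# Unquoted, the same text would be cut at ';', and the shell would then
# try to run `-mcpu=cortex-a8` as a separate command, which is exactly
# the kind of "unsupported target" error and crash reported above.
```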


On Saturday, 20 October 2018 at 04:01:37 UTC, H. S. Teoh wrote:
On Fri, Oct 19, 2018 at 08:50:36PM +0000, Joakim via 
Digitalmars-d wrote:
On Wednesday, 17 October 2018 at 21:23:21 UTC, H. S. Teoh 
wrote:

> I'm trying to follow the instructions on this page:
> 
> 	https://wiki.dlang.org/Build_D_for_Android

[...]

On a side note, the last section on that page mentions not 
knowing how to create a keystore from scratch; actually, it's 
very simple with the `keytool` utility that comes with the 
Oracle JRE.  I added the steps on the talk page.  The only 
thing I'm unsure about is whether keytool is available natively 
on Android.  If not, then you'll have to generate the keystore 
on a PC and copy it over to Android afterwards.


From scratch meaning without using keytool, i.e. OpenSSL or some 
other hashing/fingerprinting tool alone, because keytool isn't 
available in the Termux app. As mentioned at the end of the wiki 
page, I used to manually edit the apk hashed manifests using 
OpenSSL alone until that apksigner tool was added later.


Re: Need help with setting up LDC to cross-compile to Android/ARM

2018-10-19 Thread Joakim via Digitalmars-d

On Friday, 19 October 2018 at 20:50:36 UTC, Joakim wrote:

On Wednesday, 17 October 2018 at 21:23:21 UTC, H. S. Teoh wrote:

I'm trying to follow the instructions on this page:

https://wiki.dlang.org/Build_D_for_Android

[...]


Hmm, that's weird: can you extract the full compiler command 
for that file? For example, if you use Ninja, by appending 
--ninja to ldc-build-runtime, it will tell you the full command 
that failed. Not sure if Make has a way to get that too.


Also, if you're using a system-provided LDC, it may not support 
Android, if it wasn't built against our slightly tweaked llvm:


https://github.com/ldc-developers/llvm

In that case, use the LDC download from github instead:

https://github.com/ldc-developers/ldc/releases


Re: Need help with setting up LDC to cross-compile to Android/ARM

2018-10-19 Thread Joakim via Digitalmars-d

On Wednesday, 17 October 2018 at 21:23:21 UTC, H. S. Teoh wrote:

I'm trying to follow the instructions on this page:

https://wiki.dlang.org/Build_D_for_Android

[...]


Hmm, that's weird: can you extract the full compiler command for 
that file? For example, if you use Ninja, by appending --ninja to 
ldc-build-runtime, it will tell you the full command that failed. 
Not sure if Make has a way to get that too.


[OT] Android

2018-10-19 Thread Joakim via Digitalmars-d

On Thursday, 18 October 2018 at 19:37:24 UTC, H. S. Teoh wrote:
On Thu, Oct 18, 2018 at 07:09:42PM +, Patrick Schluter via 
Digitalmars-d wrote: [...]
I often have the impression that a lot of things are going 
slower than necessary because a mentality where the perfect is 
in the way of good.


That is indeed an all-too-frequent malady around these parts, 
sad to say. Which has the sad consequence that despite all 
efforts, there are still unfinished areas in D, and promises 
that haven't materialized in years (like multiple alias this).


Still, the parts of D that are working well form a very 
powerful and comfortable-to-use language.  Not quite the ideal 
we wish it to be, but IMO much closer than any other language 
I've seen yet.  Recently I began dabbling in Android 
programming, and the one thing that keeps sticking out to me is 
how painful writing Java is.  Almost every day of writing Java 
code has me wishing for this or that feature in D.  Slices. 
Closures.  Meta-programming.  I found most of my time spent 
fighting with language limitations rather than make progress 
with the problem domain.


Yes, this is why I began the Android port: I couldn't imagine 
writing Java.


Eventually I resorted to generating Java code from D for some 
of the most painful repetitive parts, and the way things are 
looking, I'm likely to be doing a lot more of that.  I fear the 
way things are going, I'll essentially be writing a D-to-Java 
compiler at some point!


Why not just use the Android port of D?


Re: Interfacing D with C: Arrays Part 1

2018-10-18 Thread Joakim via Digitalmars-d-announce

On Wednesday, 17 October 2018 at 15:20:08 UTC, Mike Parker wrote:
I had intended to publish the next GC series post early this 
month, but after many revisions and discussions with a couple 
of reviewers, I've decided to put it on hold until something 
gets worked out about the conflation of destruction and 
finalization in D (something I'll be pushing for soon).


[...]


"article is has morphed"



Re: shared - i need it to be useful

2018-10-17 Thread Joakim via Digitalmars-d

On Wednesday, 17 October 2018 at 23:12:48 UTC, Manu wrote:
On Wed, Oct 17, 2018 at 2:15 PM Stanislav Blinov via 
Digitalmars-d  wrote:


On Wednesday, 17 October 2018 at 19:25:33 UTC, Manu wrote:
> On Wed, Oct 17, 2018 at 12:05 PM Stanislav Blinov via 
> Digitalmars-d  wrote:

>>
>> On Wednesday, 17 October 2018 at 18:46:18 UTC, Manu wrote:
>>
>> > I've said this a bunch of times, there are 2 rules:
>> > 1. shared inhibits read and write access to members
>> > 2. `shared` methods must be threadsafe
>> >
>> >>From there, shared becomes interesting and useful.
>>
>> Oh God...
>>
>> void atomicInc(shared int* i) { /* ... */ }
>>
>> Now what? There are no "methods" for ints, only UFCS. Those 
>> functions can be as safe as you like, but if you allow 
>> implicit promotion of int* to shared int*, you *allow 
>> implicit races*.

>
> This function is effectively an intrinsic. It's unsafe by 
> definition.


Only if implicit conversion is allowed. If it isn't, that's 
likely @trusted, and this:


void atomicInc(ref shared int);

can even be @safe.


In this case, with respect to the context (a single int) 
atomicInc() is ALWAYS safe, even with implicit conversion. You 
can atomicInc() a thread-local int perfectly safely.


> It's a tool for implementing threadsafe machinery.
> No user can just start doing atomic operations on random ints
> and say
> "it's threadsafe", you must encapsulate the threadsafe
> functionality
> into some sort of object that aggregates all concerns and
> presents an
> intellectually sound api.

Threadsafety starts and ends with the programmer. By your 
logic *all* functions operating on `shared` are unsafe then. 
As far as compiler is concerned, there would be no difference 
between these two:


struct S {}
void atomicInc(ref shared S);

and

struct S { void atomicInc() shared { /* ... */ } }

The signatures of those two functions are exactly the same. 
How is that different from a function taking a shared int 
pointer or reference?


It's not, atomicInc() of an int is always safe with respect to 
the int itself.
You can call atomicInc() on an unshared int and it's perfectly 
fine, but now you need to consider context, and that's a 
problem for the design of the higher-level scope.

To maintain thread-safety, the int in question must be 
appropriately contained.


The problem is that it's the same as the example I presented 
before, which I'll repeat:


struct InvalidProgram
{
  int x;
  void fun() { ++x; }
  void gun() shared { atomicInc(); }
}

The method gun() (and therefore the whole object) is NOT 
threadsafe by my definition, because fun() violates the 
threadsafety of gun().
The situation applies equally here that:

int x;
atomicInc();
++x; // <- by my definition, this 'API' (increment an int) 
violates the threadsafety of atomicInc(), and atomicInc() is 
therefore not threadsafe.

`int` doesn't present a threadsafe API, so int is by 
definition, NOT threadsafe. atomicInc() should be @system, and 
not @trusted.


If you intend to share an int, use Atomic!int, because it has a 
threadsafe API.
atomicInc(shared int*) is effectively just an unsafe intrinsic, 
and its only use is at ground-level implementation of threadsafe
machinery, like malloc() and free().


> Let me try one:
>
> void free(void*) { ... }
>
> Now what? I might have dangling pointers... it's a 
> catastrophe!


One could argue that it should be void free(ref void* p) { /* 
... */ p = null; }


void *p2 = p;
free(p);
p2.crash();

As a matter of fact, in my own allocators memory blocks 
allocated by them are passed by value and are non-copyable, 
they're not just void[] as in std.experimental.allocator. One 
must 'move' them to pass ownership, and that includes 
deallocation. But that's another story altogether.


Right, now you're talking about move semantics to implement 
transfer of ownership... you might recall I was arguing this 
exact case to express transferring ownership of objects between 
threads earlier. This talk of blunt casts and "making sure 
everything is good" is all just great, but it doesn't mean 
anything interesting with respect to `shared`. It should be 
interesting even without unsafe casts.



> It's essentially the same argument.
> This isn't a function that professes to do something that
> people might
> misunderstand and try to use in an unsafe way, it's a 
> low-level

> implementation device, which is used to build larger *useful*
> constructs.

You're missing the point, again. You have an int. You pass a 
pointer to it to some API that takes an int*. You continue to 
use your int as just an int.


You have written an invalid program. I can think of an infinite 
number of ways to write an invalid program.
In this case, don't have an `int`, instead, have an Atomic!int; 
you now guarantee appropriate access, problem solved!
If you do have an int, don't pass it to other threads at random 
when you don't have any idea what they intend to do with it! 
That's basic common sense. You don't 

Re: Interesting Observation from JAXLondon

2018-10-12 Thread Joakim via Digitalmars-d

On Friday, 12 October 2018 at 07:13:33 UTC, Russel Winder wrote:
On Thu, 2018-10-11 at 13:00 +, bachmeier via Digitalmars-d 
wrote: […]

Suggestions?

My guess is that the reason they've heard of those languages 
is because their developers were writing small projects using 
Go and Rust, but not D.


I fear it may already be too late. Go, and now Rust, got 
marketing hype from an organisation putting considerable 
resources into the projects. This turned into effort from the 
community that increased rapidly, turning the hype into 
frameworks and libraries, and word of mouth marketing. It is 
the libraries and frameworks that make for traction. Now the 
hype is gone, Go and Rust, and their libraries and frameworks, 
are well positioned and with significant penetration into the 
minds of developers.


Talk to Java developers and they have heard of Go and Rust, but 
not D. Go is more familiar to them because of Docker and the 
context of the Web, for which Go has a strong pitch. They have 
heard of Rust but usually see it as not relevant to them, 
despite Firefox.

Talk to Python developers and they know of Go, many of them of 
Rust, but almost never D. C and C++ are seen as the languages 
of performance extensions, though Rust increasingly has a play 
there.

D has vibe.d, PyD, GtkD, and lots of other bits, but they've 
never quite had the resources of the equivalents in Go and Rust.


Also the D community as a whole is effectively introvert, 
whereas the Go and Rust communities have been quite extrovert. 
"Build it and they will come" just doesn't work; you have to be 
pushy and market stuff, often using guerrilla marketing, to get 
mindshare.


D has an excellent position against Python (for speed of 
development but without the performance hit) but no chance of 
penetrating the places where Python is strong due to lack of 
the libraries and frameworks that people use – cf. Pandas, 
scikit-learn, etc.


D has an excellent position against Go as a language except 
that Go has goroutines and channels. The single-threaded event 
loop and callback approach is losing favour. Kotlin is 
introducing Kotlin Coroutines, which is a step on from the 
observables system of Rx: structured concurrency abstracting 
away from fibres and threadpools. Java may well get this via 
Project Loom, which is Quasar being inserted into the JVM 
directly. Whatever D has, it doesn't seem to be going to 
compete in this space.
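For comparison, D's standard library does offer inter-thread message passing via `std.concurrency`, though as spawned threads with mailboxes rather than Go-style multiplexed goroutines over channels. A minimal sketch:

```d
// Message passing with std.concurrency: each thread has a
// mailbox; send/receiveOnly move typed messages between them.
import std.concurrency;
import std.stdio;

void worker()
{
    // Receive ints from the owner; a negative value means stop.
    for (;;)
    {
        immutable n = receiveOnly!int();
        if (n < 0) break;
        ownerTid.send(n * n);
    }
}

void main()
{
    auto tid = spawn(&worker);
    tid.send(7);
    writeln(receiveOnly!int()); // prints 49
    tid.send(-1); // tell the worker to exit
}
```

This covers the basic producer/consumer pattern, but it is per-thread mailboxes rather than the cheap, multiplexed fibres-over-threads model that goroutines provide, which is the gap being pointed out.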


D without the GC has a sort of position against Rust, but I 
think that battle has been lost. Rust has won in the "the new C 
that isn't Go and doesn't have a garbage collector, and isn't 
C++, but does have all the nice monads stuff, oh and memory 
safety mostly" niche.


When it comes down to it D will carry on as a niche language 
loved by a few unknown to most.


There is truth in much of what you say, but D has to pick its 
battles. Given the design of the language, I see two primary 
use-cases right now:


1. apps that need some level of performance, ie Tilix
2. Low-level tools that need a lot of performance, ie Weka or 
Sociomantic


Going after some established tool like Pandas and its mix of 
Python and C is likely to fail right now, as D is never going to 
be as easy as Python, and presumably Pandas has already sped up 
whatever it needs to in C. Maybe you could create a better tool 
in D some day when the D ecosystem is larger, but I don't think 
it would be the best approach today.


We need to think about what markets D would be best suited for 
and aim for those, while at the same time resisting the 
temptation to make D too specialized for those initial markets, 
which is a trap many other languages fall into.


Re: Interesting Observation from JAXLondon

2018-10-11 Thread Joakim via Digitalmars-d

On Thursday, 11 October 2018 at 12:22:19 UTC, Vijay Nayar wrote:

On Thursday, 11 October 2018 at 11:50:39 UTC, Joakim wrote:
On Thursday, 11 October 2018 at 07:58:39 UTC, Russel Winder 
wrote:

This was supposed to come to this list not the learn list.

On Thu, 2018-10-11 at 07:57 +0100, Russel Winder wrote:
It seems that in the modern world of Cloud and Kubernetes, and 
the charging model of the Cloud vendors, the startup times of 
JVMs are becoming a financial problem. A number of high-profile 
companies are switching from Java to Go to solve this financial 
difficulty.

It's a pity D is not in there with a pitch.

I suspect it is because the companies have heard of Go (and 
Rust), but not D.


I doubt D could make a pitch that would be heard, no Google 
behind it and all that jazz. D is better aimed at startups 
like Weka who're trying to disrupt the status quo than Java 
shops trying to sustain it, while shaving off some up-front 
time.


Personally I think this is going to change soon depending on 
what options are available.  The amount of time and money that 
companies, especially companies using Java and AWS, are putting 
into saving money with Nomad or Kubernetes on the promise of 
having more services per server is quite high.  However, these 
JVM-based services run in maybe 1-2GB of RAM at the minimum, so 
they get maybe 4 services per box.


A microservice built using D and vibe.d could easily perform 
the same work using less CPU and maybe only 500MB of RAM.  The 
scale of improvement is roughly the same as what you would get 
by moving to containerization.
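A minimal vibe.d HTTP service of the kind being described looks roughly like this (a sketch only; exact API details vary across vibe.d versions, and the port and handler here are illustrative):

```d
// Tiny vibe.d HTTP service; fiber-based event loop gives a
// much smaller memory footprint than a typical JVM service.
import vibe.vibe;

void handle(HTTPServerRequest req, HTTPServerResponse res)
{
    // A trivial endpoint; a real microservice would route
    // requests and talk to a datastore here.
    res.writeBody("hello from a small D service\n");
}

void main()
{
    auto settings = new HTTPServerSettings;
    settings.port = 8080;
    listenHTTP(settings, &handle);
    runApplication();
}
```

The whole process typically runs in tens of megabytes of resident memory, which is the scale of improvement over a 1-2GB JVM service that the comparison above is pointing at.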


If D has the proper libraries and integrations available with 
the tools that are commonly used, it could easily break through 
and become the serious language to use for the competitive 
business of the future.


But libraries and integrations will make or break that.  It's 
not just Java you're up against, it's all the libraries like 
Spring Boot and all the integrations with AWS systems like SQS, 
SNS, Kinesis, MySQL, Postgres, Redis, etc.


My hope is that D will be part of that future and I'm trying to 
add libraries as time permits.


I'm skeptical of that cloud microservices wave building up right 
now. I suspect what's coming is a decentralized mobile wave, just 
as the PC once replaced big iron like mainframes and 
minicomputers, since top mobile CPUs now rival desktop CPUs:


"the [Apple] A12 outperforms a moderately-clocked Skylake CPU in 
single-threaded performance"

https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-unveiling-the-silicon-secrets/4

Many of the crypto-coins are trying to jumpstart a decentralized 
app ecosystem: someone will succeed.


Re: Interesting Observation from JAXLondon

2018-10-11 Thread Joakim via Digitalmars-d

On Thursday, 11 October 2018 at 07:58:39 UTC, Russel Winder wrote:

This was supposed to come to this list not the learn list.

On Thu, 2018-10-11 at 07:57 +0100, Russel Winder wrote:
It seems that in the modern world of Cloud and Kubernetes, and 
the charging model of the Cloud vendors, the startup times of 
JVMs are becoming a financial problem. A number of high-profile 
companies are switching from Java to Go to solve this financial 
difficulty.

It's a pity D is not in there with a pitch.

I suspect it is because the companies have heard of Go (and 
Rust), but not D.


I doubt D could make a pitch that would be heard, no Google 
behind it and all that jazz. D is better aimed at startups like 
Weka who're trying to disrupt the status quo than Java shops 
trying to sustain it, while shaving off some up-front time.


Re: Iain Buclaw at GNU Tools Cauldron 2018

2018-10-07 Thread Joakim via Digitalmars-d-announce

On Sunday, 7 October 2018 at 15:41:43 UTC, greentea wrote:

Date: September 7 to 9, 2018.
Location: Manchester, UK

GDC - D front-end GCC

https://www.youtube.com/watch?v=iXRJJ_lrSxE


Thanks for the link, just watched the whole video. The first 
half-hour sets the standard as an intro to the language, as only 
a compiler developer other than the main implementer could give, 
ie someone with fresh eyes.


I loved that Iain started off with a list of real-world projects. 
That's a mistake a lot of tech talks make, ie not motivating 
_why_ anybody should care about their tech and simply diving into 
the tech itself. I hadn't heard some of that info either, great 
way to begin.


My only nitpick is that I wish he'd emphasized how much of a 
focus D puts on metaprogramming, as I've noticed a lot of 
comments on proggit/HN/etc. saying that the power and ease of use 
of D's metaprogramming really stood out for them when trying the 
language.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-04 Thread Joakim via Digitalmars-d

On Thursday, 4 October 2018 at 10:02:28 UTC, Russel Winder wrote:
On Thu, 2018-10-04 at 08:06 +, Joakim via Digitalmars-d 
wrote:

[…]

The link in my OP links to a guy who maintained a spreadsheet 
of Apple-related conferences as evidence. He lists several 
that went away and says nothing replaced them. If you don't 
even examine the evidence provided, I'm not sure why we should 
care about your opinions.


So Apple conferences are a dead end.


Remember though that this is the top developer ecosystem on the 
planet right now, as iOS apps bring in more revenue than Android 
still.


Python, C++, Go, Rust, all these languages have thriving 
conferences. You just have to look at the world-wide increase 
in the number of such conferences for the data required.


But then my opinion, and indeed my data, doesn't seem to matter 
to you, so we might as well just stop communicating since you 
are never going to change your mind about this issue, even 
though you are actually wrong.


I've presented evidence in a handy link, you give none.

I could be wrong about anything, including that the Earth is 
round and I'm not in the Matrix. But to convince me that I am, 
I'll need evidence, same as I've presented to you.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-04 Thread Joakim via Digitalmars-d

On Thursday, 4 October 2018 at 08:54:29 UTC, Iain Buclaw wrote:

On Thursday, 4 October 2018 at 08:06:24 UTC, Joakim wrote:

On Thursday, 4 October 2018 at 07:53:54 UTC, Iain Buclaw wrote:

[...]


Did anybody pay attention to the live talks either? ;) That's 
the real comparison.


Anyway, the reason I'm giving to prerecord talks is so you can 
watch them on your own time before the conference. Watching 
prerecorded talks with everybody else at a conference is 
layering stupid on top of stupid. :D


Sure, but you really think it's an appropriate use of my free 
time spending 22 hours (which may as well be half a month) 
watching prerecorded talks instead of contributing?


That's a strange question: do you prefer being forced to sit 
through all 22 hours live at DConf? At least with pre-recorded 
talks, you have a choice of which ones to watch.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-04 Thread Joakim via Digitalmars-d

On Thursday, 4 October 2018 at 07:12:03 UTC, Russel Winder wrote:
On Wed, 2018-10-03 at 18:46 +, Joakim via Digitalmars-d 
wrote:

[…]

I don't doubt that some are like you and prefer viewing live, 
but given how conferences keep dying off and online tech talks 
are booming, you're in an extreme minority that prefers that 
high-cost live version. That means the market inevitably stops 
catering to you, which is why the talk-driven conference 
format is dying off.


And new conferences keep being started and being successful. 
And many just keep on going, often getting more and more 
successful.


Your personal view of conferences cannot be stated as global 
truth, since it patently is not fact, and the evidence 
indicates it is not true; it is just your opinion.


The link in my OP links to a guy who maintained a spreadsheet of 
Apple-related conferences as evidence. He lists several that went 
away and says nothing replaced them. If you don't even examine 
the evidence provided, I'm not sure why we should care about your 
opinions.


On Thursday, 4 October 2018 at 07:53:54 UTC, Iain Buclaw wrote:

On Wednesday, 3 October 2018 at 16:17:48 UTC, Joakim wrote:
On Wednesday, 3 October 2018 at 01:28:37 UTC, Adam Wilson 
wrote:

On 10/2/18 4:34 AM, Joakim wrote:
On Tuesday, 2 October 2018 at 09:39:14 UTC, Adam Wilson 
wrote:

[...]


It is not clear what you disagree with, since almost nothing 
you say has any bearing on my original post. To summarize, I 
suggest changing the currently talk-driven DConf format to 
either


1. a more decentralized collection of meetups all over the 
world, where most of the talks are pre-recorded, and the 
focus is more on introducing new users to the language or


2. at least ditching most of the talks at a DConf still held 
at a central location, maybe keeping only a couple panel 
discussions that benefit from an audience to ask questions, 
and spending most of the time like the hackathon at the last 
DConf, ie actually meeting in person.




This point has a subtle flaw. Many of the talks raise points 
of discussion that would otherwise go without discussion, and 
potentially unnoticed, if it were not for the person bringing 
it up. The talks routinely serve as a launchpad for the 
nightly dinner sessions. Benjamin Thaut's 2016 talk about 
shared libraries is one such example. Indeed every single 
year has brought at least one (but usually more) talk that 
opened up some new line of investigation for the dinner 
discussions.


I thought it was pretty obvious from my original post, since I 
volunteered to help with the pre-recorded talks, but the idea 
is to have pre-recorded talks no matter whether DConf is held 
in a central location or not.




I went to a conference once where they had mixed live talks and 
prerecorded talks - questions were taken at the end to the 
speaker of the prerecorded talk via a SIP call.


The organisers at the end admitted that the prerecorded talks 
experiment failed. No one really paid attention to any of the 
content in it.


Did anybody pay attention to the live talks either? ;) That's the 
real comparison.


Anyway, the reason I'm giving to prerecord talks is so you can 
watch them on your own time before the conference. Watching 
prerecorded talks with everybody else at a conference is layering 
stupid on top of stupid. :D


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 17:51:00 UTC, Russel Winder wrote:
On Wed, 2018-10-03 at 17:26 +, Joakim via Digitalmars-d 
wrote: […]
At least look at the first two bullet points in my post 
responding to Adam, because you're missing the entire point of 
my suggestions, which is that certain things like talks are 
better suited to online whereas conferences are more suited 
for in-person interaction.


In your opinion. In my opinion, online material is a waste of 
time, I never watch YouTube videos, for me it is a waste of my 
time. But that is the point, different people have a different 
view. This doesn't mean I am right or wrong, it means different 
people have different ways of dealing with material.


I like a live presentation that I can then ignore *or* take up 
with gusto with the presenter, or other people, after the 
session. Conferences allow this. Presentations are an 
introduction to interaction with others. For me. Others prefer 
to consume videos and have no interactions about the material. 
Personal differences.


Except that you can also view the videos at home, then discuss 
them later at a conference, which is the actual suggestion here.


Since there is a population of people who like online stuff, 
then online stuff there must be. As there are people who like a 
live presentation and post session discussion, this must also 
happen. The two are not in conflict.


They are in conflict because the cost of doing it live is much, 
much higher. DConf organizers' goal should be to enable the 
widest reach at the lowest cost, not catering to off-the-wall 
requests from a select few like yourself.


I don't doubt that some are like you and prefer viewing live, but 
given how conferences keep dying off and online tech talks are 
booming, you're in an extreme minority that prefers that 
high-cost live version. That means the market inevitably stops 
catering to you, which is why the talk-driven conference format 
is dying off.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 17:13:51 UTC, Dejan Lekic wrote:

On Wednesday, 3 October 2018 at 16:21:45 UTC, Joakim wrote:
Like most of the responses in this thread, I have no idea why 
you're stumping for in-person interaction, when all my 
suggestions were geared around having _more in-person 
interaction_.


If you're still not sure what I mean, read this long post I 
wrote fisking Adam's similar post:


https://forum.dlang.org/post/eoygemytghynpogvl...@forum.dlang.org


Perhaps you did not get my point?


No, I got it, you didn't get mine.

- I have nothing against core D team having web-conferences as 
much as they please. It is up to them (and they may already 
have them?) how they want to communicate.


What I argued about was that, just because some antisocial geek 
argues that conferences are "dead" because we have 
web-conferencing and similar means of communication does not 
mean we all share that opinion... Everyone can record a "talk" 
with slides and put it on some video streaming site like Vimeo 
or YouTube, but I personally see that as ANOTHER way to reach 
the community, certainly NOT an alternative to a well-organised 
conference!


Do not get me wrong, I have nothing against the proposal - I 
think the D community can have both: a good annual conference, 
AND web-conferencing between core D devs, and people who would 
record talks in their rooms or offices and make them public...


While my OP did mention some of those things, it only did so as a 
way to have _more in-person interaction_ at the two DConf 
alternative formats I suggested, neither of which was primarily 
about any of the stuff you mention.


At least look at the first two bullet points in my post 
responding to Adam, because you're missing the entire point of my 
suggestions, which is that certain things like talks are better 
suited to online whereas conferences are more suited for 
in-person interaction.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 11:48:06 UTC, Dejan Lekic wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:
I'm sure some thought and planning is now going into the next 
DConf, so I'd like to make sure people are aware that the 
conference format that DConf uses is dying off, as explained 
here:


https://marco.org/2018/01/17/end-of-conference-era


It is a matter of personal preference, and a view of a 
modern-day geek, in my humble opinion... I _highly disagree_. 
People go to conferences for different reasons. You know, even 
though we "computer people" tend to be branded as antisocial, 
there are still many of us who prefer to see someone in person, 
talk to him/her, meet new people, speak to them too, build the 
network, exchange phone numbers, etc...


As usual with conferences, not all people are happy - you will 
ALWAYS have people who prefer more technical stuff, and people 
who prefer the more business side - people who try to promote 
their products and services. Conferences are brilliant places 
for them.


Another group of people interested in conferences and meetups 
are recruiters. My company found a few new colleagues this way...


Yet another group are people who also want to see the town 
where the conference is held - it is a form of tourism if you 
like.


Yes, you can have all that interaction with some 
internet-conferencing software, but not at the level where 
people interact with each other directly!


Like most of the responses in this thread, I have no idea why 
you're stumping for in-person interaction, when all my 
suggestions were geared around having _more in-person 
interaction_.


If you're still not sure what I mean, read this long post I wrote 
fisking Adam's similar post:


https://forum.dlang.org/post/eoygemytghynpogvl...@forum.dlang.org


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 01:28:37 UTC, Adam Wilson wrote:

On 10/2/18 4:34 AM, Joakim wrote:

On Tuesday, 2 October 2018 at 09:39:14 UTC, Adam Wilson wrote:

On 10/1/18 11:26 PM, Joakim wrote:

[snip]


I disagree.


It is not clear what you disagree with, since almost nothing 
you say has any bearing on my original post. To summarize, I 
suggest changing the currently talk-driven DConf format to 
either


1. a more decentralized collection of meetups all over the 
world, where most of the talks are pre-recorded, and the focus 
is more on introducing new users to the language or


2. at least ditching most of the talks at a DConf still held 
at a central location, maybe keeping only a couple panel 
discussions that benefit from an audience to ask questions, 
and spending most of the time like the hackathon at the last 
DConf, ie actually meeting in person.




This point has a subtle flaw. Many of the talks raise points of 
discussion that would otherwise go without discussion, and 
potentially unnoticed, if it were not for the person bringing 
it up. The talks routinely serve as a launchpad for the nightly 
dinner sessions. Benjamin Thaut's 2016 talk about shared 
libraries is one such example. Indeed every single year has 
brought at least one (but usually more) talk that opened up 
some new line of investigation for the dinner discussions.


I thought it was pretty obvious from my original post, since I 
volunteered to help with the pre-recorded talks, but the idea is 
to have pre-recorded talks no matter whether DConf is held in a 
central location or not.


Since both of these alternatives I suggest are much more about 
in-person interaction, which is what you defend, and the only 
big change I propose is ditching the passive in-person talks, 
which you do not write a single word in your long post 
defending, I'm scratching my head about what you got out of my 
original post.


There is much more to the conference than just a 4-day meetup 
with talks. The idea that it's just the core 8-15 people with 
a bunch of hangers-on is patently false. It's not about the 
conversations I have with the "core" people. It's 
Schveighoffer, or Atila, or Jonathan, or any of a long list 
of people who are interested enough in coming. Remember these 
people self-selected to invest non-trivial treasure to be 
there; they are ALL worthy of conversing with.


Since both my mooted alternatives give _much more_ opportunity 
for such interaction, I'm again scratching my head at your 
reaction.




This is untrue. See responses further down.


It is true. You merely prefer certain interaction for yourself to 
the overall interaction of the community.


Is it a "mini-vacation"? Yea, sure, for my wife. For her it's 
a four day shopping spree in Europe. For me it's four days of 
wall-to-wall action that leaves me drop-dead exhausted at the 
end of the day.


So it's the talks that provide this or the in-person 
interaction? If the latter, why are you arguing against my 
pushing for more of it and ditching the in-person talks?




It's everything. The talks, the coding, the talking, the 
drinking. All of it has some social component I find valuable.


Please try to stay on the subject. Nobody's talking about getting 
rid of coding/talking/drinking, in fact, the idea is to have 
_more_ time for those, by ditching the in-person talks.


So the relevant info here would be what you find "social" about 
passively watching a talk in person with 100 other people in the 
same room, which as usual, you don't provide.


Every time I see somebody predicting the end of "X" I roll my 
eyes. I have a vivid memory of the rise of Skype and 
videoconferencing in the early 2000's giving way to 
breathless media reports about how said tools would kill the 
airlines because people could just meet online for a trivial 
fraction of the price.


People make stupid predictions all the time. Ignoring all such 
"end of" predictions because many predict badly would be like 
ignoring all new programming languages because 99% are bad. 
That means you'd never look at D.


And yes, some came true: almost nobody programs minicomputers 
or buys standalone mp3 players like the iPod anymore, compared 
to how many used to at their peak.




Sure, but the predictions about videoconferencing have yet to 
come true. As told by the data itself. The travel industry is 
setting new records yearly in spite of videoconferencing. 
That's not conjecture or opinion, go look for yourself. As I 
have previously suggested, the stock prices and order-books of 
Airbus and Boeing are at record highs. Airplanes are more 
packed than ever (called load-factor). For example, Delta's 
system-wide load-factor was 85.6% last year, which means that 
85.6% of all available seats for the entire year were occupied. 
(Source: 
https://www.statista.com/statistics/221085/passenger-load-factor-of-delta-air-lines/). Airlines are delivering entire planes for business travelers.


All of 

Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 16:10:20 UTC, Johannes Loher wrote:

On Tuesday, 2 October 2018 at 15:42:20 UTC, Joakim wrote:
On Tuesday, 2 October 2018 at 15:03:45 UTC, Adam D. Ruppe 
wrote:
That is what Joakim is talking about - changing the main 
event to be more like the after-hours stuff everyone loves so 
much, to actually use all the time to maximize the potential 
of in-person time.


I'm talking about growing two different qualities much more, 
with my two suggested alternatives to the DConf format.


1. Ditch the talks, focus on in-person interaction. That's why 
I suggest having almost no talks, whether at a central DConf 
or not. You clearly agree with this.


2. Decentralize the DConf location, casting a much wider net 
over many more cities. Walter and Adam could rent a room and 
set up a Seattle DConf location, Andrei and Steven in Boston, 
Ali and Shammah in the bay area, and so on (only illustrative, 
I'm not imposing this on any of these people). Some of the 
money that went to renting out a large conference room in 
Munich can instead be spent on these much smaller rooms in 
each city.


Charge some minimal fee for entrance in some locations, if 
that means they can spend time with W and to cover costs. I 
wouldn't charge anything more than $2 in my city for my event, 
as event organizers here have found that that's low enough to 
keep anyone who's really interested while discouraging fake 
RSVPs, ie those who have no intent of ever showing up but 
strangely sign up anyway (I know an organizer who says he had 
150 people RSVP for a Meetup here and only 15 showed up).


By keeping travel and ticket costs much lower, you invite much 
more participation.


Obviously my second alternative to DConf listed above wouldn't 
be decentralized at all, only enabling in-person interaction 
at a still-central DConf.


Mix and match as you see fit.


I totally agree with you on your first point, i.e. making DConf 
more interactive. I have had very good experiences with formats 
like open space or barcamp. However, these formats only work if 
people are actually willing to participate and bring in their 
own ideas. Not having anything prepared can in rare cases lead 
to the situation where there is a lack of things to talk about 
(I doubt this would be the case for the D community, but it is 
something to keep in mind).


As long as you plan ahead and compile an online list of stuff to 
work on or discuss in the weeks preceding, I don't see this being 
a problem.


However, I must say I disagree with your second point, i.e. 
decentralising DConf. As many people here have already 
mentioned, DConf is about talking to people. And to me it is 
especially important to talk to lots of different people whom I 
otherwise don’t get the chance to talk to in person. By 
decentralising the conference, we would limit the number of 
different people you can get in touch with directly by a huge 
amount.


I doubt that, it would just be different people you're talking 
to. There are probably three types of current and potential D 
users worth talking about. There's the core team, power users, 
and everybody else, ie casual or prospective users.


A central DConf caters to the first two; almost nobody from the 
largest camp, ie casual/prospective users, is flying out or 
paying $400 to attend. A decentralized DConf tries to get many 
more casual/prospective users and power users who couldn't 
justify traveling so far before, but it has two potential costs:


1. The core team may be spread out and not mostly gathered in one 
spot anymore. That is why I have suggested having them meet 
separately from DConf or at one of the DConf locations earlier in 
this thread.


2. A power user who might have paid to travel to Berlin before 
doesn't get access to the entire core team at once, someone like 
you I'm guessing. I think there's some value there, but I suspect 
it's much less than the value gained from a decentralized DConf.


Just to use myself as an example, last DConf I was able to talk 
to Andrei, Walter, Mike, Ali, Jonathan, Kai and lots of others 
and exchange ideas with them. This would not have been possible 
with a decentralised event (except for the off chance that all 
those people by chance attend the same local „meetup“).


Yes, but what did the D ecosystem concretely get out of it? Is 
it worth the hundreds of people who might have met them at 
decentralized DConf locations in Boston/SV/Seattle/Berlin not 
meeting them last year?


That's the kind of tough-minded calculation that needs to be made.

On the other hand, I have to admit that decentralising the 
event would open it up for a much bigger audience, which 
definitely is a good idea. However, I would much prefer to have 
something like a main DConf and if there are enough interested 
people in an area who will not go to the main event, they can 
host their own mini conference and watch streams, make their 
own small workshops etc. This is 

Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 15:03:45 UTC, Adam D. Ruppe wrote:
That is what Joakim is talking about - changing the main event 
to be more like the after-hours stuff everyone loves so much, 
to actually use all the time to maximize the potential of 
in-person time.


I'm talking about growing two different qualities much more, with 
my two suggested alternatives to the DConf format.


1. Ditch the talks, focus on in-person interaction. That's why I 
suggest having almost no talks, whether at a central DConf or 
not. You clearly agree with this.


2. Decentralize the DConf location, casting a much wider net over 
many more cities. Walter and Adam could rent a room and set up a 
Seattle DConf location, Andrei and Steven in Boston, Ali and 
Shammah in the bay area, and so on (only illustrative, I'm not 
imposing this on any of these people). Some of the money that 
went to renting out a large conference room in Munich can instead 
be spent on these much smaller rooms in each city.


Charge some minimal fee for entrance in some locations, if that 
means they can spend time with W and to cover costs. I wouldn't 
charge anything more than $2 in my city for my event, as event 
organizers here have found that that's low enough to keep anyone 
who's really interested while discouraging fake RSVPs, ie those 
who have no intent of ever showing up but strangely sign up 
anyway (I know an organizer who says he had 150 people RSVP for a 
Meetup here and only 15 showed up).


By keeping travel and ticket costs much lower, you invite much 
more participation.


Obviously my second alternative to DConf listed above wouldn't be 
decentralized at all, only enabling in-person interaction at a 
still-central DConf.


Mix and match as you see fit.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 14:49:31 UTC, bachmeier wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:

"Once the videos are all up, set up weekend meetups in several 
cities [all over the world], where a few livestreamed talks 
may take place if some speakers don't want to spend more time 
producing a pre-recorded talk, but most time is spent like the 
hackathon, discussing various existing issues from bugzilla in 
smaller groups or brainstorming ideas, designs, and libraries 
for the future."


I can set up an event like this in my city, where AFAIK nobody 
uses D, so most of it would be geared towards introducing them 
to the language.


I estimate that you could do ten times better at raising 
awareness and uptake with this approach than the current DConf 
format, by casting a much wider net, and it would cost about 
10X less, ie you get two orders of magnitude better bang for 
the buck.


I think this is something that could be done *in addition to* 
DConf.


It depends what you mean by that. If DConf keeps running as it 
has, as you suggest below, but you simply add some satellite 
meetups around it in other cities watching the livestreamed talks 
from the main DConf, then you have addressed some of these 
concerns, but not very much.


If you go the decentralized approach I suggested, but maybe pick 
one of those locations as the one the core team goes to and don't 
do almost any in-person talks anywhere, that would address much 
more.


I honestly don't think DConf is very effective at promoting D, 
except perhaps to a small sliver of the overall population of 
programmers, due to the content of most of the presentations.


I agree. I'll go farther and say that it's a small sliver of 
existing D programmers too who get much value out of it.


{This is not intended to be a criticism or a statement that 
anything about DConf should be changed.}


Heh, of course it's a criticism and of course it should be 
changed. :)


I believe it would be a mistake to drop DConf. If we did that, 
the story that would be told is "D couldn't even support its 
own conference. Use Rust or Go or Julia instead." Our view 
would be "we're on the cutting edge" but everyone else's view 
would be "the language is dying".


Great. Everybody thought Apple was nuts when they released a $500 
iPhone in 2007, now Ballmer wishes he'd come up with the idea:


https://www.macrumors.com/2016/11/07/former-microsoft-ceo-steve-ballmer-wrong-iphone/

As long as you communicate that you're replacing one DConf 
location with several and why you're doing it, I don't see why we 
should care how they end up interpreting it. Our goal is to get 
users and adoption, not to look good to other 
programming-language developers.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 10:37:44 UTC, Nicholas Wilson wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:

[...]


As I'm sure has been said before, if it were just the talks it 
probably wouldn't be worth it. But conferences are sooo 
much more than just the talks. It's the conversations over 
breakfast/lunch/dinner, between talks and long into the night 
(sometimes too long). It's the networking, the hacking, the face 
to face. The talks are usually pretty good too.


The conference is definitely not dead; I'm going to one in San 
José in 2 weeks. Sure, the talks look really interesting, but the 
main reason is to talk to other people to get stuff done.


Then I'm not sure why you're saying any of this to me, as almost 
nothing you write contradicts anything I wrote.


If you're still not sure what I mean, read this long post I just 
wrote fisking Adam's similar post:


https://forum.dlang.org/post/eoygemytghynpogvl...@forum.dlang.org


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 09:39:14 UTC, Adam Wilson wrote:

On 10/1/18 11:26 PM, Joakim wrote:

[snip]


I disagree.


It is not clear what you disagree with, since almost nothing you 
say has any bearing on my original post. To summarize, I suggest 
changing the currently talk-driven DConf format to either


1. a more decentralized collection of meetups all over the world, 
where most of the talks are pre-recorded, and the focus is more 
on introducing new users to the language or


2. at least ditching most of the talks at a DConf still held at a 
central location, maybe keeping only a couple panel discussions 
that benefit from an audience to ask questions, and spending most 
of the time like the hackathon at the last DConf, ie actually 
meeting in person.


Since both of these alternatives I suggest are much more about 
in-person interaction, which is what you defend, and the only big 
change I propose is ditching the passive in-person talks, which 
you do not write a single word in your long post defending, I'm 
scratching my head about what you got out of my original post.


There is much more to the conference than just a 4-day meetup 
with talks. The idea that it's just the core 8-15 people with a 
bunch of hangers-on is patently false. It's not about the 
conversations I have with the "core" people. It's 
Schveighoffer, or Atila, or Jonathan, or any of a long list of 
people who are interested enough in coming. Remember these 
people self-selected to invest non-trivial treasure to be 
there, they are ALL worthy of conversing with.


Since both my mooted alternatives give _much more_ opportunity 
for such interaction, I'm again scratching my head at your 
reaction.


Is it a "mini-vacation"? Yea, sure, for my wife. For her it's a 
four day shopping spree in Europe. For me it's four days of 
wall-to-wall action that leaves me drop-dead exhausted at the 
end of the day.


So it's the talks that provide this or the in-person interaction? 
If the latter, why are you arguing against my pushing for more of 
it and ditching the in-person talks?


Every time I see somebody predicting the end of "X" I roll my 
eyes. I have a vivid memory of the rise of Skype and 
videoconferencing in the early 2000's giving way to breathless 
media reports about how said tools would kill the airlines 
because people could just meet online for a trivial fraction of 
the price.


People make stupid predictions all the time. Ignoring all such 
"end of" predictions because many predict badly would be like 
ignoring all new programming languages because 99% are bad. That 
means you'd never look at D.


And yes, some came true: almost nobody programs minicomputers or 
buys standalone mp3 players like the iPod anymore, compared to 
how many used to at their peak.


However, it's 2018 and the airlines are reaping record profits 
on the backs of business travelers (ask me how I know). 
Airlines are even now flying planes with NO standard economy 
seats for routes that cater specifically to business travelers 
(e.g. Singapore Airlines A350-900ULR). The order books (and 
stock prices) of both Airbus and Boeing are at historic highs.


You know what is much higher? Business communication through 
email, video-conferencing, online source control, etc. that 
completely replaced old ways of doing things like business travel 
or sending physical packages. However, business travel might 
still be up- I don't know as I haven't seen the stats, and you 
provide nothing other than anecdotes- because all that virtual 
communication might have enabled much more collaboration and 
trade that also grew business travel somewhat.


There are more conferences, attendees, and business travelers 
than there have ever been in history, in spite of the great 
technological leaps in videoconferencing technology in the past 
two decades.


The market has spoken. Reports of the death of 
business/conference travel have been greatly exaggerated.


You are conflating two completely different markets here, 
business versus conference travel. Regarding conferences, your 
experience contradicts that of the iOS devs in the post I linked 
and the one he links as evidence, where that blogger notes 
several conferences that have shut down. In your field, it is my 
understanding that MS has been paring back and consolidating 
their conferences too, though I hardly follow MS at all.


The reason for this is fundamental to human psychology and, as 
such, is unlikely to change in the future. Humans are social 
animals, and no matter how hard we have tried, nothing has been 
able to replace the face-to-face meeting for getting things 
done. Be it the conversations we have over beers after the 
talks, or the epic number of PR's that come out the hackathon, 
or even mobbing the speaker after a talk.


It is funny that you say this on a forum where we're 
communicating despite never having met "face-to-face," discussing 
a language where 99.999% of the work is done 

Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d
On Tuesday, 2 October 2018 at 08:21:11 UTC, maarten van damme 
wrote:
While I have never attended DConf itself, conferences themselves 
usually aren't about the talks but about the people you meet 
and get to interact with.


Since this thread is about replacing the outdated DConf format 
with two possible in-person formats that feature _more_ 
interpersonal interaction, I have no idea why you're making this 
point to me.


On Tuesday, 2 October 2018 at 08:56:36 UTC, bauss wrote:

On Tuesday, 2 October 2018 at 07:32:58 UTC, Joakim wrote:
Ex. for D conf there is much more than just D. There is also 
the minor escape from reality to new surroundings, like a 
mini vacation etc.


Thank you for making clear that the real reason you and some 
others like the current format is because you want to have a 
fun "vacation"- as I pointed out in that earlier thread- 
rather than anything to do with D or advancing the ecosystem.


Thank you for not reading everything I said and quoting 
literally only the last 5 words; I said it's also that, but not 
entirely.


Everything you wrote before that I addressed with a separate 
comment which you didn't cut-n-paste, maybe you missed that too.


As for this justification, the only reason you gave is that it's 
an "escape from reality"/"mini vacation", along with hand-waving 
about "much more." I can't address reasons you never gave.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 08:08:38 UTC, Gary Willoughby wrote:

On Tuesday, 2 October 2018 at 07:32:58 UTC, Joakim wrote:
Thank you for making clear that the real reason you and some 
others like the current format is because you want to have a 
fun "vacation"- as I pointed out in that earlier thread- 
rather than anything to do with D or advancing the ecosystem.


Yes, please let's not have any fun at Dconf this year!!! /s


Then why are you sitting around listening to boring tech talks on 
your "super-fun" vacation? Get W and a bunch of other D devs 
and go on a boat tour of the Greek islands! You'll have a lot 
more fun!!! endSarcasm()


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 07:14:54 UTC, bauss wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:

[...]


I highly disagree with this.

I love conferences and meetups.

It's good socially and a conference is not 100% just about the 
topic it hosts.


I think you didn't read what I wrote, as nowhere did I say not to 
gather people in conferences or meetups, but that the traditional 
conference _format_, as exemplified by previous DConfs, is a 
waste of time.


Ex. for D conf there is much more than just D. There is also 
the minor escape from reality to new surroundings, like a mini 
vacation etc.


Thank you for making clear that the real reason you and some 
others like the current format is because you want to have a fun 
"vacation"- as I pointed out in that earlier thread- rather than 
anything to do with D or advancing the ecosystem.


Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d
I'm sure some thought and planning is now going into the next 
DConf, so I'd like to make sure people are aware that the 
conference format that DConf uses is dying off, as explained here:


https://marco.org/2018/01/17/end-of-conference-era

There was a discussion about this in a previous forum thread:

https://forum.dlang.org/post/bnbldtdfeppzjuthx...@forum.dlang.org

Jonathan and Mike argue in that thread that DConf is great for 
the core team to get together in person and hash things out for D 
with very high-bandwidth interaction, but I pointed out that 
doesn't justify 95%+ of the attendees being there. If there's a 
real need for this, maybe get those 8-15 people together in an 
online video conference or offline retreat, without a bunch of 
hangers-on and talks.


People are now experimenting with what replaces conferences, we 
should be doing that too. I came up with some ideas in that 
thread:


"Have most talks prerecorded by the speaker on their webcam or 
smartphone, which produce excellent video these days with not 
much fiddling, and have a couple organizers work with them to get 
those home-brewed videos up to a certain quality level, both in 
content and presentation, before posting them online."


I volunteer to help presenters do this.

"Once the videos are all up, set up weekend meetups in several 
cities [all over the world], where a few livestreamed talks may 
take place if some speakers don't want to spend more time 
producing a pre-recorded talk, but most time is spent like the 
hackathon, discussing various existing issues from bugzilla in 
smaller groups or brainstorming ideas, designs, and libraries for 
the future."


I can set up an event like this in my city, where AFAIK nobody 
uses D, so most of it would be geared towards introducing them to 
the language.


I estimate that you could do ten times better at raising 
awareness and uptake with this approach than the current DConf 
format, by casting a much wider net, and it would cost about 10X 
less, ie you get two orders of magnitude better bang for the buck.


At the very least, DConf should just be a big hackathon of 
self-organizing groups, rather than wasting any time passively 
imbibing talks next to a hundred other people. I still don't 
think the cost of getting a hundred people in the same room for 
3-4 days would be justified, but at least it would be a step in 
the right direction.


Re: Funny way to crash dmd and brick the whole computer

2018-10-01 Thread Joakim via Digitalmars-d

On Friday, 28 September 2018 at 11:58:25 UTC, Zardoz wrote:

CTFE fib:

module fib_cte;
import std.stdio;

// Naive doubly-recursive Fibonacci: O(2^n) calls.
long fib(long n) {
    if (n <= 1) return 1;
    return fib(n - 1) + fib(n - 2);
}

// Forces the compiler to evaluate fib(46) at compile time (CTFE).
static immutable valueFib = fib(46);

void main() {
    writeln(valueFib);
}


I tried it on Android with LDC, it eventually just kills the 
process. You need to get a real OS. ;)
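For what it's worth, the blow-up above comes from the doubly-recursive algorithm rather than from CTFE as such: an iterative rewrite (my own sketch, not from the thread; `fibFast` is a made-up name) evaluates the same `fib(46)` at compile time instantly.

```d
import std.stdio;

// Iterative Fibonacci with the same convention as the original
// (fib(0) == fib(1) == 1): O(n) work instead of O(2^n) calls.
long fibFast(long n) {
    long a = 1, b = 1;
    foreach (_; 0 .. n) {
        immutable t = a + b;
        a = b;
        b = t;
    }
    return a;
}

// CTFE now finishes immediately instead of exhausting memory.
static immutable valueFib = fibFast(46);

void main() {
    writeln(valueFib); // 2971215073
}
```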


Re: DlangUI and android

2018-09-24 Thread Joakim via Digitalmars-d-learn

On Monday, 10 September 2018 at 09:19:52 UTC, Josphe Brigmo wrote:
Is there an emulator that can run the apks? Android emulator 
does not work, I suppose, because it isn't java. Complains 
about a missing classes.dex file.


It isn't clear what you're trying to do: you're trying to run a D 
apk compiled for ARM in an Android/x86 emulator? That won't work.



I'd rather have an emulator version if possible for quicker dev.


I've just been trying out the Anbox container for linux, which 
allows you to run Android apps on the same CPU as your linux 
distro, so Android/x64 apps on my linux/x64 VPS:


https://anbox.io

However, 15 modules from Phobos have tests that are failing or 
segfaulting, as it's the first time I tried Android/x64. I hope 
to have it working better by the next ldc 1.12 beta, maybe you 
can try it then.


I'll also note that you can get pretty quick turnaround for 
actual Android hardware by enabling USB debugging or WiFi adb 
access on your device. But if you need to do some touch or GUI 
testing with the screen and don't want to manually handle the 
hardware each time, you'll either have to use an ARM emulator or 
wait for the Android/x64 support to get better.


Re: Webassembly TodoMVC

2018-09-23 Thread Joakim via Digitalmars-d-announce
On Saturday, 22 September 2018 at 19:51:48 UTC, Sebastiaan Koppe 
wrote:

On Saturday, 22 September 2018 at 14:54:29 UTC, aberba wrote:

[...]


Currently the whole thing is not so developer-friendly, it was 
just the easiest way for me to get it up and running.


[...]


Vladimir mentioned that there's a Musl port to wasm, have you 
tried it?


https://github.com/jfbastien/musl

Druntime and ldc support Musl.


Re: Updating D beyond Unicode 2.0

2018-09-21 Thread Joakim via Digitalmars-d

On Friday, 21 September 2018 at 20:25:54 UTC, Walter Bright wrote:
When I originally started with D, I thought non-ASCII 
identifiers with Unicode was a good idea. I've since slowly 
become less and less enthusiastic about it.


First off, D source text simply must (and does) fully support 
Unicode in comments, characters, and string literals. That's 
not an issue.


But identifiers? I haven't seen hardly any use of non-ascii 
identifiers in C, C++, or D. In fact, I've seen zero use of it 
outside of test cases. I don't see much point in expanding the 
support of it. If people use such identifiers, the result would 
most likely be annoyance rather than illumination when people 
who don't know that language have to work on the code.


Extending it further will also cause problems for all the tools 
that work with D object code, like debuggers, disassemblers, 
linkers, filesystems, etc.


To wit, Windows linker error with Unicode symbol:

https://github.com/ldc-developers/ldc/pull/2850#issuecomment-422968161


Absent a much more compelling rationale for it, I'd say no.


I'm torn. I completely agree with Adam and others that people 
should be able to use any language they want. But the Unicode 
spec is such a tire fire that I'm leery of extending support for 
it.


Someone linked this Swift chapter on Unicode handling in an 
earlier forum thread, read the section on emoji in particular:


https://oleb.net/blog/2017/11/swift-4-strings/

I was laughing out loud when reading about composing "family" 
emojis with zero-width joiners. If you told me that was a tech 
parody, I'd have believed it.


I believe Swift just punts their Unicode support to ICU, like 
most any other project these days. That's a horrible sign, that 
you've created a spec so grotesquely complicated that most 
everybody relies on a single project to not have to deal with it.
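To make the emoji point concrete, here's a small D sketch (my own illustration, not from the thread) showing how many code units and code points sit underneath a single ZWJ-composed "family" emoji; whether it renders as one glyph is entirely up to the font and renderer.

```d
import std.range : walkLength;
import std.stdio;

void main() {
    // man + zero-width joiner + woman + ZWJ + boy: displayed as one
    // "family" glyph when the renderer supports it, but five Unicode
    // code points (and 18 UTF-8 code units) underneath.
    string family = "\U0001F468\u200D\U0001F469\u200D\U0001F466";
    writeln(family.length);     // 18 UTF-8 code units
    writeln(family.walkLength); // 5 code points
}
```

Counting it as a single grapheme cluster requires the full UAX #29 segmentation rules, which is exactly the sort of complexity that gets delegated to ICU.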


Re: Jai compiles 80,000 lines of code in under a second

2018-09-20 Thread Joakim via Digitalmars-d

On Friday, 21 September 2018 at 00:47:27 UTC, Adam D. Ruppe wrote:
Of course, D can also take ages to compile one line of code. It 
all depends on that that line is doing... ctfe and templates 
are slow. C or Java style code compiling in D is very fast.


I was going to say this too, ie how much of that Jai code is run 
at compile-time, how much of it is uninstantiated templates that 
are just skipped over as D does, and how many templates are 
instantiated many times? Lines of code is not a good enough 
measure with those programming constructs.


I was just building the stdlib tests with LDC yesterday and they 
took so much memory on a new Linux/x64 VPS with 2GB of RAM that I 
had spun up that I couldn't even ssh in anymore. I eventually had 
to restart the VPS and add a swapfile, which I usually have but 
simply hadn't bothered with yet for this new Ubuntu 18.04 VPS. 
The stdlib tests instantiate a ton of templates.
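A minimal illustration of the distinction (my example, not from the thread): a template body is only parsed until it is instantiated, and each distinct instantiation then pays its own semantic-analysis and codegen cost.

```d
// Never instantiated: the body is merely parsed, so even this
// bogus method call costs nothing beyond parsing.
T unused(T)(T x) { return x.thisMethodDoesNotExist(); }

// Each distinct argument type below produces a separate
// instantiation, analyzed and compiled on its own.
T twice(T)(T x) { return x + x; }

void main() {
    auto a = twice(1);    // instantiates twice!int
    auto b = twice(1.5);  // instantiates twice!double
    auto c = twice(1L);   // instantiates twice!long
}
```

So a line count that lumps together uninstantiated, once-instantiated, and many-times-instantiated template code says little about actual compile time.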


dub auto-tester

2018-09-19 Thread Joakim via Digitalmars-d
On Thursday, 20 September 2018 at 04:16:41 UTC, Neia Neutuladh 
wrote:
On Thursday, 20 September 2018 at 02:51:52 UTC, Neia Neutuladh 
wrote:
On Monday, 10 September 2018 at 01:27:20 UTC, Neia Neutuladh 
wrote:
Not on dlang.org anywhere, but I built a crude version of 
this. Results are available at http://ikeran.org/report/.


A quick status update:


And source code is available at 
https://git.ikeran.org/dhasenan/dubautotester


Please don't judge me.


Nice, what will it take to get this integrated with the official 
dub website?


BTW, the gitea self-hosted github-workalike you're using looks 
nice, too bad it's written in Go. ;)


Re: LLVM 7.0.0 no mention of D anymore

2018-09-19 Thread Joakim via Digitalmars-d-announce
On Wednesday, 19 September 2018 at 13:10:07 UTC, Daniel Kozak 
wrote:

http://releases.llvm.org/7.0.0/docs/ReleaseNotes.html#external-open-source-projects-using-llvm-7

no mention of D anymore :(

http://releases.llvm.org/6.0.0/docs/ReleaseNotes.html#external-open-source-projects-using-llvm-6

http://releases.llvm.org/5.0.0/docs/ReleaseNotes.html#external-open-source-projects-using-llvm-5

http://releases.llvm.org/4.0.0/docs/ReleaseNotes.html#external-open-source-projects-using-llvm-4-0-0


I think Kai used to make sure ldc worked with every LLVM release 
and notified them to mention ldc. I don't think he's had time to 
do much with ldc lately, so maybe it slipped through.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-18 Thread Joakim via Digitalmars-d
On Tuesday, 18 September 2018 at 18:06:37 UTC, Neia Neutuladh 
wrote:

On Tuesday, 18 September 2018 at 07:53:31 UTC, Joakim wrote:
On Monday, 17 September 2018 at 22:27:41 UTC, Neia Neutuladh 
wrote:

On Monday, 17 September 2018 at 15:47:14 UTC, Joakim wrote:
Not sure why that matters if you agree with Kay that HTML is 
an abortion? :) I actually think it's great that mobile is 
killing off the web, as the Comscore usage stats I linked 
earlier show.


HTML is a somewhat open standard. I'm more happy with HTML 
and Javascript, as ugly as they are and as dominated by 
Google and Microsoft as they are, than having to target 
private companies' frameworks.


So you'd rather target an incredibly badly designed open 
standard than a mostly open source "private company's" 
framework that's certainly not great, but much better? It's no 
contest for me, give me the latter any day. And then of 
course, there's always cross-platform OSS toolkits like 
Flutter or DlangUI.


Thinking about it a bit more, the openness of the platform is 
more important. Android and iOS are effectively closed 
platforms. You *can* sideload apps, but it's rare to find 
someone willing to do so. If you're not on the app stores, your 
app isn't going to get a thousandth as much traction.


I'll note that you wrote "app stores," and for Android there are 
actually multiple. There's the official Play Store from Google, 
the Amazon appstore, app stores for OSS apps like F-Droid or 
Fossdroid, and over 400 app stores in China, where those first 
two app stores are almost never used:


https://www.appinchina.co/market/app-stores/

Anyone can install any app store on their Android device and get 
any apps they want, though as you note, most outside China just 
go with the pre-installed Play or Amazon store.


Windows, on the other hand, has long been an open platform; you 
can develop for it and publish your programs and Microsoft 
won't get in the way.


Though that is now changing with their new UWP platform, which by 
default must be installed from their own app store, the Microsoft 
Store. The link for the Windows/AArch64 device in my original 
post notes that they expect most Windows/AArch64 apps to be UWP 
apps, and so you'd get them from an app store just like Android 
most of the time. I read that they do have similar allowances for 
side-loading UWP apps as Android though, and of course older 
win32/64 apps on normal Wintel devices aren't affected by this.


So an open source cross-platform toolkit controlled by a single 
entity isn't bad. I use GTK+ a lot, for instance. But the web 
and HTML is a better situation than Android and iOS and their 
toolkits.


I don't think the app stores are that big a deal as long as 
side-loading and multiple app stores are always allowed. Of 
course, that's not the case on iOS, one of the many reasons I've 
never really used an iOS device.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-18 Thread Joakim via Digitalmars-d
On Monday, 17 September 2018 at 22:27:41 UTC, Neia Neutuladh 
wrote:

On Monday, 17 September 2018 at 15:47:14 UTC, Joakim wrote:
Not sure why that matters if you agree with Kay that HTML is 
an abortion? :) I actually think it's great that mobile is 
killing off the web, as the Comscore usage stats I linked 
earlier show.


HTML is a somewhat open standard. I'm more happy with HTML and 
Javascript, as ugly as they are and as dominated by Google and 
Microsoft as they are, than having to target private companies' 
frameworks.


So you'd rather target an incredibly badly designed open standard 
than a mostly open source "private company's" framework that's 
certainly not great, but much better? It's no contest for me, 
give me the latter any day. And then of course, there's always 
cross-platform OSS toolkits like Flutter or DlangUI.


On Monday, 17 September 2018 at 23:42:03 UTC, Dave Jones wrote:

On Monday, 17 September 2018 at 15:47:14 UTC, Joakim wrote:

On Sunday, 16 September 2018 at 15:41:41 UTC, tide wrote:

On Sunday, 16 September 2018 at 15:11:42 UTC, Joakim wrote:

I say that almost 30% drop in PC sales over the last 7


Might be, but so is trying to convince everyone your 
predictions are correct so they will focus their work on the 
issues important to you.


Not at all, because if my predictions are correct, this 
language will disappear along with the PC platform it's built 
on. And I've never suggested anybody work on anything 
"important to [me]," my original post even stated that D may 
never do well on mobile.


You are making your arguments to fit your desires.


I can't make heads or tails of this claim; you have a talent for 
vague non sequiturs. My arguments are based on data, the 
overwhelming sales numbers I linked. I have no idea what desires 
you think are involved; I suspect you don't either. :)


Plateaus almost never happen, it's not the natural order of 
things.


OK the market stabilises.


I don't see how you changing the word you used changes 
anything about the underlying phenomenon: that doesn't happen.


You're seriously suggesting that markets never stabilise, say 
oil prices stay steady for a few years or some such?


Prices flit all over the place, that's not what we're talking 
about. Oil _production_ has been remarkably consistent and 
growing for many, many decades:


https://commons.m.wikimedia.org/wiki/File:Crude_NGPL_IEAtotal_1960-2008.svg

The only hiccup was in the early '80s because of extraordinary 
measures taken by governments, like price controls and cartel 
orders, which was still only a 15% drop.


If oil production ever drops 30% because some workable substitute 
comes along, as has happened to PCs now, yes, there is no chance 
of stabilization. It will be a steady decline from there, as 
these trends have a kind of momentum.


Most households have more devices than ever before, and 
hardware is only getting cheaper. The idea that people will 
have to choose just one device is plainly wrong.


You need to get out in the world a bit more. The majority of 
smartphones these days are bought in emerging markets where 
_nobody in their home has ever owned a PC or used the 
internet_. I've talked to these working stiffs in developing 
markets, you clearly haven't.


And what happens when the emerging markets mature? Do they 
still just cling on to one smart phone in the house? Or are 
they yearning for more technology?


They buy more mobile devices, the PC will be long since dead and 
gone.


I find it strange that you think the PC won't also be rolled 
up by mobile like this.


Can you put a 3GB hard drive in your phone?


Why would I ever want to do this when I noted my phone has 128 
GBs of space? ;) If you mean 3 _TB_, yes, I simply attach my 
slim 1 TB external drive and back up whatever I want over USB 
3.0.


So you're not averse to having some external hardware sat on 
your desk. Hmmm.


My original post links to examples of using your smartphone 
connected to a keyboard and monitor or a laptop shell, so I'm not 
sure where you ever got the idea I was against "external 
hardware."



Or a high end graphics card?


Smartphones come with very powerful graphics cards these days, 
plenty powerful enough to drive lots of graphic loads.


Not if you're into high end gaming.


The vast majority of PCs don't have cards capable of that either. 
For the few who want it, there will be specialized solutions, 
whether consoles or whatever.



Or a soundcard with balanced outputs?


Some phones come with high-end DACs and the like, or you could 
always attach something externally if you really needed to.


There's no such thing as professional audio breakout box for 
android AFAIK. Up until a few years ago the problem was Android 
couldn't do low latency audio, I'm not sure if the situation 
has changed.


If and when that becomes a market that actually matters, somebody 
will cater to it, just as Google optimized the Android video 
stack for VR a couple years 

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-17 Thread Joakim via Digitalmars-d

On Sunday, 16 September 2018 at 15:41:41 UTC, tide wrote:

On Sunday, 16 September 2018 at 15:11:42 UTC, Joakim wrote:
I say that almost 30% drop in PC sales over the last 7 years 
is mostly due to the rise of mobile.


I think a large part of it is that PCs got fast enough for 
most people about 7-10 years ago. So it was a combination of 
mobile, and people no longer needing to get newer faster 
machines. The upgrade cycle moved from "I need a newer faster 
computer" to "I'll wait till my current system is worn out". 
(For a lot of people anyway)


Sure, that's part of it, but that suggests that once 
smartphones reach that performance threshold, they will 
replace PCs altogether. I think we've reached that threshold 
now.


I feel only looking at sales stats is irrelevant. I know people 
that have lost their phone and just bought a new phone. They 
get stolen a lot more easily. If your screen breaks you are 
better off buying a new phone as the cost of replacing the 
screen is going to be almost as much as a new one. Someone I 
know had to fight his boss to repair his phone cause he didn't 
want a brand new iPhone, he still has an Android device and 
they switched to Apple a while back. Note, it still cost more 
to buy the new phone than repair his old one.


Computers last much longer, I've had the one I have right now 
for 8 years. It runs everything I need it to. Faster than a 
smartphone or tablet, or even most newer laptops still. There's 
no reason to buy a new one, not that I would buy a prebuilt one 
anyways. Which I'm pretty sure are what those sales represent. 
Can't really count a CPU sale as a "PC" sale as it might just 
be someone upgrading from their old PC.


DIY PC sales are estimated at around 50 million a year, they 
don't move the needle compared to mobile sales. And yes, 
smartphones get broken easier and need to be upgraded more often, 
_just as the PC was once a shoddier product than a DEC 
minicomputer_, as Ken Olsen noted.


What _matters_ is that mobile is approaching 10X the sales of 
PCs. That pays for a lot of innovation and upgrades that the PC 
base simply cannot pay for: they just don't have the numbers. 
That is the _same_ way the PC swamped the minicomputer, and 
mobile is now doing it to the PC.


On Sunday, 16 September 2018 at 15:49:33 UTC, tide wrote:
That is, it is not just the performance that affects the sales 
of phones. There's a lot of factors that lead to there being 
new phones sales. Know someone that's gone through 3 phones in 
comparison to just the one I have. Treadmills eat phones for 
breakfast.


You're conflating my two arguments. Performance has nothing to do 
with why mobile sells a lot more already, that's all about 
battery life, mobility, 4G networks, etc. Performance is why 
mobile's about to kill off the PC too, because it's finally 
performant enough.


On Sunday, 16 September 2018 at 22:03:12 UTC, Gambler wrote:
You're right about APKs. Not sure whether it changed since I 
looked into it, or I didn't read the docs correctly in the 
first place. The overall dev/distribution process, though, 
still looks... uh, involved compared to compiling and running 
an executable on PC.


I suspect the 10-15 command-line steps listed there to build a 
GUI app on Android itself are _much less_ work than on any other 
platform, especially since you don't have to install any big SDK 
like VS, Xcode, or Qt where plenty of things can go wrong.


Of course, it can always be made simpler.

In general, I am still convinced of the overall negative effect 
of mobile devices on computing. They are designed to be used 
mostly for consumption and social sharing. They have a lot of 
limitations that currently drag the whole IT ecosystem down.


I think you want to cling to that opinion regardless of the 
evidence.



Some excellent high-level criticisms:

https://www.fastcompany.com/40435064/what-alan-kay-thinks-about-the-iphone-and-technology-now


An interesting interview, thanks for the link. Mostly not about 
mobile, but he seems to think the iPhone was too limiting and 
should have come with a stylus? Neither critique applies to 
Android, which is the vast majority of the mobile market, where 
Termux and the stylus of the Galaxy Note are available, if you 
want them.



http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/


He mostly states the obvious; of course touch is not the future 
of HCI interfaces. He mentions speech as a possibility in the 
addendum linked at the end, and there are people working on it now 
(I can't believe it's been two years since this article was written):


https://arstechnica.com/gadgets/2016/11/google-home-review-a-step-forward-for-hotwords-a-step-backward-in-capability/

That excellent overview notes a problem with discoverability of 
voice commands in Google Home, so they'll have to come up with a 
kind of "manpage" for that. ;)


As for his preferred haptic approach, it's only really suited for 
certain kinds of spatial 

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-16 Thread Joakim via Digitalmars-d

On Sunday, 16 September 2018 at 10:25:30 UTC, Dave Jones wrote:

On Sunday, 16 September 2018 at 04:47:11 UTC, Joakim wrote:

On Sunday, 16 September 2018 at 01:03:27 UTC, Dave Jones wrote:
I know a lot of people who did, which explains the 28% drop 
in PC sales since they peaked in 2011, the year after the 
iPad came out. Many of those people who used to buy PCs have 
switched to tablets and other mobile devices.


Yet PC sales are up this year, mobile is down, and tablet 
sales have fallen for 3 years in a row.


Eh, these are all mostly mature markets now, so slight 
quarterly dips or gains don't matter much anymore. What does 
it matter that PC sales were up 2-3% last quarter when 7 times 
as many smartphones and mobile devices were sold in that same 
quarter?


Some analysts have predicted that PC sales will plateau at some 
point, and if that's where we're at now, then a 30% drop in 
shipments is not the death of the market.


I see no reason why they would plateau, looks like wishful 
thinking to me.


I say that the almost 30% drop in PC sales over the last 7 years 
is mostly due to the rise of mobile.


I think a large part of it is that PCs got fast enough for most 
people about 7-10 years ago. So it was a combination of mobile, 
and people no longer needing to get newer faster machines. The 
upgrade cycle moved from "I need a newer faster computer" to 
"I'll wait till my current system is worn out". (For a lot of 
people anyway)


Sure, that's part of it, but that suggests that once smartphones 
reach that performance threshold, they will replace PCs 
altogether. I think we've reached that threshold now.


And just because there's been a trend for 5 or 6 years doesn't 
mean it will continue so inevitably.


Sure, but these trends almost never reverse. ;)


It doesn't need to reverse for "the PC is dead" to be false.


Plateaus almost never happen, it's not the natural order of 
things.


For example, newspapers hoped their ad revenue had plateaued from 
2000-2005, then they plunged:


https://en.m.wikipedia.org/wiki/File:Naa_newspaper_ad_revenue.svg

I've predicted that a similar plunge is about to happen to PC 
sales.


I actually think most people would prefer a separate desktop 
and mobile device, whether that desktop is just the size of 
pack of cigarettes, or a big box with 5 fans in it.


Why? Given how price-sensitive the vast majority of the 
computer-buying public is - that excludes the Apple sheeple 
who actually seem to get a hard-on from rising iPhone prices, 
all the better for them to show how much money they've lucked 
into by brandishing their "gold" iPhone ;) - I don't see most 
willing to spend twice on two devices, that could be replaced 
by just one. Until recently, they didn't have a choice, as you 
couldn't use your mobile device as a desktop, but the 
just-released devices I linked in the first post in this 
thread are starting to change that.


Because for about £300 you can get an intel NUC system with 
120GB SSD, which is more powerful and more upgradeable than 
your £700 mobile device. And some people still want that. And 
because most people have more than one TV, some have multiple 
phones, phones and tablets, and desktops, and multiple games 
consoles. And they still use them all in different situations.


That's more on the high end, where people use many devices. On 
the low- to mid-end of the market, where most of the sales 
happen, people are happy to buy fewer devices that get the job 
done.


This "one device" thing is your preference and you're 
projecting it onto everyone else.


Looks to me like you're the one projecting here. People used to 
buy standalone mp3 players, GPS devices, point-and-shoot cameras, 
handheld gaming consoles, etc., etc. Sales of all those 
standalone devices have been devastated by the smartphone; here's 
just one example of what happened to camera sales after the 
smartphone took over, which I've linked on this forum before:


https://petapixel.com/2017/03/03/latest-camera-sales-chart-reveals-death-compact-camera/

I find it strange that you think the PC won't also be rolled up 
by mobile like this.


Yes you can bring up examples of people who made mistakes 
predicting the future, but that works both ways. You're just 
as guilty of seeing two points and drawing a straight line 
through them.


Except none of these examples or my own prediction are based 
on simple extrapolation between data points. Rather, we're 
analyzing the underlying technical details and capabilities 
and coming to different conclusions about whether the status 
quo is likely to remain. So I don't think any of us are 
"guilty" of your charge.


Of course you are: you're making predictions and assuming the 
trends will continue, and you assume the technical details are 
all important. I'm saying they are only part of it, that people 
have requirements / preferences beyond how powerful the device 
is. Lots of people were predicting ebooks would kill the real 
book market a 

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-15 Thread Joakim via Digitalmars-d

On Sunday, 16 September 2018 at 01:03:27 UTC, Dave Jones wrote:

On Saturday, 15 September 2018 at 15:25:55 UTC, Joakim wrote:

On Friday, 14 September 2018 at 09:23:24 UTC, Dave Jones wrote:

On Thursday, 13 September 2018 at 22:56:31 UTC, Joakim wrote:

On Thursday, 13 September 2018 at 22:41:08 UTC, Nick


And people don't use PCs for such things? ;)


Sure, but they use them for a bunch of other stuff too. My 
point was that mobile growth has been in the "such things" but 
barely made a dent in the other stuff. So when you see 30% pc 
screen time and 70% mobile, its not a 70% drop in actual time 
spent in front of a PC. It's more a massive growth in time on 
mobile doing mostly banal pointless crap.


Sure, mobile has grown the market for digital entertainment and 
communication much more than taking away the time spent doing 
work on a PC, at least so far.


I know a lot of people who did, which explains the 28% drop in 
PC sales since they peaked in 2011, the year after the iPad 
came out. Many of those people who used to buy PCs have 
switched to tablets and other mobile devices.


Yet PC sales are up this year, mobile is down, and tablet sales 
have fallen for 3 years in a row.


Eh, these are all mostly mature markets now, so slight quarterly 
dips or gains don't matter much anymore. What does it matter that 
PC sales were up 2-3% last quarter when 7 times as many 
smartphones and mobile devices were sold in that same quarter?


More like when computers first started replacing typewriters, 
I'm sure many laughed at that possibility back then too. :)


I'm not laughing at the idea of mobile eating into desktop PC 
share. What I'm saying is that it hasn't done so as much as you 
think.


I say that the almost 30% drop in PC sales over the last 7 years 
is mostly due to the rise of mobile. Not sure what you mean by 
"it hasn't done so as much as you think." You may argue that most 
using PCs aren't using them for entertainment, but this drop 
suggests that at least 30% of them were and have now moved to 
mobile.


And just because there's been a trend for 5 or 6 years doesn't 
mean it will continue so inevitably.


Sure, but these trends almost never reverse. ;)

I actually think most people would prefer a separate desktop 
and mobile device, whether that desktop is just the size of 
pack of cigarettes, or a big box with 5 fans in it.


Why? Given how price-sensitive the vast majority of the 
computer-buying public is - that excludes the Apple sheeple who 
actually seem to get a hard-on from rising iPhone prices, all the 
better for them to show how much money they've lucked into by 
brandishing their "gold" iPhone ;) - I don't see most willing to 
spend twice on two devices, that could be replaced by just one. 
Until recently, they didn't have a choice, as you couldn't use 
your mobile device as a desktop, but the just-released devices I 
linked in the first post in this thread are starting to change 
that.


You've probably heard of the possibly apocryphal story of how 
Blackberry and Nokia engineers disassembled the first iPhone 
and dismissed it because it only got a day of battery life, 
while their devices lasted much longer. They thought the 
mainstream market would care about such battery life as much 
as their early adopters, but they were wrong.


But here's a better story for this occasion, Ken Olsen, the 
head of DEC who built the minicomputers on which Walter got 
his start, is supposed to have disassembled the first IBM PC 
and this was his reaction:


"Ken Olsen bought one of the first IBM PCs and disassembled it 
on a table in Olsen’s office.


'He was amazed at the crappy power supply,' Avram said, 'that 
it was so puny.  Olsen thought that if IBM used such poor 
engineering then Digital didn’t have anything to worry about.'


Clearly Olsen was wrong."
https://www.cringely.com/2011/02/09/ken-olsen-and-post-industrial-computing/

You're making the same mistake as him. It _doesn't matter_ 
what people first use the new tool for, what matters is what 
it _can_ be used for, particularly over time. That time is 
now, as top and mid-range smartphone chips now rival mid- to 
low-end PC CPUs, which is the majority of the market. The 
x86/x64 PC's days are numbered, just as it once killed off the 
minicomputer decades ago.


Yes you can bring up examples of people who made mistakes 
predicting the future, but that works both ways. You're just as 
guilty of seeing two points and drawing a straight line 
through them.


Except none of these examples or my own prediction are based on 
simple extrapolation between data points. Rather, we're analyzing 
the underlying technical details and capabilities and coming to 
different conclusions about whether the status quo is likely to 
remain. So I don't think any of us are "guilty" of your charge.


Re: A Brief Intro to the SAoC Projects

2018-09-15 Thread Joakim via Digitalmars-d-announce

On Saturday, 15 September 2018 at 07:47:46 UTC, Mike Parker wrote:
I've posted to the blog a brief introduction to the projects 
that were selected for the Symmetry Autumn of Code. As the 
event goes on, I hope to provide more details about the 
projects and the individuals working on them.


The blog:
https://dlang.org/blog/2018/09/15/symmetry-autumn-of-code-is-underway/

Reddit:
https://www.reddit.com/r/d_language/comments/9fzrqd/symmetry_autumn_of_code_is_underway/?


Proggit post, I think they'll be interested in knowing what was 
chosen too:


https://www.reddit.com/r/programming/comments/9g2ifo/symmetry_autumn_of_code_is_underway_the_d_blog/


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-15 Thread Joakim via Digitalmars-d

On Friday, 14 September 2018 at 09:23:24 UTC, Dave Jones wrote:

On Thursday, 13 September 2018 at 22:56:31 UTC, Joakim wrote:
On Thursday, 13 September 2018 at 22:41:08 UTC, Nick 
Sabalausky (Abscissa) wrote:

On 09/10/2018 11:13 PM, tide wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
That's why PC sales keep dropping while mobile sales are 
now 6-7X that per year:


This shouldn't be misunderstood as such, which I think you 
are misunderstanding. The reason mobile sales are so high 
is because of planned obsolescence and the walled garden 
that these devices are built around. I've gone through maybe 
3-4 phones in the time that I've had my desktop, and I use 
my desktop every single day. I don't need to buy a new one 
because it runs perfectly fine; there aren't operating system 
updates that purposely cause the CPU to run slower to "save 
battery life" when a new device and OS come out. That's not 
to say it's insignificant, but the sales numbers are 
inflated.


Right. Basically, "sales stats" should never be misconstrued 
as "usage stats".


The usage stats are similarly overwhelming, two-thirds of 
digital time is spent on mobile, more for the young:


Yeah but 90% of the time people spend on mobile is just dicking 
about: sending IMs, Facebook, point-and-click games. And that's 
a huge part of the usage stats; people can now spend more time 
online wasting time in more situations than ever before.


And people don't use PCs for such things? ;) I know a lot of 
people who did, which explains the 28% drop in PC sales since 
they peaked in 2011, the year after the iPad came out. Many of 
those people who used to buy PCs have switched to tablets and 
other mobile devices.


PCs are generally seen as a tool to accomplish tasks, for word 
processing or high-end gaming, audio/video editing; mobile is 
more for entertainment. Not many people are doing what you are 
by using your mobile as a desktop.


I'm not saying that makes mobile worthless, what I'm saying is 
that your hypothesis is like saying TV has taken over from 
typewriters.


More like when computers first started replacing typewriters, I'm 
sure many laughed at that possibility back then too. :)


You've probably heard of the possibly apocryphal story of how 
Blackberry and Nokia engineers disassembled the first iPhone and 
dismissed it because it only got a day of battery life, while 
their devices lasted much longer. They thought the mainstream 
market would care about such battery life as much as their early 
adopters, but they were wrong.


But here's a better story for this occasion, Ken Olsen, the head 
of DEC who built the minicomputers on which Walter got his start, 
is supposed to have disassembled the first IBM PC and this was 
his reaction:


"Ken Olsen bought one of the first IBM PCs and disassembled it on 
a table in Olsen’s office.


'He was amazed at the crappy power supply,' Avram said, 'that it 
was so puny.  Olsen thought that if IBM used such poor 
engineering then Digital didn’t have anything to worry about.'


Clearly Olsen was wrong."
https://www.cringely.com/2011/02/09/ken-olsen-and-post-industrial-computing/

You're making the same mistake as him. It _doesn't matter_ what 
people first use the new tool for, what matters is what it _can_ 
be used for, particularly over time. That time is now, as top and 
mid-range smartphone chips now rival mid- to low-end PC CPUs, 
which is the majority of the market. The x86/x64 PC's days are 
numbered, just as it once killed off the minicomputer decades ago.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-14 Thread Joakim via Digitalmars-d

On Friday, 14 September 2018 at 16:53:16 UTC, Iain Buclaw wrote:
On 14 September 2018 at 09:51, Joakim via Digitalmars-d 
 wrote:
On Wednesday, 12 September 2018 at 22:41:31 UTC, Iain Buclaw 
wrote:


On 12 September 2018 at 10:09, Joakim via Digitalmars-d 
 wrote:


I think their model of having an open ISA with proprietary 
extensions
will inevitably win out for hardware, just as a similar 
model has basically
won already for software, but that doesn't mean that RISC-V 
will be the one

to do it. Someone else might execute that model better.



POWER9 has been making some headway, for instance finally 
they have a sensible real type (IEEE Quadruple).  Though the 
developers working on glibc support seem to be making a 
shambles of it, where they want to support both new and old 
long double types at the same time at run-time!  It seems 
that no one thought about Fortran, Ada, or D when it came to 
long double support in the C runtime library *sigh*.


For us, I think we can choose to ignore the old IBM 128-bit 
float, and so remove any supporting code from our library, 
focusing instead only on completing IEEE 128-bit float 
support (LDC, upstream your local patches before i start 
naming and shaming you).



All the pulls linked from that AArch64 tracker issue above 
were submitted upstream first before merging into the ldc 
repo. Only one patch that I know of hasn't been merged 
upstream yet: my commit to add IEEE Quadruple support to 
core.internal.convert, only because I want to add another 
Android commit to that pull soon, but the patch is available 
in the open druntime pulls.


If you know of some other patches that need to be upstreamed, 
let us know, AFAIK they were all upstreamed first.




Can you send me links to any open PR you have?  These should 
not be sitting around for months without merge.


That's on me: I had another commit in the works for Android 
that's mostly working, but put it aside for the ldc 1.11 release, 
updating the docs on the wiki, and now reworking the Android 
emulated TLS patch for the upcoming LLVM 7 release. Feel free to 
use the commit I submitted here a couple months ago or to review 
it, but I'd like to get that second Android commit in before that 
pull's merged:


https://github.com/dlang/druntime/pull/2257




Re: Mobile is the new PC and AArch64 is the new x64

2018-09-14 Thread Joakim via Digitalmars-d
On Wednesday, 12 September 2018 at 22:41:31 UTC, Iain Buclaw 
wrote:
On 12 September 2018 at 10:09, Joakim via Digitalmars-d 
 wrote:
I think their model of having an open ISA with proprietary 
extensions will inevitably win out for hardware, just as a 
similar model has basically won already for software, but that 
doesn't mean that RISC-V will be the one to do it. Someone 
else might execute that model better.




POWER9 has been making some headway, for instance finally they 
have a sensible real type (IEEE Quadruple).  Though the 
developers working on glibc support seem to be making a 
shambles of it, where they want to support both new and old 
long double types at the same time at run-time!  It seems that 
no one thought about Fortran, Ada, or D when it came to long 
double support in the C runtime library *sigh*.


For us, I think we can choose to ignore the old IBM 128-bit 
float, and so remove any supporting code from our library, 
focusing instead only on completing IEEE 128-bit float support 
(LDC, upstream your local patches before i start naming and 
shaming you).


All the pulls linked from that AArch64 tracker issue above were 
submitted upstream first before merging into the ldc repo. Only 
one patch that I know of hasn't been merged upstream yet: my 
commit to add IEEE Quadruple support to core.internal.convert, 
only because I want to add another Android commit to that pull 
soon, but the patch is available in the open druntime pulls.


If you know of some other patches that need to be upstreamed, let 
us know, AFAIK they were all upstreamed first.


ARM seems to be taking RISC-V seriously at least (this site was 
taken down after a couple days if I understand right: 
http://archive.fo/SkiH0).  There is currently a lot of 
investment going into ARM64 in the server space right now, but 
signals I'm getting from people working on those projects are 
that it just doesn't hold water.  With one comparison being a 
high end ARM64 server is no better than a cheap laptop bought 5 
years ago.


As Kagamin says, it depends on how many cores you're using and 
what benchmark you run, but most of the time, that's not true at 
all:


https://blog.cloudflare.com/arm-takes-wing/

And ARM does it with much less electric power used, as shown in 
that last graph, which you have to take into account when looking 
at the total costs. The ARM blog post I linked earlier in this 
thread shows they've gone ahead with using ARM too.


RISC-V got accepted into gcc-7, and runtime support made it into 
glibc 2.27; there's certainly a lot of effort being pushed for 
it. They have excellent simulator support on qemu; porting 
druntime only took two days. Patches for RISCV64 will come soon, 
probably with some de-duplication of large blocks.


Great, but it's still in very nascent stages, with Linux only 
running on it this year. I thought about using Qemu but figured 
the slowness and possible hardware compatibility issues weren't 
worth it.


I hope some open arch like these takes off sometime soon, as I 
don't like an ARM monopoly much better than the previous Intel 
one, but it's going to take awhile for POWER/RISC-V to get 
anywhere close.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-13 Thread Joakim via Digitalmars-d
On Thursday, 13 September 2018 at 22:41:08 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/10/2018 11:13 PM, tide wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
That's why PC sales keep dropping while mobile sales are now 
6-7X that per year:


This shouldn't be misunderstood as such, which I think you are 
misunderstanding. The reason mobile sales are so high is 
because of planned obsolescence and the walled garden that 
these devices are built around. I've gone through maybe 3-4 
phones in the time that I've had my desktop, and I use my 
desktop every single day. I don't need to buy a new one because 
it runs perfectly fine; there aren't operating system updates 
that purposely cause the CPU to run slower to "save battery 
life" when a new device and OS come out. That's not to say it's 
insignificant, but the sales numbers are inflated.


Right. Basically, "sales stats" should never be misconstrued as 
"usage stats".


The usage stats are similarly overwhelming, two-thirds of digital 
time is spent on mobile, more for the young:


https://www.searchforce.com/blog/the-comscore-u-s-mobile-app-report-2017/

I went all-mobile three years ago, haven't looked back.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-12 Thread Joakim via Digitalmars-d

On Wednesday, 12 September 2018 at 15:38:36 UTC, Joakim wrote:

the world is right now? It's not IBM, Apple,


Whoops, meant to write Intel here, but wrote Apple again. :D


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-12 Thread Joakim via Digitalmars-d

On Wednesday, 12 September 2018 at 06:41:38 UTC, Gambler wrote:

On 9/10/2018 9:43 AM, Joakim wrote:
Yes, I know, these devices won't replace your quad-core Xeon 
workstation with 32-64 GBs of RAM anytime soon, but most 
people don't need anywhere near that level of compute. That's 
why PC sales keep dropping while mobile sales are now 6-7X 
that per year:
I'm all for supporting modern open CPU architectures. At the 
same time,
I fear that the specific trend you're describing here (people 
ditching
PCs for cellphones/tablets) is effectively a reversal of the PC 
revolution.


For the last 30+ years people benefited from "trickle down 
computing". They had access to PCs that were equivalent to 
cutting edge servers of 6-7 years prior. They had ability to 
choose their operating system, expand and upgrade their 
hardware and install any software they wanted.


All of this is breaking down right now.


Yes and no, it is true that that is the way tech  _used_ to 
diffuse. However, do you know what the largest tech company in 
the world is right now? It's not IBM, Apple, HP, or Microsoft, ie 
none of the server or PC companies. It's Apple, which doesn't 
sell into the server or traditional enterprise markets almost at 
all and only has 15-20% unit share in the mobile market.


In other words, consumer tech markets are _much_ larger than the 
server/enterprise markets that used to lead tech R&D, which means 
consumer tech like mobile is what leads the way now.


As for choosing your own OS, that's still possible, but as 
always, it can be tough to get drivers for your hardware:


https://together.jolla.com/question/136143/wiki-available-devices-running-sailfish-os/

And if you simply want to tinker with the Android OS on your 
device, there are many ways to do that:


https://www.xda-developers.com/how-to-install-custom-rom-android/

No need to expand and upgrade your hardware when prices keep 
dropping in this Darwinian market. There's now a $500 phone with 
a faster chip than the one I got just 7 months back for $700:


https://m.newegg.com/products/N82E16875220078

As for installing any software you want, Android allows it: it's 
how I debug the apps I build on my phone or tablet. The iPhone 
doesn't, but it's a minority of the mobile market.


Intel got lazy without competition and high-end CPU 
architectures stagnated. All the cutting-edge computing is done 
on NVidia cards today. It requires hundreds of gigabytes of 
RAM, tens of terabytes of data and usage of specialized 
computing libraries. I very much doubt this will "trickle down" 
to mobile in foreseeable future. Heck, most developer laptops 
today have no CUDA capabilities to speak of.


I question the need for such "cutting-edge computing" in the 
first place, but regardless, it has already moved down to mobile 
and other edge devices:


https://arstechnica.com/gadgets/2017/10/the-pixel-2-contains-a-custom-google-soc-the-pixel-visual-core/
https://www.theverge.com/2018/7/26/17616140/google-edge-tpu-on-device-ai-machine-learning-devkit

Moreover, mobile devices are locked down by default and it's no 
trivial task to break out of those walled gardens. IIRC, Apple 
has an official policy of not allowing programming tools in 
their app store. Alan Kay had to personally petition Steve Jobs 
to allow Scratch to be distributed, so kids could learn 
programming. I believe the general policy is still in place.


They have their own app for that now:

https://www.apple.com/swift/playgrounds/

Android is better, but it's still a horror to do real work on, 
compared to any PC OS. Fine, you rooted it, installed some 
compilers and so on. How will you share your software with 
fellow Android users?


You seem to have missed all the posts I've made here before about 
native Android support for ldc: :) _I have never rooted any of my 
Android devices_. Compiling D code on most any Android device is 
as simple as installing an app from the official Play Store and 
typing a single command, `apt install ldc`:


https://wiki.dlang.org/Build_D_for_Android

The instructions there even show you how to package up an Android 
GUI app, an apk, on Android itself, by using some other packages 
available in that Android app.
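
As a rough sketch of that workflow (the package name comes from the wiki page above; everything else is illustrative and may have changed since), compiling and running a D program on-device inside the Termux app looks like this:

```shell
# Inside the Termux app on Android (no root required):
apt install ldc         # installs the LDC D compiler on-device

# A minimal D program to prove the toolchain works:
cat > hello.d <<'EOF'
import std.stdio;

void main()
{
    writeln("Hello from D on Android!");
}
EOF

ldc2 hello.d            # compile natively, right on the phone
./hello                 # prints: Hello from D on Android!
```

Packaging a full GUI apk adds the extra signing/zipping steps described on the wiki, but the edit-compile-run loop itself is just this.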


In essence, we are seeing the rapid widening of two digital 
divides. The first one is between users and average developers. 
The second one is between average developers and researchers at 
companies like Google. I very much doubt that we will see an 
equivalent of today's high-end machine learning server on 
user's desk, let alone in anyone's pocket, within 7 years.


I disagree on both counts. First off, people were running 
supercomputers and UNIX workstations while you were piddling 
along on your PC decades ago. That changed nothing about what you 
were able to learn and accomplish on your PC. In fact, you were 
probably much better off than they were, as the PC skills you 
picked up were likely in much more demand than their 
supercomputing 
