Re: shared - i need it to be useful

2018-10-21 Thread Joakim via Digitalmars-d

On Monday, 22 October 2018 at 00:22:19 UTC, Manu wrote:
On Sun, Oct 21, 2018 at 2:35 PM Walter Bright via Digitalmars-d 
 wrote:


On 10/21/2018 2:08 PM, Walter Bright wrote:
> On 10/21/2018 12:20 PM, Nicholas Wilson wrote:
>> Yes, but the problem you describe arises from implicit 
>> conversion in the other direction, which is not part of the 
>> proposal.

>
> It's Manu's example.

Then I don't know what the proposal is. Pieces of it appear to 
be scattered over numerous posts, mixed in with other text,


No no, they're repeated, not scattered, because I seem to have 
to keep repeating it over and over, because nobody is reading 
the text, or perhaps imagining there is a lot more text than 
there is.


I told you this is what happens with forum posts 4 days ago, yet 
you didn't listen:


https://forum.dlang.org/post/fokdcnzircoiuhrhz...@forum.dlang.org


opinions, and handwavy stuff.


You mean like every post in opposition which disregards the 
rules and baselessly asserts it's a terrible idea? :/



There's nothing to point to that is "the proposal".


You can go back to the OP; not a single detail has changed at 
any point, but I've repeated it a whole bunch of times 
(including in direct response to your last post) and the 
summary has become more concise, but not different.


1. Shared has no read or write access to data
2. Functions with shared arguments are threadsafe with respect to
   those arguments
   a. This is a commitment that must be true in _absolute terms_
      (there exists discussion about the ways that neighbours
      must not undermine this promise)
   b. There are numerous examples demonstrating how to configure
      this (TL;DR: use encapsulation, and @trusted at the bottom
      of the stack)


If you can find a legitimate example where those rules don't 
hold, I want to see it.
But every example so far has been based on a faulty premise 
where those 2 simple rules were not actually applied.
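The encapsulation pattern named in 2b can be sketched in code. This is my own minimal, hypothetical illustration (the names are mine, not from the proposal): the raw int is private, so a shared reference cannot read or write it directly (rule 1), and the only operations exposed through `shared` are threadsafe, built on atomics at the bottom of the stack (rule 2).

```d
import core.atomic;

// Hypothetical sketch of the two rules: no direct data access
// through `shared`, and every `shared` method is threadsafe.
struct ThreadsafeCounter
{
    private int count;

    void increment() shared
    {
        count.atomicOp!"+="(1); // atomic read-modify-write
    }

    int get() shared
    {
        return count.atomicLoad();
    }
}

void main()
{
    shared ThreadsafeCounter c;
    c.increment();
    c.increment();
    assert(c.get() == 2);
}
```

Any thread holding a `shared ThreadsafeCounter` can only go through `increment`/`get`, so the promise in 2a holds as long as no unshared alias mutates `count` concurrently.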


Put it all together in a 2-3 page proposal elsewhere, so he 
doesn't have to hunt everything out in a blizzard of forum posts.


I responded to your faulty program directly with the correct 
program, and you haven't acknowledged it. Did you see it?


I suggest you and Manu write up a proper proposal. Something 
that is complete, has nothing else in it, has a rationale, 
illuminating examples, and explains why alternatives are 
inferior.


I have written this program a couple of times, including in 
direct response to your last sample program.
You seem to have dismissed it... where is your response to that 
program, or my last entire post?


For examples of how to do it:

https://github.com/dlang/DIPs/tree/master/DIPs

Trying to rewrite the semantics of shared is not a simple 
task; doing multithreading correctly is a minefield of "OOPS! 
I didn't think of that!" and if anything cries out for a DIP, 
your and Manu's proposal does.


Yes, I agree it's DIP worthy. But given the almost nothing but 
overt hostility I've received here, why on earth would I waste 
my time writing a DIP?
I put months into my other DIP which sits gathering dust... if 
this thread inspired any confidence that it would be 
well-received I would make the effort, but the critical 
reception we've seen here is... a bit strange.
It's a 2-point proposal, the rules are **SO SIMPLE**, which is 
why I love it. How it can be misunderstood is something I'm 
having trouble understanding, and I don't know how to make it 
any clearer than I already have; numerous times over, including 
in my last reply to you, which you have ignored and dismissed 
it seems.

Please go back and read my response to your last program.


He did not say to write a full DIP, just a proposal, so he knows 
exactly what you mean, just as I said. It will require a DIP 
eventually, but he didn't ask you to write one now.


Re: Interesting Observation from JAXLondon

2018-10-20 Thread Joakim via Digitalmars-d
On Sunday, 21 October 2018 at 01:12:44 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 10/12/18 4:05 AM, Vijay Nayar wrote:
But the D community has also been very receptive of changes to 
the language




The community is. I don't feel like it's been true of the 
leadership for some years now (and I don't mean just W&A.)


One thing that does concern me, is the avenues in which people 
can discover D.  For me personally, after a particularly nasty 
C++ project, I just googled for "alternatives to C++" and 
that's how I found D back in 2009 or so.  But the same search 
today turns up nothing about D.  I'm not sure how people 
are supposed to find D.


This is a VERY important thing, and it's true for many of us 
(myself included). This is why it was a HUGE mistake when the 
community decided it should become taboo to promote D as a 
redesigned C++. That was ALWAYS D's core strength, we all know 
it, that's why many (if not most) of us are here, and hell, 
that's literally what D was *intentionally designed* to be.


But then political correctness came and threw that angle out 
the window, in favor of this awkward "fast code fast" nonsense, 
and we've been fighting the uphill "I don't understand the 
point of D" image battle ever since.


Simple, C++ is increasingly seen as irrelevant by those choosing 
a new language, so D's real competition is now Go, Rust, Swift, 
Nim, Zig, etc. These are people who want to write "fast code 
fast," well except for Rust users, who value ownership more.


Also, D can pitch itself to Java/C# users who need more 
performance with that softer pitch, because many of them have 
been burned by C++ and would recoil if you made the explicit C++ 
comparison. It is well-known that Rust and Go are attracting 
users from the Java and scripting communities; D needs to attract 
them too, as the Tilix dev noted to me last year:


"[M]y background is in Java. I found it quite interesting at 
DConf when I asked how many people came from a non C/C++ 
background that only one other fellow raised his hand...


I tend to get more annoyed about the negativity in the forums 
with regards to GC. I do feel that sometimes people get so 
wrapped up in what D needs for it to be a perfect systems 
language (i.e. no GC, memory safety, etc.), it gets overlooked 
that it is a very good language for building native applications 
as it is now. While D is often compared to Rust, in some ways the 
comparison to Go is more interesting to me. Both are GC-based 
languages and both started as systems languages, however Go 
pivoted and doubled down on the GC and has seen success. One of 
the Red Hat products I support, OpenShift, leverages Kubernetes 
(a Google project) for container orchestration and it’s written 
in Go.


I think D as a language is far superior to Go, and I wish we 
would toot our horn a little more in this regard instead of the 
constant negative discussion around systems programming."

https://dlang.org/blog/2017/08/11/on-tilix-and-d-an-interview-with-gerald-nunn/


Re: Need help with setting up LDC to cross-compile to Android/ARM

2018-10-20 Thread Joakim via Digitalmars-d

On Friday, 19 October 2018 at 22:19:31 UTC, H. S. Teoh wrote:
On Fri, Oct 19, 2018 at 02:41:48PM -0700, H. S. Teoh via 
Digitalmars-d wrote: [...]
In the meantime, is there a particular version of the NDK that 
I should use?  Currently I have 
android-ndk-r13b-linux-x86_64.zip installed.  Will it work?

[...]

Haha, I feel so silly now.  NDK r13b does not seem to have the 
sysroot subdir required by the clang build command, that's why 
it couldn't find the system headers.  So I ditched r13b and 
installed r17b instead, and now I can build the runtime 
successfully!


Ah, that makes sense: that NDK is ancient ;) It came out two 
years ago:


https://developer.android.com/ndk/downloads/revision_history

Official D support for Android was added to ldc 1.4 last 
September, which was after NDK r15c came out, when they switched 
to that sysroot directory with unified headers for all Android 
versions, so that's what ldc uses. Before that, each Android 
version had its headers in a separate directory, which isn't 
supported by LDC.



I tried ldc-build-runtime with --ninja and it
came back with a bunch of errors about "cortex-a8"
being an unsupported target, and then segfaulted.


That's likely because you left off the double-quotes around the 
list of semicolon-separated flags passed to ldc-build-runtime 
--dFlags: the double quotes are required, as shown in the docs.


On Saturday, 20 October 2018 at 04:01:37 UTC, H. S. Teoh wrote:
On Fri, Oct 19, 2018 at 08:50:36PM +, Joakim via 
Digitalmars-d wrote:
On Wednesday, 17 October 2018 at 21:23:21 UTC, H. S. Teoh 
wrote:

> I'm trying to follow the instructions on this page:
> 
> 	https://wiki.dlang.org/Build_D_for_Android

[...]

On a side note, the last section on that page mentions not 
knowing how to create a keystore from scratch; actually, it's 
very simple with the `keytool` utility that comes with the 
Oracle JRE.  I added the steps on the talk page.  The only 
thing I'm unsure about is whether keytool is available natively 
on Android.  If not, then you'll have to generate the keystore 
on a PC and copy it over to Android afterwards.


From scratch meaning without using keytool, ie OpenSSL or some 
other hashing/fingerprinting tool alone, because keytool isn't 
available in the Termux app. As mentioned at the end of the wiki 
page, I used to manually edit the apk hashed manifests using 
OpenSSL alone until that apksigner tool was added later.


Re: Need help with setting up LDC to cross-compile to Android/ARM

2018-10-19 Thread Joakim via Digitalmars-d

On Friday, 19 October 2018 at 20:50:36 UTC, Joakim wrote:

On Wednesday, 17 October 2018 at 21:23:21 UTC, H. S. Teoh wrote:

I'm trying to follow the instructions on this page:

https://wiki.dlang.org/Build_D_for_Android

[...]


Hmm, that's weird: can you extract the full compiler command 
for that file? For example, if you use Ninja, by appending 
--ninja to ldc-build-runtime, it will tell you the full command 
that failed. Not sure if Make has a way to get that too.


Also, if you're using a system-provided LDC, it may not support 
Android, if it wasn't built against our slightly tweaked llvm:


https://github.com/ldc-developers/llvm

In that case, use the LDC download from github instead:

https://github.com/ldc-developers/ldc/releases


Re: Need help with setting up LDC to cross-compile to Android/ARM

2018-10-19 Thread Joakim via Digitalmars-d

On Wednesday, 17 October 2018 at 21:23:21 UTC, H. S. Teoh wrote:

I'm trying to follow the instructions on this page:

https://wiki.dlang.org/Build_D_for_Android

[...]


Hmm, that's weird: can you extract the full compiler command for 
that file? For example, if you use Ninja, by appending --ninja to 
ldc-build-runtime, it will tell you the full command that failed. 
Not sure if Make has a way to get that too.


[OT] Android

2018-10-19 Thread Joakim via Digitalmars-d

On Thursday, 18 October 2018 at 19:37:24 UTC, H. S. Teoh wrote:
On Thu, Oct 18, 2018 at 07:09:42PM +, Patrick Schluter via 
Digitalmars-d wrote: [...]
I often have the impression that a lot of things are going 
slower than necessary because of a mentality where the perfect 
is in the way of the good.


That is indeed an all-too-frequent malady around these parts, 
sad to say. Which has the sad consequence that despite all 
efforts, there are still unfinished areas in D, and promises 
that haven't materialized in years (like multiple alias this).


Still, the parts of D that are working well form a very 
powerful and comfortable-to-use language.  Not quite the ideal 
we wish it to be, but IMO much closer than any other language 
I've seen yet.  Recently I began dabbling in Android 
programming, and the one thing that keeps sticking out to me is 
how painful writing Java is.  Almost every day of writing Java 
code has me wishing for this or that feature in D.  Slices. 
Closures.  Meta-programming.  I found most of my time spent 
fighting with language limitations rather than making progress 
with the problem domain.


Yes, this is why I began the Android port: I couldn't imagine 
writing Java.


Eventually I resorted to generating Java code from D for some 
of the most painful repetitive parts, and the way things are 
looking, I'm likely to be doing a lot more of that.  I fear the 
way things are going will have me essentially writing a D to 
Java compiler at some point!


Why not just use the Android port of D?


Re: shared - i need it to be useful

2018-10-17 Thread Joakim via Digitalmars-d

On Wednesday, 17 October 2018 at 23:12:48 UTC, Manu wrote:
On Wed, Oct 17, 2018 at 2:15 PM Stanislav Blinov via 
Digitalmars-d  wrote:


On Wednesday, 17 October 2018 at 19:25:33 UTC, Manu wrote:
> On Wed, Oct 17, 2018 at 12:05 PM Stanislav Blinov via 
> Digitalmars-d  wrote:

>>
>> On Wednesday, 17 October 2018 at 18:46:18 UTC, Manu wrote:
>>
>> > I've said this a bunch of times, there are 2 rules:
>> > 1. shared inhibits read and write access to members
>> > 2. `shared` methods must be threadsafe
>> >
>> > From there, shared becomes interesting and useful.
>>
>> Oh God...
>>
>> void atomicInc(shared int* i) { /* ... */ }
>>
>> Now what? There are no "methods" for ints, only UFCS. Those 
>> functions can be as safe as you like, but if you allow 
>> implicit promotion of int* to shared int*, you *allow 
>> implicit races*.

>
> This function is effectively an intrinsic. It's unsafe by 
> definition.


Only if implicit conversion is allowed. If it isn't, that's 
likely @trusted, and this:


void atomicInc(ref shared int);

can even be @safe.


In this case, with respect to the context (a single int), 
atomicInc() is ALWAYS safe, even with implicit conversion. You 
can atomicInc() a thread-local int perfectly safely.
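As a sketch (my code and attributes, not anything specified in the thread), the function under discussion could be a thin wrapper over the atomic primitive, with @trusted confined to that one spot:

```d
import core.atomic;

// Hypothetical: incrementing a lone int atomically is always safe
// with respect to that int itself, so, absent unsafe implicit
// conversions, the wrapper can be marked @trusted.
void atomicInc(ref shared int i) @trusted
{
    i.atomicOp!"+="(1);
}

void main()
{
    shared int x;
    atomicInc(x);              // fine on a shared int
    assert(x.atomicLoad() == 1);
}
```

Whether this deserves @trusted or @system is exactly the disagreement in the thread; the code only shows what the signature being argued about looks like.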


> It's a tool for implementing threadsafe machinery.
> No user can just start doing atomic operations on random ints
> and say
> "it's threadsafe", you must encapsulate the threadsafe
> functionality
> into some sort of object that aggregates all concerns and
> presents an
> intellectually sound api.

Threadsafety starts and ends with the programmer. By your 
logic *all* functions operating on `shared` are unsafe then. 
As far as compiler is concerned, there would be no difference 
between these two:


struct S {}
void atomicInc(ref shared S);

and

struct S { void atomicInc() shared { /* ... */ } }

The signatures of those two functions are exactly the same. 
How is that different from a function taking a shared int 
pointer or reference?


It's not, atomicInc() of an int is always safe with respect to 
the int itself.
You can call atomicInc() on an unshared int and it's perfectly 
fine, but now you need to consider context, and that's a 
problem for the design of the higher-level scope.

To maintain thread-safety, the int in question must be 
appropriately contained.


The problem is the same as the example I presented before, 
which I'll repeat:


struct InvalidProgram
{
  int x;
  void fun() { ++x; }
  void gun() shared { atomicInc(&x); }
}

The method gun() (and therefore the whole object) is NOT 
threadsafe by my definition, because fun() violates the 
threadsafety of gun().
The situation applies equally here:

int x;
atomicInc(&x);
++x; // <- by my definition, this 'API' (increment an int)
     // violates the threadsafety of atomicInc(), and
     // atomicInc() is therefore not threadsafe.

`int` doesn't present a threadsafe API, so int is, by 
definition, NOT threadsafe. atomicInc() should be @system, and 
not @trusted.


If you intend to share an int, use Atomic!int, because it has a 
threadsafe API.
atomicInc(shared int*) is effectively just an unsafe intrinsic, 
and its only use is at ground-level implementation of 
threadsafe machinery, like malloc() and free().
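Atomic!int is not an existing druntime type; a minimal, hypothetical version of the kind of type Manu describes might look like this, with the raw value encapsulated so that the only operations reachable through a shared reference are atomic ones:

```d
import core.atomic;

// Hypothetical Atomic!T: presents only a threadsafe API through
// `shared`, unlike a bare int.
struct Atomic(T)
{
    private T value;

    void opUnary(string op : "++")() shared
    {
        value.atomicOp!"+="(cast(T) 1); // atomic increment
    }

    T load() shared
    {
        return value.atomicLoad();
    }
}

void main()
{
    shared Atomic!int x;
    x++;
    x++;
    assert(x.load() == 2);
}
```

The design point is that `++` on a bare int races, while `++` on this type cannot, because the bare int is never exposed.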


> Let me try one:
>
> void free(void*) { ... }
>
> Now what? I might have dangling pointers... it's a 
> catastrophe!


One could argue that it should be
void free(ref void* p) { /* ... */ p = null; }


void *p2 = p;
free(p);
p2.crash();

As a matter of fact, in my own allocators memory blocks 
allocated by them are passed by value and are non-copyable, 
they're not just void[] as in std.experimental.allocator. One 
must 'move' them to pass ownership, and that includes 
deallocation. But that's another story altogether.
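A minimal sketch of such a non-copyable block (hypothetical names, not the actual allocator code being described): copying is disabled, so ownership, including the right to deallocate, can only be transferred by an explicit move.

```d
import core.stdc.stdlib : free, malloc;
import std.algorithm.mutation : move;

// Hypothetical non-copyable memory block: ownership moves, never copies.
struct Block
{
    void* ptr;

    @disable this(this); // no implicit copies

    static Block allocate(size_t n)
    {
        return Block(malloc(n));
    }

    ~this() { free(ptr); } // owner frees; free(null) is a no-op
}

void main()
{
    auto b = Block.allocate(64);
    auto c = move(b); // ownership transferred; b is reset to Block.init
    assert(b.ptr is null);
    assert(c.ptr !is null);
} // c's destructor frees the allocation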


Right, now you're talking about move semantics to implement 
transfer of ownership... you might recall I was arguing this 
exact case to express transferring ownership of objects between 
threads earlier. This talk of blunt casts and "making sure 
everything is good" is all just great, but it doesn't mean 
anything interesting with respect to `shared`. It should be 
interesting even without unsafe casts.



> It's essentially the same argument.
> This isn't a function that professes to do something that
> people might
> misunderstand and try to use in an unsafe way, it's a 
> low-level

> implementation device, which is used to build larger *useful*
> constructs.

You're missing the point, again. You have an int. You pass a 
pointer to it to some API that takes an int*. You continue to 
use your int as just an int.


You have written an invalid program. I can think of an infinite 
number of ways to write an invalid program.
In this case, don't have an `int`, instead, have an Atomic!int; 
you now guarantee appropriate access, problem solved!
If you do have an int, don't pass it to other threads at random 
when you don't have any idea what they intend to do with it! 
That's basic common sense. You don't 

Re: Interesting Observation from JAXLondon

2018-10-12 Thread Joakim via Digitalmars-d

On Friday, 12 October 2018 at 07:13:33 UTC, Russel Winder wrote:
On Thu, 2018-10-11 at 13:00 +, bachmeier via Digitalmars-d 
wrote: […]

Suggestions?

My guess is that the reason they've heard of those languages 
is because their developers were writing small projects using 
Go and Rust, but not D.


I fear it may already be too late. Go, and now Rust, got 
marketing hype from an organisation putting considerable 
resources into the projects. This turned into effort from the 
community that increased rapidly, turning the hype into 
frameworks and libraries, and word of mouth marketing. It is 
the libraries and frameworks that make for traction. Now the 
hype is gone, Go and Rust, and their libraries and frameworks, 
are well positioned and with significant penetration into the 
minds of developers.


Talk to Java developers and they have heard of Go and Rust, but 
not D. Go is more likely known to them because of Docker and 
the context of The Web, for which Go has a strong pitch. They 
have heard of Rust but usually see it as not relevant to them, 
despite Firefox.

Talk to Python developers and they know of Go, many of them of 
Rust, but almost never D. C and C++ are seen as the languages 
of performance extensions, though Rust increasingly has a play 
there.

D has vibe.d, PyD, GtkD, and lots of other bits, but they've 
never quite had the resources of the equivalents in Go and Rust.


Also the D community as a whole is effectively introvert, 
whereas Go and Rust communities have been quite extrovert. 
"Build it and they will come" just doesn't work, you have to be 
pushy and market stuff, often using guerilla marketing, to get 
mindshare.


D has an excellent position against Python (for speed of 
development but without the performance hit) but no chance of 
penetrating the places where Python is strong due to lack of 
libraries and frameworks that people use – cf. Pandas, 
SciKit.Learn, etc.


D has an excellent position against Go as a language except 
that Go has goroutines and channels. The single threaded event 
loop and callback approach is losing favour. Kotlin is 
introducing Kotlin Coroutines which is a step on from the 
observables system of Rx. Structured concurrency abstracting 
away from fibres and threadpools. Java may well get this via 
Project Loom which is Quasar being inserted into the JVM 
directly. Whatever D has it doesn't seem to be going to compete 
in this space.


D without the GC has a sort of position against Rust, but I 
think that battle has been lost. Rust has won in the "the new C 
that isn't Go and doesn't have a garbage collector, and isn't 
C++, but does have all the nice monads stuff, oh and memory 
safety mostly".


When it comes down to it D will carry on as a niche language 
loved by a few unknown to most.


There is truth in much of what you say, but D has to pick its 
battles. Given the design of the language, I see two primary 
use-cases right now:


1. apps that need some level of performance, ie Tilix
2. Low-level tools that need a lot of performance, ie Weka or 
Sociomantic


Going after some established tool like Pandas and its mix of 
Python and C is likely to fail right now, as D is never going to 
be as easy as Python, and presumably Pandas has already sped up 
whatever it needs to in C. Maybe you could create a better tool 
in D some day when the D ecosystem is larger, but I don't think 
it would be the best approach today.


We need to think about what markets D would be best suited for 
and aim for those, while at the same time resisting the 
temptation to make D too specialized for those initial markets, 
which is a trap many other languages fall into.


Re: Interesting Observation from JAXLondon

2018-10-11 Thread Joakim via Digitalmars-d

On Thursday, 11 October 2018 at 12:22:19 UTC, Vijay Nayar wrote:

On Thursday, 11 October 2018 at 11:50:39 UTC, Joakim wrote:
On Thursday, 11 October 2018 at 07:58:39 UTC, Russel Winder 
wrote:

This was supposed to come to this list not the learn list.

On Thu, 2018-10-11 at 07:57 +0100, Russel Winder wrote:
It seems that in the modern world of Cloud and Kubernetes, and 
the charging model of the Cloud vendors, the startup times of 
JVMs are becoming a financial problem. A number of high profile 
companies are switching from Java to Go to solve this financial 
difficulty.

It's a pity D is not in there with a pitch.

I suspect it is because the companies have heard of Go (and 
Rust), but not D.


I doubt D could make a pitch that would be heard, no google 
behind it and all that jazz. D is better aimed at startups 
like Weka who're trying to disrupt the status quo than Java 
shops trying to sustain it, while shaving off some up-front 
time.


Personally I think this is going to change soon depending on 
what options are available.  The amount of time and money that 
companies, especially companies using Java and AWS, are putting 
into saving money with Nomad or Kubernetes on the promise of 
having more services per server is quite high.  However, these 
JVM based services run in maybe 1-2GB of RAM at the minimum, so 
they get maybe 4 services per box.


A microservice built using D and vibe.d could easily perform 
the same work using less CPU and maybe only 500MB of RAM.  The 
scale of improvement is roughly the same as what you would get 
by moving to containerization.


If D has the proper libraries and integrations available with 
the tools that are commonly used, it could easily break through 
and become the serious language to use for the competitive 
business of the future.


But libraries and integrations will make or break that.  It's 
not just Java you're up against, it's all the libraries like 
SpringBoot and all the integrations with AWS systems like SQS, 
SNS, Kinesis, MySQL, Postgres, Redis, etc.


My hope is that D will be part of that future and I'm trying to 
add libraries as time permits.


I'm skeptical of that cloud microservices wave building up right 
now. I suspect what's coming is a decentralized mobile wave, just 
as the PC once replaced big iron like mainframes and 
minicomputers, since top mobile CPUs now rival desktop CPUs:


"the [Apple] A12 outperforms a moderately-clocked Skylake CPU in 
single-threaded performance"

https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-unveiling-the-silicon-secrets/4

Many of the crypto-coins are trying to jumpstart a decentralized 
app ecosystem: someone will succeed.


Re: Interesting Observation from JAXLondon

2018-10-11 Thread Joakim via Digitalmars-d

On Thursday, 11 October 2018 at 07:58:39 UTC, Russel Winder wrote:

This was supposed to come to this list not the learn list.

On Thu, 2018-10-11 at 07:57 +0100, Russel Winder wrote:
It seems that in the modern world of Cloud and Kubernetes, and 
the charging model of the Cloud vendors, the startup times of 
JVMs are becoming a financial problem. A number of high profile 
companies are switching from Java to Go to solve this financial 
difficulty.

It's a pity D is not in there with a pitch.

I suspect it is because the companies have heard of Go (and 
Rust), but not D.


I doubt D could make a pitch that would be heard, no google 
behind it and all that jazz. D is better aimed at startups like 
Weka who're trying to disrupt the status quo than Java shops 
trying to sustain it, while shaving off some up-front time.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-04 Thread Joakim via Digitalmars-d

On Thursday, 4 October 2018 at 10:02:28 UTC, Russel Winder wrote:
On Thu, 2018-10-04 at 08:06 +, Joakim via Digitalmars-d 
wrote:

[…]

The link in my OP links to a guy who maintained a spreadsheet 
of Apple-related conferences as evidence. He lists several 
that went away and says nothing replaced them. If you don't 
even examine the evidence provided, I'm not sure why we should 
care about your opinions.


So Apple conferences are a dead end.


Remember though that this is the top developer ecosystem on the 
planet right now, as iOS apps bring in more revenue than Android 
still.


Python, C++, Go, Rust, all these languages have thriving 
conferences. You just have to look at the world-wide increase 
in the number of such conferences for the data required.


But then my opinion, and indeed my data, doesn't seem to matter 
to you, so we might as well just stop communicating since you 
are never going to change your mind about this issue, even 
though you are actually wrong.


I've presented evidence in a handy link, you give none.

I could be wrong about anything, including that the Earth is 
round and I'm not in the Matrix. But to convince me that I am, 
I'll need evidence, same as I've presented to you.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-04 Thread Joakim via Digitalmars-d

On Thursday, 4 October 2018 at 08:54:29 UTC, Iain Buclaw wrote:

On Thursday, 4 October 2018 at 08:06:24 UTC, Joakim wrote:

On Thursday, 4 October 2018 at 07:53:54 UTC, Iain Buclaw wrote:

[...]


Did anybody pay attention to the live talks either? ;) That's 
the real comparison.


Anyway, the reason I'm giving to prerecord talks is so you can 
watch them on your own time before the conference. Watching 
prerecorded talks with everybody else at a conference is 
layering stupid on top of stupid. :D


Sure, but you really think it's an appropriate use of my free 
time spending 22 hours (which may as well be half a month) 
watching prerecorded talks instead of contributing?


That's a strange question: do you prefer being forced to sit 
through all 22 hours live at DConf? At least with pre-recorded 
talks, you have a choice of which ones to watch.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-04 Thread Joakim via Digitalmars-d

On Thursday, 4 October 2018 at 07:12:03 UTC, Russel Winder wrote:
On Wed, 2018-10-03 at 18:46 +, Joakim via Digitalmars-d 
wrote:

[…]

I don't doubt that some are like you and prefer viewing live, 
but given how conferences keep dying off and online tech talks 
are booming, you're in an extreme minority that prefers that 
high-cost live version. That means the market inevitably stops 
catering to you, which is why the talk-driven conference 
format is dying off.


And new conferences keep being started and being successful. 
And many just keep on going, often getting more and more 
successful.


Your personal view of conferences cannot be stated as global 
truth, since it patently is not fact, and the evidence 
indicates it is not true; it is just your opinion.


The link in my OP links to a guy who maintained a spreadsheet of 
Apple-related conferences as evidence. He lists several that went 
away and says nothing replaced them. If you don't even examine 
the evidence provided, I'm not sure why we should care about your 
opinions.


On Thursday, 4 October 2018 at 07:53:54 UTC, Iain Buclaw wrote:

On Wednesday, 3 October 2018 at 16:17:48 UTC, Joakim wrote:
On Wednesday, 3 October 2018 at 01:28:37 UTC, Adam Wilson 
wrote:

On 10/2/18 4:34 AM, Joakim wrote:
On Tuesday, 2 October 2018 at 09:39:14 UTC, Adam Wilson 
wrote:

[...]


It is not clear what you disagree with, since almost nothing 
you say has any bearing on my original post. To summarize, I 
suggest changing the currently talk-driven DConf format to 
either


1. a more decentralized collection of meetups all over the 
world, where most of the talks are pre-recorded, and the 
focus is more on introducing new users to the language or


2. at least ditching most of the talks at a DConf still held 
at a central location, maybe keeping only a couple panel 
discussions that benefit from an audience to ask questions, 
and spending most of the time like the hackathon at the last 
DConf, ie actually meeting in person.




This point has a subtle flaw. Many of the talks raise points 
of discussion that would otherwise go without discussion, and 
potentially unnoticed, if it were not for the person bringing 
it up. The talks routinely serve as a launchpad for the 
nightly dinner sessions. Benjamin Thaut's 2016 talk about 
shared libraries is one such example. Indeed every single 
year has brought at least one (but usually more) talk that 
opened up some new line of investigation for the dinner 
discussions.


I thought it was pretty obvious from my original post, since I 
volunteered to help with the pre-recorded talks, but the idea 
is to have pre-recorded talks no matter whether DConf is held 
in a central location or not.




I went to a conference once where they had mixed live talks and 
prerecorded talks - questions were taken at the end to the 
speaker of the prerecorded talk via a SIP call.


The organisers at the end admitted that the prerecorded talks 
experiment failed. No one really paid attention to any of the 
content in it.


Did anybody pay attention to the live talks either? ;) That's the 
real comparison.


Anyway, the reason I'm giving to prerecord talks is so you can 
watch them on your own time before the conference. Watching 
prerecorded talks with everybody else at a conference is layering 
stupid on top of stupid. :D


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 17:51:00 UTC, Russel Winder wrote:
On Wed, 2018-10-03 at 17:26 +, Joakim via Digitalmars-d 
wrote: […]
At least look at the first two bullet points in my post 
responding to Adam, because you're missing the entire point of 
my suggestions, which is that certain things like talks are 
better suited to online whereas conferences are more suited 
for in-person interaction.


In your opinion. In my opinion, online material is a waste of 
time, I never watch YouTube videos, for me it is a waste of my 
time. But that is the point, different people have a different 
view. This doesn't mean I am right or wrong, it means different 
people have different ways of dealing with material.


I like a live presentation that I can then ignore *or* take up 
with a gusto with the presenter, or other people, after the 
session. Conferences allow this. Presentations are an 
introduction to interaction with others. For me. Others prefer 
to consume videos and have no interactions about the material. 
Personal differences.


Except that you can also view the videos at home, then discuss 
them later at a conference, which is the actual suggestion here.


Since there is a population of people who like online stuff, 
then online stuff there must be. As there are people who like a 
live presentation and post session discussion, this must also 
happen. The two are not in conflict.


They are in conflict because the cost of doing it live is much, 
much higher. DConf organizers' goal should be to enable the 
widest reach at the lowest cost, not catering to off-the-wall 
requests from a select few like yourself.


I don't doubt that some are like you and prefer viewing live, but 
given how conferences keep dying off and online tech talks are 
booming, you're in an extreme minority that prefers that 
high-cost live version. That means the market inevitably stops 
catering to you, which is why the talk-driven conference format 
is dying off.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 17:13:51 UTC, Dejan Lekic wrote:

On Wednesday, 3 October 2018 at 16:21:45 UTC, Joakim wrote:
Like most of the responses in this thread, I have no idea why 
you're stumping for in-person interaction, when all my 
suggestions were geared around having _more in-person 
interaction_.


If you're still not sure what I mean, read this long post I 
wrote fisking Adam's similar post:


https://forum.dlang.org/post/eoygemytghynpogvl...@forum.dlang.org


Perhaps you did not get my point?


No, I got it, you didn't get mine.

- I have nothing against core D team having web-conferences as 
much as they please. It is up to them (and they may already 
have them?) how they want to communicate.


What I argued about was that, just because some antisocial geek 
argues that conferences are "dead" because we have 
web-conferencing and similar means of communication does not 
mean we all share that opinion... Everyone can record a "talk" 
with slides and put it on some video streaming site like Vimeo 
or YouTube, but I personally see that as ANOTHER way to reach 
the community, certainly NOT an alternative to a well-organised 
conference!


Do not get me wrong, I have nothing against the proposal - I 
think the D community can have both a good annual conference AND 
web-conferencing between core D devs, plus people who would 
record talks in their rooms or offices and make them public...


While my OP did mention some of those things, it only did so as a 
way to have _more in-person interaction_ at the two DConf 
alternative formats I suggested, neither of which was primarily 
about any of the stuff you mention.


At least look at the first two bullet points in my post 
responding to Adam, because you're missing the entire point of my 
suggestions, which is that certain things like talks are better 
suited to online whereas conferences are more suited for 
in-person interaction.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 11:48:06 UTC, Dejan Lekic wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:
I'm sure some thought and planning is now going into the next 
DConf, so I'd like to make sure people are aware that the 
conference format that DConf uses is dying off, as explained 
here:


https://marco.org/2018/01/17/end-of-conference-era


It is a matter of personal preference, and a view of a 
modern-day geek, in my humble opinion... I _highly disagree_. 
People go to conferences for different reasons. You know, even 
though we "computer people" tend to be branded as antisocial, 
there are still many of us who prefer to see someone in person, 
talk to him/her, meet new people, speak to them too, build the 
network, exchange phone numbers, etc...


As usual with conferences, not all people are happy - you will 
ALWAYS have people who prefer the more technical stuff, and 
people who prefer the business side, i.e. people who try to 
promote their products and services. Conferences are brilliant 
places for them.


Another group of people interested in conferences and meetups 
are recruiters. My company found a few new colleagues this way...


Yet another group are people who also want to see the town 
where the conference is held - it is a form of tourism if you 
like.


Yes, you can have all that interaction with some 
internet-conferencing software, but not at the level when 
people interact with each other directly!


Like most of the responses in this thread, I have no idea why 
you're stumping for in-person interaction, when all my 
suggestions were geared around having _more in-person 
interaction_.


If you're still not sure what I mean, read this long post I wrote 
fisking Adam's similar post:


https://forum.dlang.org/post/eoygemytghynpogvl...@forum.dlang.org


Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Wednesday, 3 October 2018 at 01:28:37 UTC, Adam Wilson wrote:

On 10/2/18 4:34 AM, Joakim wrote:

On Tuesday, 2 October 2018 at 09:39:14 UTC, Adam Wilson wrote:

On 10/1/18 11:26 PM, Joakim wrote:

[snip]


I disagree.


It is not clear what you disagree with, since almost nothing 
you say has any bearing on my original post. To summarize, I 
suggest changing the currently talk-driven DConf format to 
either


1. a more decentralized collection of meetups all over the 
world, where most of the talks are pre-recorded, and the focus 
is more on introducing new users to the language or


2. at least ditching most of the talks at a DConf still held 
at a central location, maybe keeping only a couple panel 
discussions that benefit from an audience to ask questions, 
and spending most of the time like the hackathon at the last 
DConf, ie actually meeting in person.




This point has a subtle flaw. Many of the talks raise points of 
discussion that would otherwise go without discussion, and 
potentially unnoticed, if it were not for the person bringing 
it up. The talks routinely serve as a launchpad for the nightly 
dinner sessions. Benjamin Thaut's 2016 talk about shared 
libraries is one such example. Indeed every single year has 
brought at least one (but usually more) talk that opened up 
some new line of investigation for the dinner discussions.


I thought it was pretty obvious from my original post, since I 
volunteered to help with the pre-recorded talks, but the idea is 
to have pre-recorded talks no matter whether DConf is held in a 
central location or not.


Since both of these alternatives I suggest are much more about 
in-person interaction, which is what you defend, and the only 
big change I propose is ditching the passive in-person talks, 
which you do not write a single word in your long post 
defending, I'm scratching my head about what you got out of my 
original post.


There is much more to the conference than just a 4-day meetup 
with talks. The idea that it's just the core 8-15 people with 
a bunch of hangers-on is patently false. It's not about the 
conversations I have with the "core" people. It's 
Schveighoffer, or Atila, or Jonathan, or any of a long list 
of people who are interested enough in coming. Remember these 
people self-selected to invest non-trivial treasure to be 
there; they are ALL worthy of conversing with.


Since both my mooted alternatives give _much more_ opportunity 
for such interaction, I'm again scratching my head at your 
reaction.




This is untrue. See responses further down.


It is true. You merely prefer certain interaction for yourself to 
the overall interaction of the community.


Is it a "mini-vacation"? Yea, sure, for my wife. For her it's 
a four day shopping spree in Europe. For me it's four days of 
wall-to-wall action that leaves me drop-dead exhausted at the 
end of the day.


So it's the talks that provide this or the in-person 
interaction? If the latter, why are you arguing against my 
pushing for more of it and ditching the in-person talks?




It's everything. The talks, the coding, the talking, the 
drinking. All of it has some social component I find valuable.


Please try to stay on the subject. Nobody's talking about getting 
rid of coding/talking/drinking, in fact, the idea is to have 
_more_ time for those, by ditching the in-person talks.


So the relevant info here would be what you find "social" about 
passively watching a talk in person with 100 other people in the 
same room, which as usual, you don't provide.


Every time I see somebody predicting the end of "X" I roll my 
eyes. I have a vivid memory of the rise of Skype and 
videoconferencing in the early 2000's giving way to 
breathless media reports about how said tools would kill the 
airlines because people could just meet online for a trivial 
fraction of the price.


People make stupid predictions all the time. Ignoring all such 
"end of" predictions because many predict badly would be like 
ignoring all new programming languages because 99% are bad. 
That means you'd never look at D.


And yes, some came true: almost nobody programs minicomputers 
or buys standalone mp3 players like the iPod anymore, compared 
to how many used to at their peak.




Sure, but the predictions about videoconferencing have yet to 
come true, as the data itself shows. The travel industry is 
setting new records yearly in spite of videoconferencing. 
That's not conjecture or opinion, go look for yourself. As I 
have previously suggested, the stock prices and order-books of 
Airbus and Boeing are at record highs. Airplanes are more 
packed than ever (a measure called load factor). For example, 
Delta's system-wide load factor was 85.6% last year, which 
means that 85.6% of all available seats for the entire year 
were occupied. 
(Source: 
https://www.statista.com/statist

Re: Please don't do a DConf 2018, consider alternatives

2018-10-03 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 16:10:20 UTC, Johannes Loher wrote:

On Tuesday, 2 October 2018 at 15:42:20 UTC, Joakim wrote:
On Tuesday, 2 October 2018 at 15:03:45 UTC, Adam D. Ruppe 
wrote:
That is what Joakim is talking about - changing the main 
event to be more like the after-hours stuff everyone loves so 
much, to actually use all the time to maximize the potential 
of in-person time.


I'm talking about growing two different qualities much more, 
with my two suggested alternatives to the DConf format.


1. Ditch the talks, focus on in-person interaction. That's why 
I suggest having almost no talks, whether at a central DConf 
or not. You clearly agree with this.


2. Decentralize the DConf location, casting a much wider net 
over many more cities. Walter and Adam could rent a room and 
set up a Seattle DConf location, Andrei and Steven in Boston, 
Ali and Shammah in the bay area, and so on (only illustrative, 
I'm not imposing this on any of these people). Some of the 
money that went to renting out a large conference room in 
Munich can instead be spent on these much smaller rooms in 
each city.


Charge some minimal fee for entrance in some locations, if 
that means they can spend time with W&A and to cover costs. I 
wouldn't charge anything more than $2 in my city for my event, 
as event organizers here have found that that's low enough to 
keep anyone who's really interested while discouraging fake 
RSVPs, ie those who have no intent of ever showing up but 
strangely sign up anyway (I know an organizer who says he had 
150 people RSVP for a Meetup here and only 15 showed up).


By keeping travel and ticket costs much lower, you invite much 
more participation.


Obviously my second alternative to DConf listed above wouldn't 
be decentralized at all, only enabling in-person interaction 
at a still-central DConf.


Mix and match as you see fit.


I totally agree with you on your first point, i.e. making DConf 
more interactive. I have had very good experiences with formats 
like open space or barcamp. However, these formats only work if 
people are actually willing to participate and bring in their 
own ideas. Not having anything prepared can in rare cases lead 
to the situation where there is a lack of things to talk about 
(I doubt this would be the case for the D community, but it is 
something to keep in mind).


As long as you plan ahead and compile an online list of stuff to 
work on or discuss in the weeks preceding, I don't see this being 
a problem.


However, I must say I disagree with your second point, i.e. 
decentralising DConf. As many people here have already 
mentioned, DConf is about talking to people. And to me it is 
especially important to talk to lots of different people whom I 
otherwise don’t get the chance to talk to in person. By 
decentralising the conference, we would limit the number of 
different people you can get in touch with directly by a huge 
amount.


I doubt that, it would just be different people you're talking 
to. There are probably three types of current and potential D 
users worth talking about. There's the core team, power users, 
and everybody else, ie casual or prospective users.


A central DConf caters to the first two, almost nobody from the 
largest camp, ie casual/prospective users, is flying out or 
paying $400 to attend. A decentralized DConf tries to get much 
more casual/prospective users and power users who couldn't 
justify traveling so far before, but it has two potential costs:


1. The core team may be spread out and not mostly gathered in one 
spot anymore. That is why I have suggested having them meet 
separately from DConf or at one of the DConf locations earlier in 
this thread.


2. A power user who might have paid to travel to Berlin before 
doesn't get access to the entire core team at once, someone like 
you I'm guessing. I think there's some value there, but I suspect 
it's much less than the value gained from a decentralized DConf.


Just to use myself as an example, at the last DConf I was able to talk 
to Andrei, Walter, Mike, Ali, Jonathan, Kai and lots of others 
and exchange ideas with them. This would not have been possible 
with a decentralised event (except for the off chance that all 
those people by chance attend the same local „meetup“).


Yes, but what did the D ecosystem concretely get out of it? Was 
it worth the hundreds of people who might have met them at 
decentralized DConf locations in Boston/SV/Seattle/Berlin not 
meeting them last year?


That's the kind of tough-minded calculation that needs to be made.

On the other hand, I have to admit that decentralising the 
event would open it up for a much bigger audience, which 
definitely is a good idea. However, I would much prefer to have 
something like a main DConf and if there are enough interested 
people in an area who will not go to the main event, they can 
host their

Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 15:03:45 UTC, Adam D. Ruppe wrote:
That is what Joakim is talking about - changing the main event 
to be more like the after-hours stuff everyone loves so much, 
to actually use all the time to maximize the potential of 
in-person time.


I'm talking about growing two different qualities much more, with 
my two suggested alternatives to the DConf format.


1. Ditch the talks, focus on in-person interaction. That's why I 
suggest having almost no talks, whether at a central DConf or 
not. You clearly agree with this.


2. Decentralize the DConf location, casting a much wider net over 
many more cities. Walter and Adam could rent a room and set up a 
Seattle DConf location, Andrei and Steven in Boston, Ali and 
Shammah in the bay area, and so on (only illustrative, I'm not 
imposing this on any of these people). Some of the money that 
went to renting out a large conference room in Munich can instead 
be spent on these much smaller rooms in each city.


Charge some minimal fee for entrance in some locations, if that 
means they can spend time with W&A and to cover costs. I wouldn't 
charge anything more than $2 in my city for my event, as event 
organizers here have found that that's low enough to keep anyone 
who's really interested while discouraging fake RSVPs, ie those 
who have no intent of ever showing up but strangely sign up 
anyway (I know an organizer who says he had 150 people RSVP for a 
Meetup here and only 15 showed up).


By keeping travel and ticket costs much lower, you invite much 
more participation.


Obviously my second alternative to DConf listed above wouldn't be 
decentralized at all, only enabling in-person interaction at a 
still-central DConf.


Mix and match as you see fit.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 14:49:31 UTC, bachmeier wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:

"Once the videos are all up, set up weekend meetups in several 
cities [all over the world], where a few livestreamed talks 
may take place if some speakers don't want to spend more time 
producing a pre-recorded talk, but most time is spent like the 
hackathon, discussing various existing issues from bugzilla in 
smaller groups or brainstorming ideas, designs, and libraries 
for the future."


I can setup an event like this in my city, where AFAIK nobody 
uses D, so most of it would be geared towards introducing them 
to the language.


I estimate that you could do ten times better at raising 
awareness and uptake with this approach than the current DConf 
format, by casting a much wider net, and it would cost about 
10X less, ie you get two orders of magnitude better bang for 
the buck.


I think this is something that could be done *in addition to* 
DConf.


It depends what you mean by that. If DConf keeps running as it 
has, as you suggest below, but you simply add some satellite 
meetups around it in other cities watching the livestreamed talks 
from the main DConf, then you have addressed some of these 
concerns, but not very much.


If you go the decentralized approach I suggested, but maybe pick 
one of those locations as the one the core team goes to and don't 
do almost any in-person talks anywhere, that would address much 
more.


I honestly don't think DConf is very effective at promoting D, 
except perhaps to a small sliver of the overall population of 
programmers, due to the content of most of the presentations.


I agree. I'll go farther and say that it's a small sliver of 
existing D programmers too who get much value out of it.


{This is not intended to be a criticism or a statement that 
anything about DConf should be changed.}


Heh, of course it's a criticism and of course it should be 
changed. :)


I believe it would be a mistake to drop DConf. If we did that, 
the story that would be told is "D couldn't even support its 
own conference. Use Rust or Go or Julia instead." Our view 
would be "we're on the cutting edge" but everyone else's view 
would be "the language is dying".


Great. Everybody thought Apple was nuts when they released a $500 
iPhone in 2007, now Ballmer wishes he'd come up with the idea:


https://www.macrumors.com/2016/11/07/former-microsoft-ceo-steve-ballmer-wrong-iphone/

As long as you communicate that you're replacing one DConf 
location with several and why you're doing it, I don't see why we 
should care how they end up interpreting it. Our goal is to get 
users and adoption, not to look good to other 
programming-language developers.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 10:37:44 UTC, Nicholas Wilson wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:

[...]


As I'm sure has been said before, if it were just the talks it 
probably wouldn't be worth it. But conferences are sooo 
much more than just the talks. It's the conversations over 
breakfast/lunch/dinner, between talks and long into the night 
(sometimes too long). It's the networking, the hacking, the face 
to face. The talks are usually pretty good too.


The conference is definitely not dead, I'm going to one in San 
José in 2 weeks, sure the talks look really interesting but the 
main reason is to talk to other people to get stuff done.


Then I'm not sure why you're saying any of this to me, as almost 
nothing you write contradicts anything I wrote.


If you're still not sure what I mean, read this long post I just 
wrote fisking Adam's similar post:


https://forum.dlang.org/post/eoygemytghynpogvl...@forum.dlang.org


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 09:39:14 UTC, Adam Wilson wrote:

On 10/1/18 11:26 PM, Joakim wrote:

[snip]


I disagree.


It is not clear what you disagree with, since almost nothing you 
say has any bearing on my original post. To summarize, I suggest 
changing the currently talk-driven DConf format to either


1. a more decentralized collection of meetups all over the world, 
where most of the talks are pre-recorded, and the focus is more 
on introducing new users to the language or


2. at least ditching most of the talks at a DConf still held at a 
central location, maybe keeping only a couple panel discussions 
that benefit from an audience to ask questions, and spending most 
of the time like the hackathon at the last DConf, ie actually 
meeting in person.


Since both of these alternatives I suggest are much more about 
in-person interaction, which is what you defend, and the only big 
change I propose is ditching the passive in-person talks, which 
you do not write a single word in your long post defending, I'm 
scratching my head about what you got out of my original post.


There is much more to the conference than just a 4-day meetup 
with talks. The idea that it's just the core 8-15 people with a 
bunch of hangers-on is patently false. It's not about the 
conversations I have with the "core" people. It's 
Schveighoffer, or Atila, or Jonathan, or any of a long list of 
people who are interested enough in coming. Remember these 
people self-selected to invest non-trivial treasure to be 
there; they are ALL worthy of conversing with.


Since both my mooted alternatives give _much more_ opportunity 
for such interaction, I'm again scratching my head at your 
reaction.


Is it a "mini-vacation"? Yea, sure, for my wife. For her it's a 
four day shopping spree in Europe. For me it's four days of 
wall-to-wall action that leaves me drop-dead exhausted at the 
end of the day.


So it's the talks that provide this or the in-person interaction? 
If the latter, why are you arguing against my pushing for more of 
it and ditching the in-person talks?


Every time I see somebody predicting the end of "X" I roll my 
eyes. I have a vivid memory of the rise of Skype and 
videoconferencing in the early 2000's giving way to breathless 
media reports about how said tools would kill the airlines 
because people could just meet online for a trivial fraction of 
the price.


People make stupid predictions all the time. Ignoring all such 
"end of" predictions because many predict badly would be like 
ignoring all new programming languages because 99% are bad. That 
means you'd never look at D.


And yes, some came true: almost nobody programs minicomputers or 
buys standalone mp3 players like the iPod anymore, compared to 
how many used to at their peak.


However, it's 2018 and the airlines are reaping record profits 
on the backs of business travelers (ask me how I know). 
Airlines are even now flying planes with NO standard economy 
seats for routes that cater specifically to business travelers 
(e.g. Singapore Airlines A350-900ULR). The order books (and 
stock prices) of both Airbus and Boeing are at historic highs.


You know what is much higher? Business communication through 
email, video-conferencing, online source control, etc., which 
completely replaced old ways of doing things like business 
travel or sending physical packages. However, business travel 
might still be up (I don't know, as I haven't seen the stats, 
and you provide nothing other than anecdotes) because all that 
virtual communication might have enabled much more collaboration 
and trade, which also grew business travel somewhat.


There are more conferences, attendees, and business travelers 
than there has ever been in history, in spite of the great 
technological leaps in videoconferencing technology in the past 
two decades.


The market has spoken. Reports of the death of 
business/conference travel have been greatly exaggerated.


You are conflating two completely different markets here, 
business versus conference travel. Regarding conferences, your 
experience contradicts that of the iOS devs in the post I linked 
and the one he links as evidence, where that blogger notes 
several conferences that have shut down. In your field, it is my 
understanding that MS has been paring back and consolidating 
their conferences too, though I don't follow MS almost at all.


The reason for this is fundamental to human psychology and, as 
such, is unlikely to change in the future. Humans are social 
animals, and no matter how hard we have tried, nothing has been 
able to replace the face-to-face meeting for getting things 
done. Be it the conversations we have over beers after the 
talks, or the epic number of PR's that come out the hackathon, 
or even mobbing the speaker after a talk.


It is funny that you say this on a forum where we&

Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d
On Tuesday, 2 October 2018 at 08:21:11 UTC, maarten van damme 
wrote:
While I have never attended dconf itself, conferences itself 
usually aren't about the talks but about the people you meet 
and get to interact with.


Since this thread is about replacing the outdated DConf format 
with two possible in-person formats that feature _more_ 
interpersonal interaction, I have no idea why you're making this 
point to me.


On Tuesday, 2 October 2018 at 08:56:36 UTC, bauss wrote:

On Tuesday, 2 October 2018 at 07:32:58 UTC, Joakim wrote:
Ex. for D conf there is much more than just D. There is also 
the minor escape from reality to new surroundings, like a 
mini vacation etc.


Thank you for making clear that the real reason you and some 
others like the current format is because you want to have a 
fun "vacation"- as I pointed out in that earlier thread- 
rather than anything to do with D or advancing the ecosystem.


Thank you for not reading everything I said and quoting 
literally only the last 5 words; I said it's also that, but not 
entirely.


Everything you wrote before that I addressed with a separate 
comment which you didn't cut-n-paste, maybe you missed that too.


As for this justification, the only reason you gave is that it's 
a "escape from reality"/"mini vacation", along with hand-waving 
about "much more." I can't address reasons you never gave.


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 08:08:38 UTC, Gary Willoughby wrote:

On Tuesday, 2 October 2018 at 07:32:58 UTC, Joakim wrote:
Thank you for making clear that the real reason you and some 
others like the current format is because you want to have a 
fun "vacation"- as I pointed out in that earlier thread- 
rather than anything to do with D or advancing the ecosystem.


Yes, please let's not have any fun at Dconf this year!!! /s


Then why are you sitting around listening to boring tech talks on 
your "super-fun" vacation? Get W&A and a bunch of other D devs 
and go on a boat tour of the Greek islands! You'll have a lot 
more fun!!! endSarcasm()


Re: Please don't do a DConf 2018, consider alternatives

2018-10-02 Thread Joakim via Digitalmars-d

On Tuesday, 2 October 2018 at 07:14:54 UTC, bauss wrote:

On Tuesday, 2 October 2018 at 06:26:30 UTC, Joakim wrote:

[...]


I highly disagree with this.

I love conferences and meetups.

It's good socially and a conference is not 100% just about the 
topic it hosts.


I think you didn't read what I wrote, as nowhere did I say not to 
gather people in conferences or meetups, but that the traditional 
conference _format_, as exemplified by previous DConfs, is a 
waste of time.


Ex. for D conf there is much more than just D. There is also 
the minor escape from reality to new surroundings, like a mini 
vacation etc.


Thank you for making clear that the real reason you and some 
others like the current format is because you want to have a fun 
"vacation"- as I pointed out in that earlier thread- rather than 
anything to do with D or advancing the ecosystem.


Please don't do a DConf 2018, consider alternatives

2018-10-01 Thread Joakim via Digitalmars-d
I'm sure some thought and planning is now going into the next 
DConf, so I'd like to make sure people are aware that the 
conference format that DConf uses is dying off, as explained here:


https://marco.org/2018/01/17/end-of-conference-era

There was a discussion about this in a previous forum thread:

https://forum.dlang.org/post/bnbldtdfeppzjuthx...@forum.dlang.org

Jonathan and Mike argue in that thread that DConf is great for 
the core team to get together in person and hash things out for D 
with very high-bandwidth interaction, but I pointed out that 
doesn't justify 95%+ of the attendees being there. If there's a 
real need for this, maybe get those 8-15 people together in an 
online video conference or offline retreat, without a bunch of 
hangers-on and talks.


People are now experimenting with what replaces conferences, we 
should be doing that too. I came up with some ideas in that 
thread:


"Have most talks prerecorded by the speaker on their webcam or 
smartphone, which produce excellent video these days with not 
much fiddling, and have a couple organizers work with them to get 
those home-brewed videos up to a certain quality level, both in 
content and presentation, before posting them online."


I volunteer to help presenters do this.

"Once the videos are all up, set up weekend meetups in several 
cities [all over the world], where a few livestreamed talks may 
take place if some speakers don't want to spend more time 
producing a pre-recorded talk, but most time is spent like the 
hackathon, discussing various existing issues from bugzilla in 
smaller groups or brainstorming ideas, designs, and libraries for 
the future."


I can setup an event like this in my city, where AFAIK nobody 
uses D, so most of it would be geared towards introducing them to 
the language.


I estimate that you could do ten times better at raising 
awareness and uptake with this approach than the current DConf 
format, by casting a much wider net, and it would cost about 10X 
less, ie you get two orders of magnitude better bang for the buck.


At the very least, DConf should just be a big hackathon of 
self-organizing groups, rather than wasting any time passively 
imbibing talks next to a hundred other people. I still don't 
think the cost of getting a hundred people in the same room for 
3-4 days would be justified, but at least it would be a step in 
the right direction.


Re: Funny way to crash dmd and brick the whole computer

2018-10-01 Thread Joakim via Digitalmars-d

On Friday, 28 September 2018 at 11:58:25 UTC, Zardoz wrote:

CTFE fib:

module fib_cte;
import std.stdio;

long fib(long n) {
  if (n <= 1) return 1;
  return fib(n - 1) + fib(n - 2);
}

static immutable valueFib = fib(46);

void main() {
  writeln(valueFib);
}


I tried it on Android with LDC, it eventually just kills the 
process. You need to get a real OS. ;)
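
For what it's worth, the killer here is the O(2^n) recursion, 
not CTFE itself. As a sketch (not tested against any particular 
dmd/LDC version), an iterative rewrite computes the same value 
at compile time near-instantly:

```d
module fib_fast;
import std.stdio;

// Same recurrence as above (fib(0) == fib(1) == 1), but iterative:
// O(n) additions instead of O(2^n) recursive calls, so CTFE
// finishes immediately instead of exhausting time and memory.
long fib(long n) {
    long a = 1, b = 1;
    foreach (_; 0 .. n) {
        long t = a + b;
        a = b;
        b = t;
    }
    return a;
}

static immutable valueFib = fib(46); // forces compile-time evaluation

void main() {
    writeln(valueFib); // 2971215073
}
```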


Re: Updating D beyond Unicode 2.0

2018-09-21 Thread Joakim via Digitalmars-d

On Friday, 21 September 2018 at 20:25:54 UTC, Walter Bright wrote:
When I originally started with D, I thought non-ASCII 
identifiers with Unicode was a good idea. I've since slowly 
become less and less enthusiastic about it.


First off, D source text simply must (and does) fully support 
Unicode in comments, characters, and string literals. That's 
not an issue.


But identifiers? I haven't seen hardly any use of non-ascii 
identifiers in C, C++, or D. In fact, I've seen zero use of it 
outside of test cases. I don't see much point in expanding the 
support of it. If people use such identifiers, the result would 
most likely be annoyance rather than illumination when people 
who don't know that language have to work on the code.


Extending it further will also cause problems for all the tools 
that work with D object code, like debuggers, disassemblers, 
linkers, filesystems, etc.


To wit, Windows linker error with Unicode symbol:

https://github.com/ldc-developers/ldc/pull/2850#issuecomment-422968161


Absent a much more compelling rationale for it, I'd say no.


I'm torn. I completely agree with Adam and others that people 
should be able to use any language they want. But the Unicode 
spec is such a tire fire that I'm leery of extending support for 
it.


Someone linked this Swift chapter on Unicode handling in an 
earlier forum thread, read the section on emoji in particular:


https://oleb.net/blog/2017/11/swift-4-strings/

I was laughing out loud when reading about composing "family" 
emojis with zero-width joiners. If you told me that was a tech 
parody, I'd have believed it.
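The ZWJ composition is easy to demonstrate from D itself; a small 
sketch counting the different "lengths" of one family emoji (the 
single-glyph rendering assumes a Unicode-9-aware font; the counts 
below follow directly from UTF-8 encoding rules):

```d
import std.stdio;
import std.range : walkLength;
import std.utf : byDchar;

void main() {
    // Man + woman + girl + boy, fused by zero-width joiners (U+200D):
    // a Unicode-aware renderer displays these 7 code points as ONE glyph.
    string family = "\U0001F468\u200D\U0001F469\u200D\U0001F467\u200D\U0001F466";
    writeln(family.length);             // 25 UTF-8 code units
    writeln(family.byDchar.walkLength); // 7 code points
}
```

Three different answers to "how long is this string?" is exactly 
the kind of complexity the Swift chapter wrestles with.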


I believe Swift just punts their Unicode support to ICU, like 
most any other project these days. That's a horrible sign, that 
you've created a spec so grotesquely complicated that most 
everybody relies on a single project to not have to deal with it.


Re: Jai compiles 80,000 lines of code in under a second

2018-09-20 Thread Joakim via Digitalmars-d

On Friday, 21 September 2018 at 00:47:27 UTC, Adam D. Ruppe wrote:
Of course, D can also take ages to compile one line of code. It 
all depends on what that line is doing... CTFE and templates 
are slow. C or Java style code compiling in D is very fast.


I was going to say this too, i.e. how much of that Jai code is 
run at compile time, how much is uninstantiated templates that 
are just skipped over, as D does, and how much is templates 
instantiated many times? Lines of code is not a good enough 
measure with those programming constructs.
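The cost model can be sketched in a few lines of D (my own toy 
example, not from the Jai comparison): an uninstantiated template 
is only parsed, while each distinct instantiation is analyzed 
separately, so one short line can fan out into many compile-time 
evaluations.

```d
// Only parsed until instantiated, so it costs almost nothing
// to compile on its own.
template Factorial(ulong n) {
    static if (n <= 1)
        enum Factorial = 1UL;
    else
        enum Factorial = n * Factorial!(n - 1);
}

// This single line forces 20 distinct instantiations
// (Factorial!20 down to Factorial!1), all resolved at compile time.
static assert(Factorial!20 == 2_432_902_008_176_640_000UL);

void main() {}
```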


I was building the stdlib tests with LDC yesterday on a new 
Linux/x64 VPS with 2GB of RAM that I had just spun up, and they 
took so much memory that I couldn't even ssh in anymore. I 
eventually had to restart the VPS and add a swapfile, which I 
usually have but simply hadn't bothered with yet on this new 
Ubuntu 18.04 VPS. The stdlib tests instantiate a ton of 
templates.


dub auto-tester

2018-09-19 Thread Joakim via Digitalmars-d
On Thursday, 20 September 2018 at 04:16:41 UTC, Neia Neutuladh 
wrote:
On Thursday, 20 September 2018 at 02:51:52 UTC, Neia Neutuladh 
wrote:
On Monday, 10 September 2018 at 01:27:20 UTC, Neia Neutuladh 
wrote:
Not on dlang.org anywhere, but I built a crude version of 
this. Results are available at http://ikeran.org/report/.


A quick status update:


And source code is available at 
https://git.ikeran.org/dhasenan/dubautotester


Please don't judge me.


Nice, what will it take to get this integrated with the official 
dub website?


BTW, the gitea self-hosted github-workalike you're using looks 
nice, too bad it's written in Go. ;)


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-18 Thread Joakim via Digitalmars-d
On Tuesday, 18 September 2018 at 18:06:37 UTC, Neia Neutuladh 
wrote:

On Tuesday, 18 September 2018 at 07:53:31 UTC, Joakim wrote:
On Monday, 17 September 2018 at 22:27:41 UTC, Neia Neutuladh 
wrote:

On Monday, 17 September 2018 at 15:47:14 UTC, Joakim wrote:
Not sure why that matters if you agree with Kay that HTML is 
an abortion? :) I actually think it's great that mobile is 
killing off the web, as the Comscore usage stats I linked 
earlier show.


HTML is a somewhat open standard. I'm more happy with HTML 
and Javascript, as ugly as they are and as dominated by 
Google and Microsoft as they are, than having to target 
private companies' frameworks.


So you'd rather target an incredibly badly designed open 
standard than a mostly open source "private company's" 
framework that's certainly not great, but much better? It's no 
contest for me, give me the latter any day. And then of 
course, there's always cross-platform OSS toolkits like 
Flutter or DlangUI.


Thinking about it a bit more, the openness of the platform is 
more important. Android and iOS are effectively closed 
platforms. You *can* sideload apps, but it's rare to find 
someone willing to do so. If you're not on the app stores, your 
app isn't going to get a thousandth as much traction.


I'll note that you wrote "app stores," and for Android there are 
actually multiple. There's the official Play store from Google, 
the Amazon appstore, app stores for OSS apps like F-Droid or 
Fossdroid, and over 400 app stores in China, where those first 
two are almost never used:


https://www.appinchina.co/market/app-stores/

Anyone can install any app store on their Android device and get 
any apps they want, though as you note, most outside China just 
go with the pre-installed Play or Amazon store.


Windows, on the other hand, has long been an open platform; you 
can develop for it and publish your programs and Microsoft 
won't get in the way.


Though that is now changing with their new UWP platform, which by 
default must be installed from their own app store, the Microsoft 
Store. The link for the Windows/AArch64 device in my original 
post notes that they expect most Windows/AArch64 apps to be UWP 
apps, and so you'd get them from an app store just like Android 
most of the time. I read that they do have similar allowances for 
side-loading UWP apps as Android does, though, and of course 
older win32/64 apps on normal Wintel devices aren't affected by 
this.


So an open source cross-platform toolkit controlled by a single 
entity isn't bad. I use GTK+ a lot, for instance. But the web 
and HTML is a better situation than Android and iOS and their 
toolkits.


I don't think the app stores are that big a deal as long as 
side-loading and multiple app stores are always allowed. Of 
course, that's not the case on iOS, one of the many reasons I've 
never really used an iOS device.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-18 Thread Joakim via Digitalmars-d
On Monday, 17 September 2018 at 22:27:41 UTC, Neia Neutuladh 
wrote:

On Monday, 17 September 2018 at 15:47:14 UTC, Joakim wrote:
Not sure why that matters if you agree with Kay that HTML is 
an abortion? :) I actually think it's great that mobile is 
killing off the web, as the Comscore usage stats I linked 
earlier show.


HTML is a somewhat open standard. I'm more happy with HTML and 
Javascript, as ugly as they are and as dominated by Google and 
Microsoft as they are, than having to target private companies' 
frameworks.


So you'd rather target an incredibly badly designed open standard 
than a mostly open source "private company's" framework that's 
certainly not great, but much better? It's no contest for me, 
give me the latter any day. And then of course, there's always 
cross-platform OSS toolkits like Flutter or DlangUI.


On Monday, 17 September 2018 at 23:42:03 UTC, Dave Jones wrote:

On Monday, 17 September 2018 at 15:47:14 UTC, Joakim wrote:

On Sunday, 16 September 2018 at 15:41:41 UTC, tide wrote:

On Sunday, 16 September 2018 at 15:11:42 UTC, Joakim wrote:

I say that almost 30% drop in PC sales over the last 7


Might be, but so is trying to convince everyone your 
predictions are correct so they will focus their work on the 
issues important to you.


Not at all, because if my predictions are correct, this 
language will disappear along with the PC platform it's built 
on. And I've never suggested anybody work on anything 
"important to [me]," my original post even stated that D may 
never do well on mobile.


You are making your arguments to fit your desires.


I can't make head nor tail of this claim; you have a talent for 
vague non sequiturs. My arguments are based on data, the 
overwhelming sales numbers I linked. I have no idea what desires 
you think are involved; I suspect you don't either. :)


Plateaus almost never happen, it's not the natural order of 
things.


OK the market stabilises.


I don't see how you changing the word you used changes 
anything about the underlying phenomenon: that doesn't happen.


You're seriously suggesting that markets never stabilise, say 
oil prices stay steady for a few years or some such?


Prices flit all over the place, that's not what we're talking 
about. Oil _production_ has been remarkably consistent and 
growing for many, many decades:


https://commons.m.wikimedia.org/wiki/File:Crude_NGPL_IEAtotal_1960-2008.svg

The only hiccup was in the early '80s because of extraordinary 
measures taken by governments, like price controls and cartel 
orders, which was still only a 15% drop.


If oil production ever drops 30% because some workable substitute 
comes along, as has happened to PCs now, yes, there is no chance 
of stabilization. It will be a steady decline from there, as 
these trends have a kind of momentum.


Most households have more devices than ever before, and 
hardware is only getting cheaper. The idea that people will 
have to choose just one device is plainly wrong.


You need to get out in the world a bit more. The majority of 
smartphones these days are bought in emerging markets where 
_nobody in their home has ever owned a PC or used the 
internet_. I've talked to these working stiffs in developing 
markets, you clearly haven't.


And what happens when the emerging markets mature? Do they 
still just cling on to one smart phone in the house? Or are 
they yearning for more technology?


They buy more mobile devices, the PC will be long since dead and 
gone.


I find it strange that you think the PC won't also be rolled 
up by mobile like this.


Can you put a 3GB hard drive in your phone?


Why would I ever want to do this when I noted my phone has 128 
GBs of space? ;) If you mean 3 _TB_, yes, I simply attach my 
slim 1 TB external drive and back up whatever I want over USB 
3.0.


So you're not averse to having some external hardware sat on 
your desk. Hmmm.


My original post links to examples of using your smartphone 
connected to a keyboard and monitor or a laptop shell, so I'm not 
sure where you ever got the idea I was against "external 
hardware."



Or a high end graphics card?


Smartphones come with very powerful GPUs these days, easily 
powerful enough to drive most graphics workloads.


Not if you're into high end gaming.


The vast majority of PCs don't have cards capable of that either. 
For the few who want it, there will be specialized solutions, 
whether consoles or whatever.



Or a soundcard with balanced outputs?


Some phones come with high-end DACs and the like, or you could 
always attach something externally if you really needed to.


There's no such thing as a professional audio breakout box for 
Android AFAIK. Up until a few years ago the problem was Android 
couldn't do low latency audio, I'm not sure if the situati

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-17 Thread Joakim via Digitalmars-d

On Sunday, 16 September 2018 at 15:41:41 UTC, tide wrote:

On Sunday, 16 September 2018 at 15:11:42 UTC, Joakim wrote:
I say that almost 30% drop in PC sales over the last 7 years 
is mostly due to the rise of mobile.


I think a large part of it is that PCs got fast enough for 
most people about 7-10 years ago. So it was a combination of 
mobile, and people no longer needing to get newer faster 
machines. The upgrade cycle moved from "I need a newer faster 
computer" to "I'll wait till my current system is worn out". 
(For a lot of people anyway)


Sure, that's part of it, but that suggests that once 
smartphones reach that performance threshold, they will 
replace PCs altogether. I think we've reached that threshold 
now.


I feel only looking at sales stats is irrelevant. I know people 
that have lost their phone and just bought a new phone. They 
get stolen a lot more easily. If your screen breaks you are 
better off buying a new phone as the cost of replacing the 
screen is going to be almost as much as a new one. Someone I 
know had to fight his boss to repair his phone cause he didn't 
want a brand new iPhone, he still has an Android device and 
they switched to Apple a while back. Note, it still cost more 
to buy the new phone than to repair his old one.


Computers last much longer, I've had the one I have right now 
for 8 years. It runs everything I need it to. Faster than a 
smartphone or tablet, or even most newer laptops still. There's 
no reason to buy a new one, not that I would buy a prebuilt one 
anyways. Which I'm pretty sure are what those sales represent. 
Can't really count a CPU sale as a "PC" sale as it might just 
be someone upgrading from their old PC.


DIY PC sales are estimated at around 50 million a year, they 
don't move the needle compared to mobile sales. And yes, 
smartphones get broken easier and need to be upgraded more often, 
_just as the PC was once a shoddier product than a DEC 
minicomputer_, as Ken Olsen noted.


What _matters_ is that mobile is approaching 10X the sales of 
PCs. That pays for a lot of innovation and upgrades that the PC 
base simply cannot pay for: they just don't have the numbers. 
That is the _same_ way the PC swamped the minicomputer, and 
mobile is now doing it to the PC.


On Sunday, 16 September 2018 at 15:49:33 UTC, tide wrote:
That is, it is not just the performance that affects the sales 
of phones. There's a lot of factors that lead to there being 
new phones sales. Know someone that's gone through 3 phones in 
comparison to just the one I have. Treadmills eat phones for 
breakfast.


You're conflating my two arguments. Performance has nothing to do 
with why mobile sells a lot more already, that's all about 
battery life, mobility, 4G networks, etc. Performance is why 
mobile's about to kill off the PC too, because it's finally 
performant enough.


On Sunday, 16 September 2018 at 22:03:12 UTC, Gambler wrote:
You're right about APKs. Not sure whether it changed since I 
looked into it, or I didn't read the docs correctly in the 
first place. The overall dev/distribution process, though, 
still looks... uh, involved compared to compiling and running 
an executable on PC.


I suspect the 10-15 command-line steps listed there to build a 
GUI app on Android itself are _much less_ work than on any other 
platform, especially since you don't have to install any big SDK 
like VS, Xcode, or Qt where plenty of things can go wrong.


Of course, it can always be made simpler.

In general, I am still convinced of the overall negative effect 
of mobile devices on computing. They are designed to be used 
mostly for consumption and social sharing. They have a lot of 
limitations that currently drag the whole IT ecosystem down.


I think you want to cling to that opinion regardless of the 
evidence.



Some excellent high-level criticisms:

https://www.fastcompany.com/40435064/what-alan-kay-thinks-about-the-iphone-and-technology-now


An interesting interview, thanks for the link. Mostly not about 
mobile, but he seems to think the iPhone was too limiting and 
should have come with a stylus? Neither critique applies to 
Android, which is the vast majority of the mobile market, where 
Termux and the stylus of the Galaxy Note are available, if you 
want them.



http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/


He mostly states the obvious: of course touch is not the future 
of HCI. He mentions speech as a possibility in the addendum 
linked at the end; there are people working on it now (I can't 
believe it's been two years since this article was written):


https://arstechnica.com/gadgets/2016/11/google-home-review-a-step-forward-for-hotwords-a-step-backward-in-capability/

That excellent overview notes a problem with discoverability of 
voice commands in Google Home, so they'll have to 

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-16 Thread Joakim via Digitalmars-d

On Sunday, 16 September 2018 at 10:25:30 UTC, Dave Jones wrote:

On Sunday, 16 September 2018 at 04:47:11 UTC, Joakim wrote:

On Sunday, 16 September 2018 at 01:03:27 UTC, Dave Jones wrote:
I know a lot of people who did, which explains the 28% drop 
in PC sales since they peaked in 2011, the year after the 
iPad came out. Many of those people who used to buy PCs have 
switched to tablets and other mobile devices.


Yet PC sales are up this year, mobile is down, and tablet 
sales have fallen for 3 years in a row.


Eh, these are all mostly mature markets now, so slight 
quarterly dips or gains don't matter much anymore. What does 
it matter that PC sales were up 2-3% last quarter when 7 times 
as many smartphones and mobile devices were sold in that same 
quarter?


Some analysts have predicted that PC sales will plateau at some 
point and if that's where we're at now then 30% drop in 
shipments is not death of the market.


I see no reason why they would plateau, looks like wishful 
thinking to me.


I say that almost 30% drop in PC sales over the last 7 years 
is mostly due to the rise of mobile.


I think a large part of it is that PCs got fast enough for most 
people about 7-10 years ago. So it was a combination of mobile, 
and people no longer needing to get newer faster machines. The 
upgrade cycle moved from "I need a newer faster computer" to 
"I'll wait till my current system is worn out". (For a lot of 
people anyway)


Sure, that's part of it, but that suggests that once smartphones 
reach that performance threshold, they will replace PCs 
altogether. I think we've reached that threshold now.


And just because there's been a trend for 5 or 6 years doesnt 
mean it will continue so inevitably.


Sure, but these trends almost never reverse. ;)


It doesnt need to reverse for "the PC is dead" to be false.


Plateaus almost never happen, it's not the natural order of 
things.


For example, newspapers hoped their ad revenue had plateaued from 
2000-2005, then they plunged:


https://en.m.wikipedia.org/wiki/File:Naa_newspaper_ad_revenue.svg

I've predicted that a similar plunge is about to happen to PC 
sales.


I actually think most people would prefer a separate desktop 
and mobile device, whether that desktop is just the size of 
pack of cigarettes, or a big box with 5 fans in it.


Why? Given how price-sensitive the vast majority of the 
computer-buying public is- that excludes the Apple sheeple 
who actually seem to get a hard-on from rising iPhone prices, 
all the better for them to show how much money they've lucked 
into by brandishing their "gold" iPhone ;) - I don't see most 
willing to spend twice on two devices, that could be replaced 
by just one. Until recently, they didn't have a choice, as you 
couldn't use your mobile device as a desktop, but the 
just-released devices I linked in the first post in this 
thread are starting to change that.


Because for about £300 you can get an intel NUC system with 
120GB SSD, which is more powerful and more upgradeable than 
your £700 mobile device. And some people still want that. And 
because most people have more than one TV, some have multiple 
phones, phones and tablets, and desktops, and multiple games 
consoles. And they still use them all in different situations.


That's more on the high end, where people use many devices. On 
the low- to mid-end of the market, where most of the sales 
happen, people are happy to buy fewer devices that get the job 
done.


This "one device" thing is your preference and you're 
projecting it onto everyone else.


Looks to me like you're the one projecting here. People used to 
buy standalone mp3 players, GPS devices, point-and-shoot cameras, 
handheld gaming consoles, etc., etc. Sales of all those 
standalone devices have been devastated by the smartphone; here's 
just one example of what happened to camera sales after the 
smartphone took over, which I've linked on this forum before:


https://petapixel.com/2017/03/03/latest-camera-sales-chart-reveals-death-compact-camera/

I find it strange that you think the PC won't also be rolled up 
by mobile like this.


Yes you can bring up examples of people who made mistakes 
predicting the future, but that works both ways. You're just 
as guilty of seeing two points and drawing a straight line 
through them.


Except none of these examples or my own prediction are based 
on simple extrapolation between data points. Rather, we're 
analyzing the underlying technical details and capabilities 
and coming to different conclusions about whether the status 
quo is likely to remain. So I don't think any of us are 
"guilty" of your charge.


Of course you are, you're making predictions and assuming the 
trends will continue, you assume the technical details are all 
important. Im saying they are

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-15 Thread Joakim via Digitalmars-d

On Sunday, 16 September 2018 at 01:03:27 UTC, Dave Jones wrote:

On Saturday, 15 September 2018 at 15:25:55 UTC, Joakim wrote:

On Friday, 14 September 2018 at 09:23:24 UTC, Dave Jones wrote:

On Thursday, 13 September 2018 at 22:56:31 UTC, Joakim wrote:

On Thursday, 13 September 2018 at 22:41:08 UTC, Nick


And people don't use PCs for such things? ;)


Sure, but they use them for a bunch of other stuff too. My 
point was that mobile growth has been in the "such things" but 
barely made a dent in the other stuff. So when you see 30% pc 
screen time and 70% mobile, its not a 70% drop in actual time 
spent in front of a PC. It's more a massive growth in time on 
mobile doing mostly banal pointless crap.


Sure, mobile has grown the market for digital entertainment and 
communication much more than taking away the time spent doing 
work on a PC, at least so far.


I know a lot of people who did, which explains the 28% drop in 
PC sales since they peaked in 2011, the year after the iPad 
came out. Many of those people who used to buy PCs have 
switched to tablets and other mobile devices.


Yet PC sales are up this year, mobile is down, and tablet sales 
have fallen for 3 years in a row.


Eh, these are all mostly mature markets now, so slight quarterly 
dips or gains don't matter much anymore. What does it matter that 
PC sales were up 2-3% last quarter when 7 times as many 
smartphones and mobile devices were sold in that same quarter?


More like when computers first started replacing typewriters, 
I'm sure many laughed at that possibility back then too. :)


I'm not laughing at the idea of mobile eating into desktop PC 
share. What I'm saying is that it hasn't done so as much as you 
think.


I say that the almost 30% drop in PC sales over the last 7 years 
is mostly due to the rise of mobile. Not sure what you mean by 
"it hasn't done so as much as you think." You may argue that 
most people using PCs aren't using them for entertainment, but 
this drop suggests that at least 30% of them were and have now 
moved to mobile.


And just because there's been a trend for 5 or 6 years doesnt 
mean it will continue so inevitably.


Sure, but these trends almost never reverse. ;)

I actually think most people would prefer a separate desktop 
and mobile device, whether that desktop is just the size of 
pack of cigarettes, or a big box with 5 fans in it.


Why? Given how price-sensitive the vast majority of the 
computer-buying public is- that excludes the Apple sheeple who 
actually seem to get a hard-on from rising iPhone prices, all the 
better for them to show how much money they've lucked into by 
brandishing their "gold" iPhone ;) - I don't see most willing to 
spend twice on two devices, that could be replaced by just one. 
Until recently, they didn't have a choice, as you couldn't use 
your mobile device as a desktop, but the just-released devices I 
linked in the first post in this thread are starting to change 
that.


You've probably heard of the possibly apocryphal story of how 
Blackberry and Nokia engineers disassembled the first iPhone 
and dismissed it because it only got a day of battery life, 
while their devices lasted much longer. They thought the 
mainstream market would care about such battery life as much 
as their early adopters, but they were wrong.


But here's a better story for this occasion, Ken Olsen, the 
head of DEC who built the minicomputers on which Walter got 
his start, is supposed to have disassembled the first IBM PC 
and this was his reaction:


"Ken Olsen bought one of the first IBM PCs and disassembled it 
on a table in Olsen’s office.


'He was amazed at the crappy power supply,' Avram said, 'that 
it was so puny.  Olsen thought that if IBM used such poor 
engineering then Digital didn’t have anything to worry about.'


Clearly Olsen was wrong."
https://www.cringely.com/2011/02/09/ken-olsen-and-post-industrial-computing/

You're making the same mistake as him. It _doesn't matter_ 
what people first use the new tool for, what matters is what 
it _can_ be used for, particularly over time. That time is 
now, as top and mid-range smartphone chips now rival mid- to 
low-end PC CPUs, which is the majority of the market. The 
x86/x64 PC's days are numbered, just as it once killed off the 
minicomputer decades ago.


Yes you can bring up examples of people who made mistakes 
predicting the future, but that works both ways. You're just as 
guilty of seeing two points and drawing a straight line 
through them.


Except none of these examples or my own prediction are based on 
simple extrapolation between data points. Rather, we're analyzing 
the underlying technical details and capabilities and coming to 
different conclusions about whether the status quo is likely to 
remain. So I don't think any of us are "guilty" of your charge.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-15 Thread Joakim via Digitalmars-d

On Friday, 14 September 2018 at 09:23:24 UTC, Dave Jones wrote:

On Thursday, 13 September 2018 at 22:56:31 UTC, Joakim wrote:
On Thursday, 13 September 2018 at 22:41:08 UTC, Nick 
Sabalausky (Abscissa) wrote:

On 09/10/2018 11:13 PM, tide wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
That's why PC sales keep dropping while mobile sales are 
now 6-7X that per year:


This shouldn't be misunderstood as such, which I think you 
are misunderstanding. The reason mobile sales are so high 
is because of planned obsolescence and the walled garden 
that these devices are built around. I've gone through maybe 
3-4 phones in the time that I've had my Desktop, and I use 
my desktop every single day. I don't need to buy a new one 
cause it runs perfectly fine, there aren't operating system 
updates that purposely cause the CPU to run slower to "save 
battery life" when a new device and OS come out. That's not 
to say it isn't insignificant but the sales numbers are 
exacerbated.


Right. Basically, "sales stats" should never be misconstrued 
as "usage stats".


The usage stats are similarly overwhelming, two-thirds of 
digital time is spent on mobile, more for the young:


Yeah but 90% of the time people spend on mobile is just dicking 
about. Sending IMs, facebook, point and click games. And thats 
a huge part of the usage stats, people can now spend more time 
online wasting time in more situations than ever before.


And people don't use PCs for such things? ;) I know a lot of 
people who did, which explains the 28% drop in PC sales since 
they peaked in 2011, the year after the iPad came out. Many of 
those people who used to buy PCs have switched to tablets and 
other mobile devices.


PCs are generally seen as a tool to accomplish tasks, for word 
processing or high-end gaming, audio/video editing; 
mobile is more entertainment. Not many people are doing what 
you are by using your mobile as a desktop.


I'm not saying that makes mobile worthless, what I'm saying is 
that your hypothesis is like saying TV has taken over from 
typewriters.


More like when computers first started replacing typewriters, I'm 
sure many laughed at that possibility back then too. :)


You've probably heard of the possibly apocryphal story of how 
Blackberry and Nokia engineers disassembled the first iPhone and 
dismissed it because it only got a day of battery life, while 
their devices lasted much longer. They thought the mainstream 
market would care about such battery life as much as their early 
adopters, but they were wrong.


But here's a better story for this occasion, Ken Olsen, the head 
of DEC who built the minicomputers on which Walter got his start, 
is supposed to have disassembled the first IBM PC and this was 
his reaction:


"Ken Olsen bought one of the first IBM PCs and disassembled it on 
a table in Olsen’s office.


'He was amazed at the crappy power supply,' Avram said, 'that it 
was so puny.  Olsen thought that if IBM used such poor 
engineering then Digital didn’t have anything to worry about.'


Clearly Olsen was wrong."
https://www.cringely.com/2011/02/09/ken-olsen-and-post-industrial-computing/

You're making the same mistake as him. It _doesn't matter_ what 
people first use the new tool for, what matters is what it _can_ 
be used for, particularly over time. That time is now, as top and 
mid-range smartphone chips rival mid- to low-end PC CPUs, 
which is the majority of the market. The x86/x64 PC's days are 
numbered, just as it once killed off the minicomputer decades ago.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-14 Thread Joakim via Digitalmars-d

On Friday, 14 September 2018 at 16:53:16 UTC, Iain Buclaw wrote:
On 14 September 2018 at 09:51, Joakim via Digitalmars-d 
 wrote:
On Wednesday, 12 September 2018 at 22:41:31 UTC, Iain Buclaw 
wrote:


On 12 September 2018 at 10:09, Joakim via Digitalmars-d 
 wrote:


I think their model of having an open ISA with proprietary 
extensions
will inevitably win out for hardware, just as a similar 
model has basically
won already for software, but that doesn't mean that RISC-V 
will be the one

to do it. Someone else might execute that model better.



POWER9 has been making some headway, for instance finally 
they have a sensible real type (IEEE Quadruple).  Though the 
developers working on glibc support seem to be making a 
shambles of it, where they want to support both new and old 
long double types at the same time at run-time!  It seems 
that no one thought about Fortran, Ada, or D when it came to 
long double support in the C runtime library *sigh*.


For us, I think we can choose to ignore the old IBM 128-bit 
float, and so remove any supporting code from our library, 
focusing instead only on completing IEEE 128-bit float 
support (LDC, upstream your local patches before I start 
naming and shaming you).



All the pulls linked from that AArch64 tracker issue above 
were submitted upstream first before merging into the ldc 
repo. Only one patch that I know of hasn't been merged 
upstream yet: my commit to add IEEE Quadruple support to 
core.internal.convert, only because I want to add another 
Android commit to that pull soon, but the patch is available 
in the open druntime pulls.


If you know of some other patches that need to be upstreamed, 
let us know, AFAIK they were all upstreamed first.




Can you send me links to any open PR you have?  These should 
not be sitting around for months without merge.


That's on me: I had another commit in the works for Android 
that's mostly working, but put it aside for the ldc 1.11 release, 
updating the docs on the wiki, and now reworking the Android 
emulated TLS patch for the upcoming LLVM 7 release. Feel free to 
use the commit I submitted here a couple months ago or to review 
it, but I'd like to get that second Android commit in before that 
pull's merged:


https://github.com/dlang/druntime/pull/2257




Re: Mobile is the new PC and AArch64 is the new x64

2018-09-14 Thread Joakim via Digitalmars-d
On Wednesday, 12 September 2018 at 22:41:31 UTC, Iain Buclaw 
wrote:
On 12 September 2018 at 10:09, Joakim via Digitalmars-d 
 wrote:
I think their model of having an open ISA with proprietary 
extensions will inevitably win out for hardware, just as a 
similar model has basically won already for software, but that 
doesn't mean that RISC-V will be the one to do it. Someone 
else might execute that model better.




POWER9 has been making some headway, for instance finally they 
have a sensible real type (IEEE Quadruple).  Though the 
developers working on glibc support seem to be making a 
shambles of it, where they want to support both new and old 
long double types at the same time at run-time!  It seems that 
no one thought about Fortran, Ada, or D when it came to long 
double support in the C runtime library *sigh*.
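The format mess Iain describes is visible from D itself: `real.mant_dig` reports which long double layout the target uses. A minimal sketch (the value printed depends entirely on the platform, so no fixed output is shown):

```d
import std.stdio;

void main()
{
    // real.mant_dig (bits of mantissa) tells the formats apart:
    //   53  -> plain IEEE double (e.g. MSVC, 32-bit ARM)
    //   64  -> x87 80-bit extended (x86/x86_64)
    //   106 -> IBM double-double (the old PPC64 long double)
    //   113 -> IEEE 754 binary128, i.e. "IEEE Quadruple"
    //          (POWER9's sensible new real, AArch64 linux)
    writeln("real.mant_dig = ", real.mant_dig);
    writeln("real.sizeof   = ", real.sizeof);
}
```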


For us, I think we can choose to ignore the old IBM 128-bit 
float, and so remove any supporting code from our library, 
focusing instead only on completing IEEE 128-bit float support 
(LDC, upstream your local patches before i start naming and 
shaming you).


All the pulls linked from that AArch64 tracker issue above were 
submitted upstream first before merging into the ldc repo. Only 
one patch that I know of hasn't been merged upstream yet: my 
commit to add IEEE Quadruple support to core.internal.convert, 
only because I want to add another Android commit to that pull 
soon, but the patch is available in the open druntime pulls.


If you know of some other patches that need to be upstreamed, let 
us know, AFAIK they were all upstreamed first.


ARM seems to be taking RISC-V seriously at least (this site was 
taken down after a couple days if I understand right: 
http://archive.fo/SkiH0).  There is currently a lot of 
investment going into ARM64 in the server space, but the 
signals I'm getting from people working on those projects are 
that it just doesn't hold water, with one comparison being that 
a high-end ARM64 server is no better than a cheap laptop bought 
5 years ago.


As Kagamin says, it depends on how many cores you're using and 
what benchmark you run, but most of the time, that's not true at 
all:


https://blog.cloudflare.com/arm-takes-wing/

And ARM does it with much less electric power used, as shown in 
that last graph, which you have to take into account when looking 
at the total costs. The ARM blog post I linked earlier in this 
thread shows they've gone ahead with using ARM too.


RISC-V got accepted into gcc-7, and runtime made it into glibc 
2.27; there's certainly a lot of effort being pushed for it.  They
have excellent simulator support on qemu, porting druntime only 
took two days.  Patches for RISCV64 will come soon, probably 
with some de-duplication of large blocks.


Great, but it's still in very nascent stages, with linux only 
running on it this year. I thought about using Qemu but figured 
the slowness and possible hardware compatibility issues weren't 
worth it.


I hope some open arch like these takes off sometime soon, as I 
don't like an ARM monopoly much better than the previous Intel 
one, but it's going to take a while for POWER/RISC-V to get 
anywhere close.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-13 Thread Joakim via Digitalmars-d
On Thursday, 13 September 2018 at 22:41:08 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/10/2018 11:13 PM, tide wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
That's why PC sales keep dropping while mobile sales are now 
6-7X that per year:


This shouldn't be misunderstood as such, which I think you are 
doing. The reason mobile sales are so high is because of planned 
obsolescence and the walled garden that these devices are built 
around. I've gone through maybe 3-4 phones in the time that I've 
had my Desktop, and I use my desktop every single day. I don't 
need to buy a new one cause it runs perfectly fine; there aren't 
operating system updates that purposely cause the CPU to run 
slower to "save battery life" when a new device and OS come out. 
That's not to say it's insignificant, but the sales numbers are 
exaggerated.


Right. Basically, "sales stats" should never be misconstrued as 
"usage stats".


The usage stats are similarly overwhelming, two-thirds of digital 
time is spent on mobile, more for the young:


https://www.searchforce.com/blog/the-comscore-u-s-mobile-app-report-2017/

I went all-mobile three years ago, haven't looked back.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-12 Thread Joakim via Digitalmars-d

On Wednesday, 12 September 2018 at 15:38:36 UTC, Joakim wrote:

the world is right now? It's not IBM, Apple,


Whoops, meant to write Intel here, but wrote Apple again. :D


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-12 Thread Joakim via Digitalmars-d

On Wednesday, 12 September 2018 at 06:41:38 UTC, Gambler wrote:

On 9/10/2018 9:43 AM, Joakim wrote:
Yes, I know, these devices won't replace your quad-core Xeon 
workstation with 32-64 GBs of RAM anytime soon, but most 
people don't need anywhere near that level of compute. That's 
why PC sales keep dropping while mobile sales are now 6-7X 
that per year:
I'm all for supporting modern open CPU architectures. At the 
same time,
I fear that the specific trend you're describing here (people 
ditching
PCs for cellphones/tablets) is effectively a reversal of the PC 
revolution.


For the last 30+ years people benefited from "trickle down 
computing". They had access to PCs that were equivalent to 
cutting edge servers of 6-7 years prior. They had ability to 
choose their operating system, expand and upgrade their 
hardware and install any software they wanted.


All of this is breaking down right now.


Yes and no, it is true that that is the way tech  _used_ to 
diffuse. However, do you know what the largest tech company in 
the world is right now? It's not IBM, Apple, HP, or Microsoft, ie 
none of the server or PC companies. It's Apple, which doesn't 
sell into the server or traditional enterprise markets almost at 
all and only has 15-20% unit share in the mobile market.


In other words, consumer tech markets are _much_ larger than the 
server/enterprise markets that used to lead tech R&D, which means 
consumer tech like mobile is what leads the way now.


As for choosing your own OS, that's still possible, but as 
always, it can be tough to get drivers for your hardware:


https://together.jolla.com/question/136143/wiki-available-devices-running-sailfish-os/

And if you simply want to tinker with the Android OS on your 
device, there are many ways to do that:


https://www.xda-developers.com/how-to-install-custom-rom-android/

No need to expand and upgrade your hardware when prices keep 
dropping in this Darwinian market. There's now a $500 phone with 
a faster chip than the one I got just 7 months back for $700:


https://m.newegg.com/products/N82E16875220078

As for installing any software you want, Android allows it: it's 
how I debug the apps I build on my phone or tablet. The iPhone 
doesn't, but it's a minority of the mobile market.


Intel got lazy without competition and high-end CPU 
architectures stagnated. All the cutting-edge computing is done 
on NVidia cards today. It requires hundreds of gigabytes of 
RAM, tens of terabytes of data and usage of specialized 
computing libraries. I very much doubt this will "trickle down" 
to mobile in foreseeable future. Heck, most developer laptops 
today have no CUDA capabilities to speak of.


I question the need for such "cutting-edge computing" in the 
first place, but regardless, it has already moved down to mobile 
and other edge devices:


https://arstechnica.com/gadgets/2017/10/the-pixel-2-contains-a-custom-google-soc-the-pixel-visual-core/
https://www.theverge.com/2018/7/26/17616140/google-edge-tpu-on-device-ai-machine-learning-devkit

Moreover, mobile devices are locked down by default and it's no 
trivial task to break out of those walled gardens. IIRC, Apple 
has an official policy of not allowing programming tools in 
their app store. Alan Kay had to personally petition Steve Jobs 
to allow Scratch to be distributed, so kids could learn 
programming. I believe the general policy is still in place.


They have their own app for that now:

https://www.apple.com/swift/playgrounds/

Android is better, but it's still a horror to do real work on, 
compared to any PC OS. Fine, you rooted it, installed some 
compilers and so on. How will you share your software with 
fellow Android users?


You seem to have missed all the posts I've made here before about 
native Android support for ldc. :) _I have never rooted any of my 
Android devices_. Compiling D code on most any Android device is 
as simple as installing an app from the official Play Store and 
typing a single command, `apt install ldc`:


https://wiki.dlang.org/Build_D_for_Android

The instructions there even show you how to package up an Android 
GUI app, an apk, on Android itself, by using some other packages 
available in that Android app.
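A quick smoke test of that on-device workflow (the file name is mine; `apt install ldc` and the plain `ldc2` invocation are the ones from the wiki page above):

```d
// hello.d -- on Android, inside the Termux app, no root needed:
//   apt install ldc
//   ldc2 hello.d && ./hello
import std.stdio;

void main()
{
    version (Android)
        writeln("Hello from Android");
    else
        writeln("Hello from a ", size_t.sizeof * 8, "-bit host");
}
```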


In essence, we are seeing the rapid widening of two digital 
divides. The first one is between users and average developers. 
The second one is between average developers and researchers at 
companies like Google. I very much doubt that we will see an 
equivalent of today's high-end machine learning server on 
user's desk, let alone in anyone's pocket, within 7 years.


I disagree on both counts. First off, people were running 
supercomputers and UNIX workstations while you were piddling 
along on your PC decades ago. That changed nothing about what you 
were able to learn and accomplish on your PC. In fact, you were 
pr

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-12 Thread Joakim via Digitalmars-d

On Tuesday, 11 September 2018 at 08:34:31 UTC, Chris wrote:

On Tuesday, 11 September 2018 at 07:23:53 UTC, Joakim wrote:



I agree with a lot of what you say here, but I'm not sure what 
you mean by "first class support for mobile." What exactly do 
you believe D needs to reach that level?


Basically the things you describe. I was thinking of a stable 
and easy build system, e.g.


$ dub init android [iOS]

$ dub --arch=arm64


Yes, something like that should be done, but I won't be doing 
much with dub till next year. If anyone else is interested in 
doing it earlier, feel free.


And of course check which language features work (or don't 
work!) on ARM and write documentation.


Cf. https://kotlinlang.org/docs/reference/native-overview.html


I don't see any language features listed for Kotlin there, but 
ldc does have an AArch64 tracker issue, which lists what else 
needs to be done:


https://github.com/ldc-developers/ldc/issues/2153

It might be a good idea to set up a funding target to get the 
iOS port back up to speed again. I don't use Apple products so 
it won't be me picking up that porting work, but maybe Dan 
could be enticed to finish it as a paid project, since he did 
most of the voluntary work so far. I'm purely speculating, no 
idea if money changes the equation for him, just know that 
he's been too busy to work on it for the last couple years.


That'd be part of the first class support. That a dedicated 
team works on it. Volunteers are not enough. Once it's polished 
it will still need loads of maintenance.


I don't think there's a "dedicated team" for any platform that D 
runs on, so we don't have "first class support" for any platform 
then.


D is largely a volunteer effort: if that's not enough, maybe D 
isn't right for you. This isn't Kotlin or Swift, where one of the 
largest companies in the world puts full-time devs on the 
language and gives everything away for free because it suits 
their agenda.


In Apple's case, that means Swift doesn't really support Android 
and definitely doesn't support Android/AArch64, because putting 
full-time devs on getting Swift working well with Android doesn't 
suit their agenda of pushing iOS:


https://github.com/apple/swift/blob/master/docs/Android.md
https://blog.readdle.com/why-we-use-swift-for-android-db449feeacaf

However, since Swift is largely open source, there is a small 
company that claims to have added Android/AArch64 support to the 
Swift compiler:


https://www.scade.io

Kotlin is becoming more cross-platform now since google is more 
cross-platform, but then you're depending on google continually 
funding development on an OSS project, which they've backed out 
of before:


https://arstechnica.com/gadgets/2018/07/googles-iron-grip-on-android-controlling-open-source-by-any-means-necessary/

I don't fault google for making those choices, as nobody has a 
right to their OSS contributions, but it is something to consider 
when using any platform, and even more so for an OSS project: who 
is funding this and why? Will their model be sustainable?


There are no easy answers here: if you want a free-priced, OSS 
toolchain, you're going to be marching to the beat of someone's 
drum.


As for ongoing maintenance, Android/ARM was done years ago and 
hasn't taken much in the way of maintenance to keep most of the 
stdlib/dmd tests passing, so I don't think that's much of an 
issue.


btw, it was a thread _you_ started that finally spurred me to 
begin this Android port five years back, though I'd enquired 
about and had been considering it earlier:


https://forum.dlang.org/thread/yhulkqvlwnxjklnog...@forum.dlang.org

On Tuesday, 11 September 2018 at 16:50:33 UTC, Dejan Lekic wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
LDC recently added a linux/AArch64 CI for both its main 
branches and 64-bit ARM, ie AArch64, builds have been put out 
for both linux and Android. It does not seem that many are 
paying attention to this sea change that is going on with 
computing though, so let me lay out some evidence. ...


I mostly agree with you, Joakim. I own a very nice (but now 
old) ODROID U2 (check the ODROID XU4 or C2!) so ARM support is 
important for me...


Also, check this: 
https://www.hardkernel.com/main/products/prdt_info.php?g_code=G152875062626


HOWEVER, I think Iain is right - PPC64 and RISC-V are becoming 
more and more popular nowadays and may become more popular than 
ARM in the future but that future is unclear.


If and when they do, I'm sure D and other languages will be 
ported to them, but right now they're most definitely not.


I know because I actually looked for a RISC-V VPS on which to 
port ldc and found nothing. Conversely, I was able to rent out an 
ARM Cubieboard2 remotely four years back when I

Re: Mobile is the new PC and AArch64 is the new x64

2018-09-11 Thread Joakim via Digitalmars-d

On Tuesday, 11 September 2018 at 07:42:38 UTC, passenger wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:

[...]


Is it possible to develop versus a NVidia Jetson, CUDA included?


I think so, but I doubt anyone has ever actually tried it:

https://www.nvidia.com/en-us/autonomous-machines/embedded-systems-dev-kits-modules/

As for CUDA, Nicholas Wilson said recently that he could do 
something with it for his DCompute project with ldc, but no idea 
what the current status is:


https://forum.dlang.org/post/slijjptlxdrfgvoya...@forum.dlang.org


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-11 Thread Joakim via Digitalmars-d

On Tuesday, 11 September 2018 at 06:42:26 UTC, Chris wrote:

On Monday, 10 September 2018 at 19:28:01 UTC, aberba wrote:

On Monday, 10 September 2018 at 16:09:41 UTC, rjframe wrote:




That's exactly whats happening in Africa. The continent is 
leapfrogging from nothing to a smart phone thanks to China. 
Many don't know how to even use a PC. Especially the young 
and upcoming generation.


The smart phone market is really significant.


That's why I'm trying to draw attention to first class ARM 
support by the D Foundation (not just some voluntary efforts, 
much as they are appreciated). ARM in dmd wouldn't be a bad 
idea perhaps, as Manu suggested. It's become more than obvious 
over the last couple of years that mobile devices have become 
very important and that people often use them instead of PCs or 
laptops. Fewer and fewer developers can "escape" development 
for mobile ("we want an app too"), and if a language doesn't 
offer first class support for mobile, then devs won't bother 
with it. A lot of other (new) languages are directing their 
efforts towards mobile for a reason.


I agree with a lot of what you say here, but I'm not sure what 
you mean by "first class support for mobile." What exactly do you 
believe D needs to reach that level?


I think most of the heavy lifting with core language and stdlib 
support is done. What remains is polish and integration for the 
build process and possibly IDEs and of course, making sure there 
are D libraries to make mobile dev easier.


I will be sporadically polishing the build process over the 
coming months, by getting much more building through dub. I won't 
be doing anything with IDEs, as I don't use them.


As for libraries, that all depends on what you're doing and how 
much others in the D community want that too.


It might be a good idea to set up a funding target to get the iOS 
port back up to speed again. I don't use Apple products so it 
won't be me picking up that porting work, but maybe Dan could be 
enticed to finish it as a paid project, since he did most of the 
voluntary work so far. I'm purely speculating, no idea if money 
changes the equation for him, just know that he's been too busy 
to work on it for the last couple years.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-10 Thread Joakim via Digitalmars-d

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
LDC recently added a linux/AArch64 CI for both its main 
branches and 64-bit ARM, ie AArch64, builds have been put out 
for both linux and Android. It does not seem that many are 
paying attention to this sea change that is going on with 
computing though, so let me lay out some evidence.


[...]


Oh, I reported the AArch64 release of LDC to this blog a month 
ago, and I just saw that they wrote up an entry about it last 
week:


https://www.worksonarm.com/blog/woa-issue-65/


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-10 Thread Joakim via Digitalmars-d

On Monday, 10 September 2018 at 15:06:46 UTC, Claude wrote:

On Monday, 10 September 2018 at 13:43:46 UTC, Joakim wrote:
Despite all this, D may never do very well on mobile or 
AArch64, even though I think it's well-suited for that market. 
But at the very least, you should be looking at mobile and 
AArch64, as they're taking over the computing market.


Coming from ARM system programming for embedded systems, I'm 
also looking into AArch64. Having done some x86 assembly, ARM 
assembly was bliss, and AArch64 looks even better!


I also wish D will do well for embedded systems.


Radu has done good work getting D working with uClibc, for 
example with OpenWRT:


https://github.com/ldc-developers/ldc/issues/2810

yshui added a Musl port too, which can be used with the Alpine 
build of ldc available at the above ldc 1.11 link.


There have been a couple reports of companies trying to use ldc 
for this, but there are likely still bugs that need to be ironed 
out.


Re: Mobile is the new PC and AArch64 is the new x64

2018-09-10 Thread Joakim via Digitalmars-d

On Monday, 10 September 2018 at 14:00:43 UTC, Iain Buclaw wrote:
On 10 September 2018 at 15:43, Joakim via Digitalmars-d 
 wrote:
LDC recently added a linux/AArch64 CI for both its main 
branches and 64-bit ARM, ie AArch64, builds have been put out 
for both linux and Android. It does not seem that many are 
paying attention to this sea change that is going on with 
computing though, so let me lay out some evidence.




I've just got back from a conference where AArch64 was declared 
a disaster


Why?


and the future is now PPC64 and RISC-V.


If you're not joking, it still stands that AArch64 is the 
_present_, as the currently most widely-deployed CPU arch used 
for personal computing, ie in mobile devices.


Those other two arches are a joke right now, but you never know 
in the future. ARM was a joke compared to Intel a couple decades 
ago too. ;)


Mobile is the new PC and AArch64 is the new x64

2018-09-10 Thread Joakim via Digitalmars-d
LDC recently added a linux/AArch64 CI for both its main branches 
and 64-bit ARM, ie AArch64, builds have been put out for both 
linux and Android. It does not seem that many are paying 
attention to this sea change that is going on with computing 
though, so let me lay out some evidence.


At my workplace six years ago, the developers were all allocated 
a core i5 ultrabook - likely with 4 GBs of RAM and a 128 GB SSD, 
though I don't remember those specs - and a 27" 2560x1440 display 
with which to get our work done. I was pretty happy with the 
display, the best I'd ever used to that point. I'm guessing the 
setup cost my employer a couple thousand dollars per developer.


I picked up an Android/AArch64 smartphone earlier this year, with 
6 GBs of RAM, 128 GBs of flash, a Snapdragon 835 octa-core CPU, 
and a 5.5" 2560X1440 display. This is the fastest computer I've 
ever owned, and it fits in 6 cubic inches and weighs a little 
more than a third of a pound. It cost me approximately $700.


That is a stunning change in mobile capabilities in just six 
years, where what used to be a mobile developer workstation now 
comes packed into a smartphone at a fraction of the cost.


If you think the phone doesn't actually perform, I borrowed a 
2015-model Macbook Air with a core i5 and 4 GBs of RAM and built 
the last pure C++ version of ldc, 0.17, using both cores with 
`ninja -j5`. It took two minutes with clang from Homebrew, the 
same amount of time it takes me to build the same source on my 
smartphone with clang by running `ninja -j9`.


This phone has been my development hardware since early this 
year, by pairing it with a $30 bluetooth keyboard and a $5 stand 
to hold it up. I'm typing this long forum post up on it now.


Tech companies are starting to realize this and going after the 
desktop/laptop PC markets with various 64-bit ARM products:


https://www.engadget.com/2018/08/09/samsung-galaxy-note-9-dex/
https://arstechnica.com/gadgets/2018/08/samsungs-tab-s4-is-both-an-android-tablet-and-a-desktop-computer/
https://youtube.com/watch?v=uLvIAskVSUM
https://www.anandtech.com/show/13309/lenovo-yoga-c630-snapdragon-850-windows

That last link notes 25 hours of battery life with a 
Windows/AArch64 laptop, one of the key benefits of ARM, which is 
why even Microsoft has come around.


Yes, I know, these devices won't replace your quad-core Xeon 
workstation with 32-64 GBs of RAM anytime soon, but most people 
don't need anywhere near that level of compute. That's why PC 
sales keep dropping while mobile sales are now 6-7X that per year:


https://www.businessinsider.com/PC-sales-are-continuing-to-slump-fewer-are-sold-now-than-when-the-iPhone-launched/articleshow/62547330.cms
https://www.androidauthority.com/smartphone-sales-idc-2018-871363/

Most of those mobile devices running iOS have AArch64 CPUs, and 
google said last December that "over 40% of Android devices 
coming online have 64-bit support," which is why they're 
requiring apps with native libraries to support it by next fall:


https://android-developers.googleblog.com/2017/12/improving-app-security-and-performance.html

D now has mostly working AArch64 support, with the ldc 1.11 
release last month:


https://github.com/ldc-developers/ldc/releases/tag/v1.11.0

That is the result of years of intermittent AArch64 patches added 
by the core teams of ldc and gdc- David, Iain, Kai, Johannes, 
Dan, and others- to which I recently added some Android patches. 
You too can pitch in with the few remaining issues or try out the 
AArch64 support with your own D code.


This company provides a free linux/AArch64 CI for OSS projects; 
LDC uses it:


http://blog.shippable.com/shippable-arm-packet-deliver-native-ci-cd-for-arm-architecture

Despite all this, D may never do very well on mobile or AArch64, 
even though I think it's well-suited for that market. But at the 
very least, you should be looking at mobile and AArch64, as 
they're taking over the computing market.


Re: This is why I don't use D.

2018-09-10 Thread Joakim via Digitalmars-d
On Monday, 10 September 2018 at 01:27:20 UTC, Neia Neutuladh 
wrote:
On Wednesday, 5 September 2018 at 05:44:38 UTC, H. S. Teoh 
wrote:

To me, this strongly suggests the following idea:
- add *all* dlang.org packages to our current autotester / CI
  infrastructure.
- if a particular (version of a) package builds successfully,
  log the compiler version / git hash / package version to a
  database and add a note to dlang.org that this package built
  successfully with this compiler version.


Not on dlang.org anywhere, but I built a crude version of this. 
Results are available at http://ikeran.org/report/.


The current backfill is taking the three most recent versions 
of each package on the ~40 most recent versions of dmd, using a 
list of dub packages from a couple days ago. Each compiler 
version takes about six hours, so the backfill will be finished 
in about ten days. The report should update automatically every 
100 builds.


Once that's done, I'll extend the backfill to handle all 
package versions, have it get new package versions from dub, 
and get it to discover and download new DMD versions 
automatically. This shouldn't be a huge change.


Nice work. I wonder about some of your results, as it says that 
dub itself doesn't build with all of the dmd versions, but 
somehow the tests pass sometimes (shouldn't be possible if you 
can't build dub itself). I just tested with `dub fetch dub; dub 
build dub` with the dub 1.10 that comes alongside dmd 2.081.1 on 
linux/x64 and it built the latest dub 1.11 without a problem.


Regardless, you should talk to Sonke about getting this 
integrated with code.dlang.org and to Mike about putting up a 
funding target to pay for both permanent server resources and 
your time coding this up. I've already said I'd donate towards 
such a target:


https://forum.dlang.org/post/acxedxzzesxkyomrs...@forum.dlang.org


Re: This is why I don't use D.

2018-09-08 Thread Joakim via Digitalmars-d

On Saturday, 8 September 2018 at 01:32:19 UTC, Everlast wrote:
On Saturday, 8 September 2018 at 00:53:33 UTC, Neia Neutuladh 
wrote:

[...]


There are ways around this:

[...]


Is there any other language that does any of this? I don't think 
any language does all of it, so do you plan to wait a couple 
decades for any language to do it before you start programming?


Some of these are worthwhile suggestions, but nobody's doing them 
yet. I suggest you start implementing them somewhere- maybe here, 
dub is an OSS project: https://github.com/dlang/dub - rather than 
lecturing others about how it's all so obvious. That way, you can 
actually get some work done before the cows come home. ;)


Re: What changes to D would you like to pay for?

2018-09-06 Thread Joakim via Digitalmars-d

On Friday, 7 September 2018 at 05:31:22 UTC, Mike Franklin wrote:

On Wednesday, 5 September 2018 at 07:00:49 UTC, Joakim wrote:
The D foundation is planning to add a way for us to pay for 
changes we'd like to see in D and its ecosystem, rather than 
having to code everything we need ourselves or find and hire a 
D dev to do it:


"[W]e’re going to add a page to the web site where we can 
define targets, allow donations through Open Collective or 
PayPal, and track donation progress. Each target will allow us 
to lay out exactly what the donations are being used for, so 
potential donors can see in advance where their money is 
going. We’ll be using the State of D Survey as a guide to 
begin with, but we’ll always be open to suggestions, and we’ll 
adapt to what works over what doesn’t as we go along."

https://dlang.org/blog/2018/07/13/funding-code-d/

I'm opening this thread to figure out what the community would 
like to pay for specifically, so we know what to focus on 
initially, whether as part of that funding initiative or 
elsewhere. I am not doing this in any official capacity, just 
a community member who would like to hear what people want.


Please answer these two questions if you're using or would 
like to use D, I have supplied my own answers as an example:


1. What D initiatives would you like to fund and how much 
money would you stake on each? (Nobody is going to hold you to 
your numbers, but please be realistic.)


I'd be willing to pay at least $100 each for these two:

https://issues.dlang.org/show_bug.cgi?id=19159
https://issues.dlang.org/show_bug.cgi?id=18788

Quite honestly, though, I probably wouldn't do it myself for 
$100.  These bounties really need to be $500 or more.


If D is to be funded by individuals, there needs to be some way 
to organize individuals around common interest and raise funds 
for those tasks.


Yes, that's the point of the funding targets mentioned in the 
quoted blog post and this thread, to see who's interested in 
collectively pooling towards certain common goals. Obviously no 
one person's contribution would be enough to get any non-trivial 
goal funded.


Given the anemic response to this thread and the OpenCollective 
so far, I suspect we wouldn't raise much though. OTOH, maybe the 
people who would pay don't read the forum.


For example, the D Language Foundation has a "Corporate Bronze" 
offer on its OpenCollective page that includes 3 priority bug 
fixes per month for $12,000.  If we could get 24 like-minded 
people, willing to contribute $500 each, and vote on priority 
bugs, that could potentially get things moving in the right 
direction.  That would be 1 1/2 bugs per contributor.  I don't 
think that's bad.  I'd be willing to join such a collective if 
I got at least 1 priority bug fix out of it.


Yes, these types of paid bugfix schemes are what I describe above 
too.


Even better, IMO, it'd be nice if the "Individual Sponsor" or 
"Organizational Sponsor" offers on the OpenCollective page 
included at least 1 priority bug fix.


That would make sense for the latter offer.

Btw, if anyone is under any illusion that I'm offering to 
implement any of this for the money, I have zero interest in 
doing this work. I _am_ interested in paying others to do it. I 
may tinker with enabling the GC on the DMD frontend some day, but 
that wouldn't be for any bounties, just OSS.


Re: This is why I don't use D.

2018-09-06 Thread Joakim via Digitalmars-d
On Thursday, 6 September 2018 at 18:20:05 UTC, Bastiaan Veelo 
wrote:
On Wednesday, 5 September 2018 at 05:44:38 UTC, H. S. Teoh 
wrote:

To me, this strongly suggests the following idea:
- add *all* dlang.org packages to our current autotester / CI
  infrastructure.
- if a particular (version of a) package builds successfully,
  log the compiler version / git hash / package version to a
  database and add a note to dlang.org that this package built
  successfully with this compiler version.
- if a particular (version of a) package fails to build for
  whatever reason, log the failure and have a bot add a note to
  dlang.org that this package does NOT build with that compiler
  version.
   - possibly add the package to a blacklist for this compiler
     version so that we don't consume too many resources on
     outdated packages that no longer build.
- periodically update dlang.org (by bot) to indicate the last
  known compiler version that successfully built this package.
- in the search results, give preference to packages that built
  successfully with the latest official release.


Yes please!


Ah, but would you actually pay for such a service to be set up?

https://forum.dlang.org/thread/acxedxzzesxkyomrs...@forum.dlang.org

It's all well and good to hope for such services, but they're 
unlikely to happen unless paid for.


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-09-06 Thread Joakim via Digitalmars-d

On Thursday, 6 September 2018 at 16:44:11 UTC, H. S. Teoh wrote:
On Thu, Sep 06, 2018 at 02:42:58PM +, Dukc via 
Digitalmars-d wrote:

On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
> // D
> auto a = "á";
> auto b = "á";
> auto c = "\u200B";
> auto x = a ~ c ~ a;
> auto y = b ~ c ~ b;
> 
> writeln(a.length); // 2 wtf

> writeln(b.length); // 3 wtf
> writeln(x.length); // 7 wtf
> writeln(y.length); // 9 wtf

[...]

This is an unfair comparison.  In the Swift version you used 
.count, but here you used .length, which is the length of the 
array, NOT the number of characters or whatever you expect it 
to be.  You should rather use .count and specify exactly what 
you want to count, e.g., byCodePoint or byGrapheme.


I suspect the Swift version will give you unexpected results if 
you did something like compare "á" to "a\u301", for example 
(which, in case it isn't obvious, are visually identical to 
each other, and as far as an end user is concerned, should only 
count as 1 grapheme).


Not even normalization will help you if you have a string like 
"a\u301\u302": in that case, the *only* correct way to count 
the number of visual characters is byGrapheme, and I highly 
doubt Swift's .count will give you the correct answer in that 
case. (I expect that Swift's .count will count code points, as 
is the usual default in many languages, which is unfortunately 
wrong when you're thinking about visual characters, which are 
called graphemes in Unicode parlance.)
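For what it's worth, that normalization claim is easy to verify; here's a minimal Python 3 sketch (Python only because its standard library exposes NFC normalization directly):

```python
import unicodedata

# One visual character: 'a' + combining acute + combining circumflex.
s = "a\u0301\u0302"

# NFC composes 'a' + acute into U+00E1, but the circumflex remains a
# separate combining mark, so even the normalized string is not one
# code point. Only grapheme-aware counting gives 1 here.
nfc = unicodedata.normalize("NFC", s)
print(len(s))    # 3 code points before normalization
print(len(nfc))  # 2 code points after NFC, still not 1
```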


No, Swift counts grapheme clusters by default, so it gives 1. I 
suggest you read the linked Swift chapter above. I think it's the 
wrong choice for performance, but they chose to emphasize 
intuitiveness for the common case.


I agree with most of the rest of what you wrote about programmers 
having no silver bullet to avoid Unicode's and languages' 
complexity.
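As a rough illustration of what grapheme-aware counting does differently, here is a hypothetical helper (my own approximation, not Swift's or D's actual algorithm; real grapheme segmentation follows Unicode's UAX #29 rules, which also cover ZWJ emoji sequences, Hangul jamo, and more):

```python
import unicodedata

def grapheme_count_approx(s: str) -> int:
    # Hypothetical helper: start a new grapheme at each code point
    # with combining class 0, i.e. every non-combining character.
    # Combining marks (class > 0) attach to the preceding base.
    return sum(1 for ch in s if unicodedata.combining(ch) == 0)

print(grapheme_count_approx("a\u0301\u0302"))  # 1 visual character
print(grapheme_count_approx("a\u0301 b"))      # 3: 'a'+acute, ' ', 'b'
```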


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-09-06 Thread Joakim via Digitalmars-d

On Thursday, 6 September 2018 at 09:35:27 UTC, Chris wrote:

On Thursday, 6 September 2018 at 08:44:15 UTC, nkm1 wrote:

On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright 
wrote:


Autodecode - I've suffered under that, too. The solution was 
fairly simple. Append .byCodeUnit to strings that would 
otherwise autodecode. Annoying, but hardly a showstopper.


import std.array : array;
import std.stdio : writefln;
import std.uni : byCodePoint, byGrapheme;
import std.utf : byCodeUnit;

void main() {
    string first = "á";
    writefln("%d", first.length);  // prints 2

    auto firstCU = "á".byCodeUnit; // type is `ByCodeUnitImpl` (!)
    writefln("%d", firstCU.length);  // prints 2

    auto firstGr = "á".byGrapheme.array;  // type is `Grapheme[]`
    writefln("%d", firstGr.length);  // prints 1

    auto firstCP = "á".byCodePoint.array; // type is `dchar[]`
    writefln("%d", firstCP.length);  // prints 1

    dstring second = "á";
    writefln("%d", second.length);  // prints 1 (That was easy!)

    // DMD64 D Compiler v2.081.2
}


And this has what to do with autodecoding?


Nothing. I was just pointing out how awkward some basic things 
can be. autodecoding just adds to it in the sense that it's a 
useless overhead but will keep string handling in a limbo 
forever and ever and ever.




TBH, it looks like you're just confused about how Unicode 
works. None of that is something particular to D. You should 
probably address your concerns to the Unicode Consortium. Not 
that they care.


I'm actually not confused since I've been dealing with Unicode 
(and encodings in general) for quite a while now. Although I'm 
not a Unicode expert, I know what the operations above do and 
why. I'd only expect a modern PL to deal with Unicode correctly 
and have some guidelines as to the nitty-gritty.


Since you understand Unicode well, enlighten us: what's the best 
default format to use for string iteration?


You can argue that D chose the wrong default by having the stdlib 
auto-decode to code points in several places, and Walter and a 
host of the core D team would agree with you, and you can add me 
to the list too. But it's not clear there should be a default 
format at all, other than whatever you started off with, 
particularly for a programming language that values performance 
like D does, as each format choice comes with various speed vs. 
correctness trade-offs.


Therefore, the programmer has to understand that complexity and 
make his own choice. You're acting like there's some obvious 
choice for how to handle Unicode that we're missing here, when 
the truth is that _no programming language knows how to handle 
unicode well_, since handling a host of world languages in a 
single format is _inherently unintuitive_ and has significant 
efficiency tradeoffs between the different formats.
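Those trade-offs are concrete: the same text has a different "length" at each representation layer. A small Python sketch of the code-unit counts per encoding:

```python
# 'á' (U+00E1) is one code point; the emoji U+1F600 needs a surrogate
# pair in UTF-16. Each encoding trades space against fixed-width access.
for s in ("\u00e1", "\U0001F600"):
    print(len(s.encode("utf-8")),           # UTF-8 code units (bytes)
          len(s.encode("utf-16-le")) // 2,  # UTF-16 code units
          len(s.encode("utf-32-le")) // 4)  # UTF-32 units == code points
```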


And once again, it's the user's fault as in having some basic 
assumptions about how things should work. The user is just too 
stupid to use D properly - that's all. I know this type of 
behavior from the management of pubs and shops that had to 
close down, because nobody would go there anymore.


Do you know the book "Crónica de una muerte anunciada" 
(Chronicle of a Death Foretold) by Gabriel García Márquez?


"The central question at the core of the novella is how the 
death of Santiago Nasar was foreseen, yet no one tried to stop 
it."[1]


[1] 
https://en.wikipedia.org/wiki/Chronicle_of_a_Death_Foretold#Key_themes


You're not being fair here, Chris. I just saw this SO question 
that I think exemplifies how most programmers react to Unicode:


"Trying to understand the subtleties of modern Unicode is making 
my head hurt. In particular, the distinction between code points, 
characters, glyphs and graphemes - concepts which in the simplest 
case, when dealing with English text using ASCII characters, all 
have a one-to-one relationship with each other - is causing me 
trouble.


Seeing how these terms get used in documents like Matthias 
Bynens' JavaScript has a unicode problem or Wikipedia's piece on 
Han unification, I've gathered that these concepts are not the 
same thing and that it's dangerous to conflate them, but I'm kind 
of struggling to grasp what each term means.


The Unicode Consortium offers a glossary to explain this stuff, 
but it's full of "definitions" like this:


Abstract Character. A unit of information used for the 
organization, control, or representation of textual data. ...


...

Character. ... (2) Synonym for abstract character. (3) The basic 
unit of encoding for the Unicode character encoding. ...


...

Glyph. (1) An abstract form that represents one or more glyph 
images. (2) A synonym for glyph image. In displaying Unicode 
character data, one or more glyphs may be selected to depict a 
particular character.


...

Grapheme. (1) A minimally distinctive unit of writing in the 
context of a particular writing system. ...



Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-09-06 Thread Joakim via Digitalmars-d

On Thursday, 6 September 2018 at 07:23:57 UTC, Chris wrote:
On Wednesday, 5 September 2018 at 22:00:27 UTC, H. S. Teoh 
wrote:




//

Seriously, people need to get over the fantasy that they can 
just use Unicode without understanding how Unicode works.  
Most of the time, you can get the illusion that it's working, 
but 99% of the time the code is actually wrong and 
will do the wrong thing when given an unexpected (but still 
valid) Unicode string.  You can't drive without a license, and 
even if you try anyway, the chances of ending up in a nasty 
accident are pretty high.  People *need* to learn how to use 
Unicode properly before complaining about why this or that 
doesn't work the way they thought it should work.



T


Python 3 gives me this:

print(len("á"))
1

and so do other languages.


The same Python 3 that people criticize for having unintuitive 
unicode string handling?


https://learnpythonthehardway.org/book/nopython3.html

Is it asking too much to ask for `string` (not `dstring` or 
`wstring`) to behave as most people would expect it to behave 
in 2018 - and not like Python 2 from days of yore? But of 
course, D users should have a "Unicode license" before they do 
anything with strings. (I wonder is there a different license 
for UTF8 and UTF16 and UTF32, Big / Little Endian, BOM? Just 
asking.)


Yes and no, unicode is a clusterf***, so every programming 
language is having problems with it.
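To be fair to D, Python 3's intuitiveness only goes so far: len() counts code points, so two visually identical spellings of the same character compare unequal and report different lengths:

```python
# Python 3's len() counts code points, not graphemes, so the
# precomposed and decomposed spellings of 'á' disagree.
precomposed = "\u00e1"   # á as a single code point
decomposed  = "a\u0301"  # 'a' + combining acute accent
print(precomposed == decomposed)  # False
print(len(precomposed))           # 1
print(len(decomposed))            # 2
```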


So again, for the umpteenth time, it's the users' fault. I see. 
Ironically enough, it was the language developers' lack of 
understanding of Unicode that led to string handling being a 
nightmare in D in the first place. Oh lads, if you were 
politicians I'd say that with this attitude you're gonna lose the 
next election. I say this, because many times the posts by 
(core) developers remind me so much of politicians who are 
completely detached from the reality of the people. Right oh!


You have a point that it was D devs' ignorance of unicode that 
led to the current auto-decoding problem. But let's have some 
nuance here, the problem ultimately is unicode.


What changes to D would you like to pay for?

2018-09-05 Thread Joakim via Digitalmars-d
The D foundation is planning to add a way for us to pay for 
changes we'd like to see in D and its ecosystem, rather than 
having to code everything we need ourselves or find and hire a D 
dev to do it:


"[W]e’re going to add a page to the web site where we can define 
targets, allow donations through Open Collective or PayPal, and 
track donation progress. Each target will allow us to lay out 
exactly what the donations are being used for, so potential 
donors can see in advance where their money is going. We’ll be 
using the State of D Survey as a guide to begin with, but we’ll 
always be open to suggestions, and we’ll adapt to what works over 
what doesn’t as we go along."

https://dlang.org/blog/2018/07/13/funding-code-d/

I'm opening this thread to figure out what the community would 
like to pay for specifically, so we know what to focus on 
initially, whether as part of that funding initiative or 
elsewhere. I am not doing this in any official capacity, just a 
community member who would like to hear what people want.


Please answer these two questions if you're using or would like 
to use D, I have supplied my own answers as an example:


1. What D initiatives would you like to fund and how much money 
would you stake on each? (Nobody is going to hold you to your 
numbers, but please be realistic.)


$50 - Parallelize the compiler, particularly ldc, so that I can 
pass it -j5 and have it use five cores _and_ not have the bloat 
of separate compiler invocation for each module/package, ie 
taking up more memory or time.


$30 - Implement H.S. Teoh's suggestion of having an automated 
build system to periodically check which dub packages are 
building with official compiler releases:


https://forum.dlang.org/post/mailman.3611.1536126324.29801.digitalmar...@puremagic.com

$25 - Enable GC for the DMD frontend, so that dmd/gdc/ldc use 
less memory


I would also stake smaller amounts on various smaller bugs, if 
there were a better interface than bountysource and people were 
actually using it, ie users knew about and were staking money and 
D core devs were fixing those bugs and claiming that money.


2. Would you be okay with the patches you fund not being 
open-sourced for a limited time, with the time limit or funding 
threshold for open source release specified ahead of time, to 
ensure that funding targets are hit?


Yes, as long as everything is open-sourced eventually, I'm good.


Re: extern(C++, ns) is wrong

2018-09-04 Thread Joakim via Digitalmars-d

On Wednesday, 5 September 2018 at 01:20:26 UTC, Manu wrote:
On Tue, 4 Sep 2018 at 17:50, tide via Digitalmars-d 
 wrote:


On Wednesday, 5 September 2018 at 00:35:50 UTC, Manu wrote:
> On Tue, 4 Sep 2018 at 17:30, tide via Digitalmars-d 
>  wrote:

>> [...]
>
> And yes, the example is actually complete. Again, but I'll 
> simplify the filenames:

>
> ns/bar.d
> -
> module ns.bar;
> import ns.baz;
> extern(C++, ns):
>
> ns/baz.d
> -
> module ns.baz;
> import ns.bar;
> extern(C++, ns):
>
>
>> [...]

Judging by the name of the modules are you working on an 
Entity Component System for D :o ?


Well, I'm clearly trying to call C++ code >_<


I suggest you privately email Walter the exact code files you're 
writing, with the exact reasons you think his workarounds are too 
onerous. These piecemeal forum posts are going nowhere.


Re: D is dead (was: Dicebot on leaving D: It is anarchy driven development in all its glory.)

2018-09-04 Thread Joakim via Digitalmars-d
On Tuesday, 4 September 2018 at 13:34:03 UTC, 
TheSixMillionDollarMan wrote:

On Tuesday, 4 September 2018 at 01:36:53 UTC, Mike Parker wrote:

On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:

And of course, low manpower and funding aren't the complete 
picture. Management also play a role. Both Walter and Andrei 
have freely admitted they are not managers and that they're 
learning as they go. Mistakes have been made. In hindsight, 
some decisions should have gone a different way. But that is 
not the same as not caring, or not understanding.


So please, don't attribute any disingenuous motives to any of 
the core team members. They all want D to succeed. Identifying 
core problems and discussing ways to solve them is a more 
productive way to spend our bandwidth.


I think D's 'core' problem, is that it's trying to compete 
with, what are now, widely used, powerful, and well supported 
languages, with sophisticate ecosystems in place already. 
C/C++/Java/C# .. just for beginners.


Then it's also trying to compete with startup languages (Go, 
Rust ) - and some of those languages have billion dollar 
organisations behind them, not to mention the talent levels of 
their *many* designers and contributors.


C++ is much more than just a language. It's an established, 
international treaty on what the language must be.


Java is backed by Oracle (one of the largest organisations 
in the world).


Go is backed by Google...Rust by Mozilla...(both billion dollar 
global companies).


So one has to wonder, what would motivate a person (or an 
organisation) to focus their attention on D.


That is not a statement about the quality of D. It's a 
statement about the competitive nature of programming languages.


If you've ever read 'No Contest - the case against competition' 
by Alfie Kohn, then you'd know (or at least you might agree 
with that statement) that competition  is not an inevitable 
part of human nature. "It warps recreation by turning the 
playing into a battlefield."


I wonder if that has already happened to D.

D should, perhaps, focus on being a place for recreation, where 
one can focus on technical excellence, instead of trying to 
compete in the battlefield.


I just do not see how D can even defeat its major competitors.

Instead D could be a place where those competitors come to look 
for great ideas (which, as I understand it, does occur .. 
ranges for example).


In any case, you have to work out what it is, that is going to 
motivate people to focus their attention on D.


You seem to be saying that, raising money so you can pay 
people, is enough.


But I wonder about that.


That's a good question, let me see if I can answer it.

Do you know what the first search engine for the web was and when 
it was created? It wasn't Yahoo, google, or Bing:


https://en.m.wikipedia.org/wiki/Web_search_engine#History

The first search engines were created in 1993, google came along 
in 1998 after at least two dozen others in that list, and didn't 
make a profit till 2001. Some of those early competitors were 
giant "billion dollar global companies," yet it's google that 
dominates the web search engine market today.


Why is that? Well, for one, resources don't matter for software 
on the internet as much as ideas. It's not that resources don't 
matter, but that they take a back seat to your fundamental design 
and the ideas behind it.


And coming up with that design and ideas takes time, the 
"developmental stage" that Laeeth refers to above. In that 
incubation stage, you're better off _not_ having a bunch of 
normal users who want a highly polished product, just a bunch of 
early adopters who can give you good feedback and are okay with 
rough edges. For D, that means all the advanced features don't 
fully play together well yet, and there are various bugs here and 
there. To use it, you have to be okay with that.


Now, it's a fair question to ask when D will leave that 
developmental stage and get more resources towards that polish, 
as Chris asks, and I'm not saying I know the answers to those 
questions. And let me be clear: as long as you don't push the 
envelope with mixing those advanced D features and are okay 
working around some bugs here and there, you're probably good now.


But simply asserting that others are rushing full-speed ahead 
with more resources and therefore they will win completely 
misunderstands how the game has changed online. Resources do 
matter, but they're not the dominant factor like they used to be 
for armies or manufacturing. Ideas are now the dominant factor, 
and D has plenty of those. ;)


D IDE

2018-09-03 Thread Joakim via Digitalmars-d
On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis 
wrote:
But if you're ever expecting IDE support to be a top priority 
of many of the contributors, then you're going to be sorely 
disappointed. It's the sort of thing that we care about because 
we care about D being successful, but it's not the sort of 
thing that we see any value in whatsoever for ourselves


Why is that? I've never used an IDE much, but I wonder why you 
don't and what your impressions are of why many other core D 
users don't either.


Re: DMD cross compiler

2018-09-03 Thread Joakim via Digitalmars-d

On Monday, 3 September 2018 at 09:21:21 UTC, bauss wrote:

On Sunday, 2 September 2018 at 01:52:18 UTC, Joakim wrote:

On Saturday, 1 September 2018 at 20:12:24 UTC, Manu wrote:

[...]


What specifically do you want to cross-compile to, something 
like Windows to macOS? LDC already does all this, ie the one 
compiler cross-compiles to every other platform with a single 
flag, may just want to use it.


Not everyone can or want to use LDC.


Why not? If you're not optimizing or iterating on your code, it's 
a reasonable replacement. If you're optimizing, you should only 
be using LDC or gdc.


Re: [OT] college

2018-09-03 Thread Joakim via Digitalmars-d
On Sunday, 2 September 2018 at 19:30:58 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/02/2018 05:43 AM, Joakim wrote:
Most will be out of business within a decade or two, as online 
learning takes their place.


I kinda wish I could agree with that, but schools are too much 
of a sacred cow to be going anywhere anytime soon. And for that 
matter, the online ones still have to tackle many of the same 
challenges anyway, WRT successful and effective teaching.


Really the only difference is "physical classroom vs no 
physical classroom". Well, that and maybe price, but the 
community colleges have had the uni's well beat on price for a 
long time (even manage to do a good job teaching certain 
things, depending on the instructor), but they haven't made the 
uni's budge: The best they've been able to do is establish 
themselves as a supplement to the uni's, where people start out 
with some of their gen-ed classes at the (comparatively) cheap 
community colleges for the specific purpose of later 
transferring to a uni.


That's because what the current online efforts do is simply slap 
the in-class curricula online, whereas what really needs to be 
done is completely change what's taught, away from the incoherent 
mix of theory and Java that basically describes every degree 
(non-CS too), and how it's tested and certified. When that 
happens, the unis will collapse, because online learning will be 
so much better at a fraction of the cost.


As for sacred cows, the newspaper business was one of them, ie 
Journalism, but it's on death's door, as I pointed out in this 
forum years ago:


https://en.m.wikipedia.org/wiki/File:Naa_newspaper_ad_revenue.svg

There are a lot of sacred cows getting butchered by the internet, 
college will be one of the easier ones to get rid of.


On Sunday, 2 September 2018 at 21:07:20 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/01/2018 03:47 PM, Everlast wrote:


It's because programming is done completely wrong. All we do 
is program like it's 1952 all wrapped up in a nice box and bow 
tie. WE should have tools and a compiler design that all work 
interconnected with complete graphical interfaces that aren't 
based in the text gui world(an IDE is just a fancy text 
editor). I'm talking about 3D code representation using 
graphics so projects can be navigated  visually in a dynamic 
way and many other things.


There are really two main, but largely independent, aspects to 
what you're describing: Visual representation, and physical 
interface:


A. Visual representation:
-

By visual representation, I mean "some kind of text, or UML-ish 
diagrams, or 3D environment, etc".


What's important to keep in mind here is: The *fundamental 
concepts* involved in programming are inherently abstract, and 
thus equally applicable to whatever visual representation is 
used.


If you're going to make a diagram-based or VR-based programming 
tool, it will still be using the same fundamental concepts that 
are already established in text-based programming: Imperative 
loops, conditionals and variables. Functional/declarative 
immutability, purity and high-order funcs. Encapsulation. 
Pipelines (like ranges). Etc. And indeed, all GUI based 
programming tools have worked this way. Because how *else* are 
they going to work?


If what you're really looking for is something that replaces or 
transcends all of those existing, fundamental programming 
concepts, then what you're *really* looking for is a new 
fundamental programming concept, not a visual representation. 
And once you DO invent a new fundamental programming concept, 
being abstract, it will again be applicable to a variety of 
possible visual representations.


That said, it is true some concepts may be more readily 
amenable to certain visual representations than others. But, at 
least for all the currently-known concepts, any combination of 
concept and representation can certainly be made to work.


B. Physical interface:
--

By this I mean both actual input devices (keyboards, 
controllers, pointing devices) and also the mappings from their 
affordances (ie, what you can do with them: push button x, tilt 
stick's axis Y, point, move, rotate...) to specific actions 
taken on the visual representation (navigate, modify, etc.)


The mappings, of course, tend to be highly dependant on the 
visual representation (although, theoretically, they don't 
strictly HAVE to be). The devices themselves, less so: For 
example, many of us use a pointing device to help us navigate 
text. Meanwhile, 3D modelers/animators find it's MUCH more 
efficient to deal with their 3D models and environments by 
including heavy use of the keyboard in their workflow instead 
of *just* a mouse and/or wacom alone.


An important point here, is that using a keyboard has a 
tendency to be

[OT] college

2018-09-02 Thread Joakim via Digitalmars-d
On Sunday, 2 September 2018 at 07:56:09 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/02/2018 02:06 AM, Joakim wrote:
On Sunday, 2 September 2018 at 05:16:43 UTC, Nick Sabalausky 
(Abscissa) wrote:


Smug as I may have been at the time, it wasn't until 
later I realized the REAL smart ones were the ones out 
partying, not the grads or the nerds like me.


Why? Please don't tell me you believe this nonsense:

"Wadhwa... argues (I am not joking) that partying is a 
valuable part of the college experience because it teaches 
students interpersonal skills."

https://www.forbes.com/sites/jerrybowyer/2012/05/22/a-college-bubble-so-big-even-the-new-york-times-and-60-minutes-can-see-it-sort-of/



Learning skills from partying? Hah hah, no, no, it's not about 
anything like that. :) (Social skills matter, but obviously 
plenty of other ways to practice those.)


No, it's just that honing skills isn't the only thing in life 
that matters. Simply living life while you're here is important 
too, for its own sake, even if you only realize it after the 
fact.


Eh, having fun should be part of the college experience of 
course, but I suspect most of those out partying were taking it 
far beyond that. I bet many of them regret that today.


You're right that college is largely an irrelevant playground, 
precisely because of the incoherent combination of theory and 
bows to what they think are popular industry practices that you 
laid out. Most will be out of business within a decade or two, as 
online learning takes their place.


Re: This thread on Hacker News terrifies me

2018-09-01 Thread Joakim via Digitalmars-d
On Sunday, 2 September 2018 at 05:16:43 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/02/2018 12:53 AM, Jonathan M Davis wrote:


Ouch. Seriously, seriously ouch.



Heh, yea, well...that particular one was state party school, 
so, what y'gonna do? *shrug*


Smug as I may have been at the time, it wasn't until 
later I realized the REAL smart ones were the ones out 
partying, not the grads or the nerds like me.


Why? Please don't tell me you believe this nonsense:

"Wadhwa... argues (I am not joking) that partying is a valuable 
part of the college experience because it teaches students 
interpersonal skills."

https://www.forbes.com/sites/jerrybowyer/2012/05/22/a-college-bubble-so-big-even-the-new-york-times-and-60-minutes-can-see-it-sort-of/


Re: DMD cross compiler

2018-09-01 Thread Joakim via Digitalmars-d

On Saturday, 1 September 2018 at 20:12:24 UTC, Manu wrote:
I know there's been discussion on this before, I just want a 
definitive reference.


It looks like it would be relatively straightforward for DMD to be 
a cross-compiler. A few version() statements could be runtime if's, 
and that's pretty much it. When hacking on parts of DMD, I 
frequently make hacks that turn such versions into runtime if's to 
test multiple targets from the one dev workflow.

It would be about 100 times more convenient to supply an arg, 
than make hacks all over the code... so, why not?


What specifically do you want to cross-compile to, something like 
Windows to macOS? LDC already does all this, ie the one compiler 
cross-compiles to every other platform with a single flag, may 
just want to use it.


Re: Who can make Phobos faster to import?

2018-08-30 Thread Joakim via Digitalmars-d

On Wednesday, 27 December 2017 at 15:12:28 UTC, Joakim wrote:

On Wednesday, 27 December 2017 at 10:40:38 UTC, RazvanN wrote:

[...]


I think I did all this with a dead-simple patch to dmd three 
years ago:


[...]


Hey Razvan, are you doing anything with this? Never heard back 
after my last post, so figured you were going your own way with 
this tool.


Re: [OT] Leverage Points

2018-08-30 Thread Joakim via Digitalmars-d

On Monday, 20 August 2018 at 12:26:25 UTC, Laeeth Isharc wrote:

On Monday, 20 August 2018 at 11:55:33 UTC, Joakim wrote:
"So how do you change paradigms? Thomas Kuhn, who wrote the 
seminal book about the great paradigm shifts of science, has 
a lot to say about that. In a nutshell, you keep pointing at 
the anomalies and failures in the old paradigm, you keep 
coming yourself, and loudly and with assurance from the new 
one, you insert people with the new paradigm in places of 
public visibility and power. You don’t waste time with 
reactionaries; rather you work with active change agents and 
with the vast middle ground of people who are open-minded."


(Quoting from the article I think).

Kuhn and Lakatos.  Paradigm shifts don't take place when the 
dominant paradigm is defeated by logical or empirical means.  
Paradigm shifts take place when for some reason people say "how 
about we stop talking about that, and start talking about this 
instead".


Not sure why you'd call that anything other than defeat. :)

I think he described certain political changes in the Western 
World beginning in the mid to late 60s rather well.  I don't 
think it describes how changes in the sphere of voluntary 
(non-political ie market and genuine civil society) activity 
unfold.  Supposing it were a good idea (which it isn't), how 
would one be able to to insert people in places of public 
visibility and power who put forward a point of view that is 
very different from the prevailing one?  Only via a program of 
entryism, and I don't think that in the end much good will come 
of that.


By convincing those with power/visibility that the contrary view 
is worth integrating? Look at Microsoft's about-face on open 
source over a couple decades, going from denigrating it to buying 
open-source producing or supporting companies like Xamarin and 
Github and open-sourcing several of their own projects, as an 
example.


So I think the original author has cause and effect the wrong 
way around (not too surprisingly because he is talking about 
things that relate to politics and activism).  [NB one 
shouldn't mention the Club of Rome without mentioning what a 
failure their work was, and it was predictably and indeed 
predicted to be a failure for the exact same reasons it failed].


It isn't that you insert people representing the new paradigm 
in positions of influence and power.


It is that people from the emerging new paradigm - which is 
nothing, a bunch of no-hopers, misfits and losers viewed from a 
conventional perspective - by virtue of the fact that it has 
something useful to say and has drawn highly talented people 
who recognise that, start ever so slowly to begin things and 
eventually to accomplish things - still on the fringes - and 
over time this snowballs.  After a while turns out that they 
are no longer on the fringes but right at the centre of things, 
in part because the centre has moved.


The best illustration of this phenomenon was I think in a work 
of fiction - Neal Stephenson's Cryptonomicon.  I never expected 
someone to write a novel based on a mailing list - the 
cypherpunks.  It was about as surprising to me then as it would 
be to see Dlang - the movie - today.  And of course that itself 
was an early indication that the ideas and ways of thinking 
represented by what was originally quite a small community were 
on the ascent.


I agree that she's looking at it from the point of view of 
governmental change for her environmental agenda, whereas the 
market is more likely to have entirely new institutions - it used 
to be new _companies_, but with the internet it's now increasingly 
decentralized operations like the community behind bitcoin or 
BitTorrent... or D - form that become much more important than the 
old ones: creative destruction. So, significantly open-source 
Android replaces mostly closed Windows as the dominant OS used by 
most consumers for personal computing, rather than Microsoft 
really getting the new religion much.


This pretty much reflects what Laeeth always says about 
finding principals who can make their own decisions about 
using D. "Places of public visibility and power" for D are 
commercial or open-source projects that attract attention 
for being well done or at least popular.


Well - I understand what you mean, but I don't recognise this 
as being my point.  Principals who can make their own decisions 
probably aren't today highly visible and visibly powerful.  The 
latter comes much later on in the development of a project, 
movement or scene and if you're visible it's a tax that takes 
time away from doing real work.  By the time you're on the 
front cover of Time or The Economist, it's as often as not the 
beginning of the end - at least for anything vital.


You're misreading what she wrote: she only said that you place 
new people in positions w

Re: Go 2 draft

2018-08-29 Thread Joakim via Digitalmars-d

On Wednesday, 29 August 2018 at 07:03:13 UTC, JN wrote:
Found this interesting link on proggit - 
https://go.googlesource.com/proposal/+/master/design/go2draft.md


D is mentioned in the generics part:

https://go.googlesource.com/proposal/+/master/design/go2draft-generics-overview.md


Interesting document, it's a good sign that the D generics syntax 
reads the best to me. Good to see Go recognize that and reuse it, 
though without the exclamation point that marks D template 
instantiations, which they should adopt too.


Re: D is dead (was: Dicebot on leaving D: It is anarchy driven development in all its glory.)

2018-08-28 Thread Joakim via Digitalmars-d

On Tuesday, 28 August 2018 at 13:39:40 UTC, Iain Buclaw wrote:

On Thursday, 23 August 2018 at 15:35:45 UTC, Joakim wrote:

On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:

On Thursday, 23 August 2018 at 06:58:13 UTC, Joakim wrote:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:

[...]


Can you list what you or other Weka devs believe those fatal 
flaws to be? Because you've not listed any here, which makes 
you no better than some noob that comes in here, says D has 
to get better or it will die, then can't articulate what 
they mean by "better" or worse, mentions something trivial. 
Of course, you've actually used the language for years, so 
presumably you've got some real concerns, but do you really 
think the bug you just posted is "fatal" to the language?


If you think there are fatal flaws, you might as well list 
them, whether technical or the development process, or you 
will just be ignored like any other noob who talks big and 
can't back it up. You may be ignored anyway, ;) but at least 
you'll have made a case that shows you know what you're 
talking about.


I'd define fatal as some that can be fixed, but breaks 100% 
of everyone's code, even if the change is net positive all 
round.


However how big a problem really is is in the eye of the 
beholder. An example:


Symptom: The compiler can't discard unused symbols at compile 
time, and so it will spend a lot of time pointlessly 
optimising code.


Problem: D has no notion of symbol visibility.

Possible Solution: Make all globals hidden by default unless 
'export'.


Side effects: Everyone will be spending weeks to months 
fixing their libraries in order to only mark what should be 
visible outside the current compilation unit as 'export'.


Benefits: Faster compile times, as in, in the most extreme 
example I've built one project on github with gdc -O2 and 
build time went from 120 seconds to just 3!


So your example of a fatal flaw is that D could be 100X faster 
at compilation instead of just 10X than most every other 
native language out there?! C'mon.




But that's not true. D isn't a fast language to compile, dmd is 
just a fast compiler.


You may get a little leading edge with codebases that are 
effectively C. Once you throw templates into the mix though, 
your problems become exponential.


Spending 4 seconds in the front end and codegen, only to wait 2 
minutes in the optimizer is horrific.


The alternative of discarding what seem to be unused symbols 
only results in linker errors of the obscure edge-case sort.


Template emission strategy is a mess, we're better off just 
instantiating all templates in all compilation units and letting 
the compiler decide what to discard. Even -allinst does not 
instantiate enough to allow the compiler to make such decisions 
that C++ has no problem with (most of the time).


I think I've hit a variation of this problem before, where 
pulling in a single selective import in Phobos somewhere meant 
the entire module was compiled into the executable (though I 
suppose that could be a linker issue?):


https://forum.dlang.org/thread/gmjqfjoemwtvgqrtd...@forum.dlang.org

I guess this is why scoped/selective imports didn't help that 
much in disentangling Phobos. I figured it wasn't a big deal if 
it was just causing bigger executables, but even though I 
mentioned compilation speed there, I didn't think of how that's 
slowing down the compiler too, as you now note.


Pruning what's evaluated by the compiler based on 
scoped/selective imports, rather than apparently including the 
whole module, and getting D compilers to compile parallelized 
without separately invoking each module/package, ie a -j flag for 
the compiler when you invoke it with all your source at once, 
might be good projects for us to crowdfund, as discussed in this 
and my earlier Nim thread. Separate parallel compilation works 
wonders on my octa-core Android/AArch64 phone, where I mostly 
build D now, would be good to be able to combine that with 
invoking ldc with all source at once.
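The "-j flag for the compiler" idea above can be sketched (my illustration, not an existing compiler option) as a driver that farms per-module compile jobs out to a worker pool; the module list and the compile step are placeholders:

```python
# Hypothetical sketch of a '-j'-style parallel driver: compile each
# module concurrently, then hand the resulting objects to the linker.
from concurrent.futures import ThreadPoolExecutor

modules = ["app.d", "util.d", "net.d"]  # placeholder module list

def compile_module(src):
    # A real driver would invoke the compiler per module, e.g.:
    #   subprocess.run(["ldc2", "-c", src], check=True)
    # Here we just simulate the object file that step would produce.
    return src.replace(".d", ".o")

with ThreadPoolExecutor(max_workers=4) as pool:  # the '-j 4' part
    objects = list(pool.map(compile_module, modules))

print(objects)  # object files to pass to the link step
```

This is what separate invocation plus `make -j` already achieves externally; the proposal is to fold it into a single all-source compiler invocation.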


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-08-27 Thread Joakim via Digitalmars-d

On Monday, 27 August 2018 at 16:15:37 UTC, 12345swordy wrote:

On Monday, 27 August 2018 at 14:26:08 UTC, Chris wrote:

On Monday, 27 August 2018 at 13:48:42 UTC, 12345swordy wrote:

On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc 
wrote:



[...]


I think D has reached the point where that'd make perfect 
sense. Move from the garage to a proper factory :)


Who's going to pay for the factory?
-Alex


That's for the D Foundation to figure out. There's a reason we 
have a D Foundation now, isn't there?


The annual monthly budget is around 4K$.
https://opencollective.com/dlang#
-Alex


"annual monthly?" Look again:

https://wiki.dlang.org/Vision/2018H1#H2_2017_Review


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-08-24 Thread Joakim via Digitalmars-d

On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:

On 8/24/2018 6:04 AM, Chris wrote:
For about a year I've had the feeling that D is moving too 
fast and going nowhere at the same time. D has to slow down 
and get stable. D is past the experimental stage. Too many 
people use it for real world programming and programmers value 
and _need_ both stability and consistency.


Every programmer who says this also demands new (and breaking) 
features.


Heh, thought this proggit comment thread was funny given this 
complaint, some C++ users feel it's moving too fast now:


"In the last few years it has basically become a different 
language, the feature creep is insane. I stopped caring about new 
features since C++11, and progressively used the language less 
and less."


Another user:

"I remember being really excited about C++11 - and I think it 
really did add some much needed features. But it's been getting 
more and more out of hand since then..."

https://www.reddit.com/r/programming/comments/99rnuq/comment/e4q8iqn


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-08-23 Thread Joakim via Digitalmars-d

On Thursday, 23 August 2018 at 18:27:27 UTC, Abdulhaq wrote:
On Thursday, 23 August 2018 at 09:51:43 UTC, rikki cattermole 
wrote:
Good luck getting W&A to agree to it, especially when there 
is yet another "critical D opportunity" on the table ;)


No. They have power for as long as we the community say that 
they do.
We are at the point where they need a check and balance to 
keep everybody going smoothly. And I do hope that they listen 
to us before somebody decides its forkin' time.


No fork of D can succeed; it won't have the manpower, 
skills or willpower to draw on. Even with W and A, those are 
already in short supply.


'Threatening' W and A with a fork is an empty threat that just 
p***es them off. Bad move on your part.


Agreed, W&A's "power" is that they have done or are doing much of 
the work and that enough people trust they're better at how they 
work than those forking. That would be a very high bar for any 
forking team to surpass.


Re: D is dead

2018-08-23 Thread Joakim via Digitalmars-d

On Thursday, 23 August 2018 at 17:52:54 UTC, bachmeier wrote:

On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
wrote:

On 23/08/18 17:01, Steven Schveighoffer wrote:
My main job is to develop for Weka, not develop D itself.


Weka, at some point, made the strategic decision to use a non 
mainstream language


I don't think Weka has a choice; they have to invest in the 
development of D itself


I hope a startup can choose D without having to do that. 
Otherwise D is not really a viable option for startups because 
they need to focus on survival rather than language development.


What a joke: are you really arguing that every startup should 
have all their suppliers give them everything for free? Most 
startups pay a ton of money for their critical tools, money that 
pays for further development of those tools, including for 
bugfixes and features in the OSS projects they use (which they 
don't always open source). I'd wager that whatever Weka has spent 
on Johan to fix D is much less. Maybe Weka is simply learning 
they can't get away with that anymore.


What you _could_ argue is that the cost/benefit ratio of D ends 
up being too high compared to some mooted alternative with a 
bigger community, say C++ or Rust, but I think you'd have a tough 
time making that case.


Re: D is dead

2018-08-23 Thread Joakim via Digitalmars-d
On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh 
wrote:



On 23/08/18 18:35, Joakim wrote:


So your example of a fatal flaw is that D could be 100X faster 
at compilation instead of just 10X than most every other 
native language out there?! C'mon.


Have you tried Stephan's example yet?

static foreach(i; 0..16384) {}


I don't see any posts by a "Stephan" in this thread. I don't 
doubt that there are degenerate cases in D's compile-time 
features, particularly the new ones.


Each problem on its own, maybe not. All together? Most 
definitely.


A language needs to be coherent. A programmer needs to be able 
to look at code and know what the compiler will make of that 
code. The less that can happen, the less useful the language is.


This is, in fact, precisely the criticism the D community 
levels against C++.


Ah, I see none of these are "fatal flaws," but they all combine 
together to create a fatal situation. That's an argument that can 
actually be defended, but not the one you started off making.


By this rationale, C++ should be dead by now. Why do you think 
it's fatal to D?


C++ does not suffer from this *kind* of complexity. For the 
most part, C++'s complexity is feature centric. You use a 
feature, you need to really learn that feature in order to get 
it to work.


D's complexity lies in wait, ready to pounce on you from behind 
street corners. 
You use a feature, and all's well. And then, when you're doing 
stuff completely tangential to your old code, things suddenly 
break.


You can avoid C++'s complexity by not using features. The same 
is not true of D.


Sorry, it sounds like you're simply saying you prefer one type of 
complexity to another. I'm not a C++ developer, but my 
understanding is it has many of the same problems as D, more 
because of backwards compatibility.



With that said, who said these problems don't affect C++?


Nobody, that's my point in bringing it up.


Had C++ not been plagued by these problems, D (and Rust, and Go,
and so on) would probably never have been born. These are 
languages written with the explicit hope of killing C++.


They do not seem like they are going to, but D lacks quite a 
few things that C++ has going for it. To name a few:


* Large community
* excellent tooling
* large use base


My point was that C++ has always had what you believe to be the 
"fatal flaw" of incoherence, yet was able to build all those up. 
Walter has been there from the beginning and seen how C++ did it, 
perhaps his anarchy-driven development isn't so misguided after 
all.



* Critical bugs aren't being solved

People keep advertising D as supporting RAII. I'm sorry, but 
"supports RAII" means "destructors are always run when the 
object is destroyed". If the community (and in this case, 
this includes Walter) sees a bug where that doesn't happen as 
not really a bug, then there is a deep problem, at least, 
over-promising. Just say you don't support RAII and 
destructors are unreliable and live with the consequences.


BTW: Python's destructors are unworkable, but they advertise 
it and face the consequences. The D community is still 
claiming that D supports RAII.


Maybe they're not critical to everyone else?


Maybe. Just don't lie to users.


Is it a lie if you don't know about some bug in the 
implementation?


How much time or money exactly has Weka spent on getting this 
issue and other "critical" bugs fixed?


Weka is paying prominent D developers as contractors. We've had 
David Nadlinger and currently employ Johan Engelen. Both said 
they cannot fix this particular bug.


If you can, feel free to contact me off-list, and I'm fairly 
sure we can get the budget for you to work on it. The same goes 
for anyone else on this list.


I'm sorry, I wouldn't know how to and am not interested in 
learning.


We also contribute our own workarounds for D's shortcomings for 
everyone to enjoy. This include DIP-1014 and Mecca, as well as 
the less obvious upstreaming of bugs our contractors fix. This 
is beyond the fact that our "fork" of the compiler is, itself, 
public (https://github.com/weka-io/ldc). I think claiming that 
Weka is leaching off the community is simply unwarranted.


Sure, but nobody claimed you're just leeching: I know you've 
contributed in various ways. But the question stands: were you 
able to apply time/money to get any _critical_ bugs fixed and 
upstreamed? If so, why don't you believe you could get more 
fixed? Nobody's asking you to do it all yourself, you could work 
with Sociomantic and the community to raise bounties on those 
issues.


It is fairly laughable for a company that raised $42 million 
to complain that a bunch of unpaid volunteers aren't fixing 
bugs fast enough for them:


Fir

Re: D is dead (was: Dicebot on leaving D: It is anarchy driven development in all its glory.)

2018-08-23 Thread Joakim via Digitalmars-d

On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:

On Thursday, 23 August 2018 at 06:58:13 UTC, Joakim wrote:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:

On 22/08/18 21:34, Ali wrote:

On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
Pretty positive overall, and the negatives he mentions are 
fairly obvious to anyone paying attention.


Yea, I agree, the negatives are not really negative

Walter, no matter how smart he is, is one man; he can only 
work on so many things at the same time


It's a chicken-and-egg situation: D needs more core 
contributors, and to get more contributors it needs more 
users, and to get more users it needs more core contributors




No, no and no.

I was holding out on replying to this thread to see how the 
community would react. The vibe I'm getting, however, is that 
the people who are seeing D's problems have given up on 
effecting change.


It is no secret that when I joined Weka, I was a sole D 
detractor among a company quite enamored with the language. I 
used to have quite heated water cooler debates about that 
point of view.


Every single one of the people rushing to defend D at the 
time has since come around. There is still some debate on 
whether, weighing points against counterpoints, choosing D was a good 
idea, but the overwhelming consensus inside Weka today is 
that D has *fatal* flaws and no path to fixing them.


And by "fatal", I mean literally flaws that are likely to 
literally kill the language.


And the thing that brought them around is not my power of 
persuasion. The thing that brought them around was spending a 
couple of years working with the language on an every-day 
basis.


And you will notice this in the way Weka employees talk on 
this forum: except me, they all disappeared. You used to see 
Idan, Tomer and Eyal post here. Where are they?


This forum is hostile to criticism, and generally tries to 
keep everyone using D the same way. If you're cutting edge D, 
the forum is almost no help at all. Consensus among former 
posters here is that it is generally a waste of time, so 
almost everyone left, and those who didn't, stopped posting.


And it's not just Weka. I've had a chance to talk in private 
to some other developers. Quite a lot have serious, 
fundamental issues with the language. You will notice none of 
them speaks up on this thread.


They don't see the point.

No technical project is born great. If you want a technical 
project to be great, the people working on it have to focus 
on its *flaws*. The D community just doesn't do that.


To sum it up: fatal flaws + no path to fixing + no push from 
the community = inevitable eventual death.


Can you list what you or other Weka devs believe those fatal 
flaws to be? Because you've not listed any here, which makes 
you no better than some noob that comes in here, says D has to 
get better or it will die, then can't articulate what they 
mean by "better" or worse, mentions something trivial. Of 
course, you've actually used the language for years, so 
presumably you've got some real concerns, but do you really 
think the bug you just posted is "fatal" to the language?


If you think there are fatal flaws, you might as well list 
them, whether technical or the development process, or you 
will just be ignored like any other noob who talks big and 
can't back it up. You may be ignored anyway, ;) but at least 
you'll have made a case that shows you know what you're 
talking about.


I'd define fatal as some that can be fixed, but breaks 100% of 
everyone's code, even if the change is net positive all round.


However how big a problem really is is in the eye of the 
beholder. An example:


Symptom: The compiler can't discard unused symbols at compile 
time, and so it will spend a lot of time pointlessly optimising 
code.


Problem: D has no notion of symbol visibility.

Possible Solution: Make all globals hidden by default unless 
'export'.


Side effects: Everyone will be spending weeks to months fixing 
their libraries in order to only mark what should be visible 
outside the current compilation unit as 'export'.


Benefits: Faster compile times, as in, in the most extreme 
example I've built one project on github with gdc -O2 and build 
time went from 120 seconds to just 3!


So your example of a fatal flaw is that D could be 100X faster at 
compilation instead of just 10X than most every other native 
language out there?! C'mon.


On Thursday, 23 August 2018 at 09:09:40 UTC, Shachar Shemesh 
wrote:

On 23/08/18 09:58, Joakim wrote:
Because you've not listed any here, which makes you no better 
than some noob


Here's one: the forum does not respond well to criticism.


Sounds more like you don't respond well to criticism, as the 
point stands that your original post was co

Re: D is dead (was: Dicebot on leaving D: It is anarchy driven development in all its glory.)

2018-08-23 Thread Joakim via Digitalmars-d
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:

On 22/08/18 21:34, Ali wrote:

On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
Pretty positive overall, and the negatives he mentions are 
fairly obvious to anyone paying attention.


Yea, I agree, the negatives are not really negative

Walter, no matter how smart he is, is one man; he can only work 
on so many things at the same time


It's a chicken-and-egg situation: D needs more core 
contributors, and to get more contributors it needs more 
users, and to get more users it needs more core contributors




No, no and no.

I was holding out on replying to this thread to see how the 
community would react. The vibe I'm getting, however, is that 
the people who are seeing D's problems have given up on 
effecting change.


It is no secret that when I joined Weka, I was a sole D 
detractor among a company quite enamored with the language. I 
used to have quite heated water cooler debates about that point 
of view.


Every single one of the people rushing to defend D at the time 
has since come around. There is still some debate on whether, 
weighing points against counterpoints, choosing D was a good idea, but the 
overwhelming consensus inside Weka today is that D has *fatal* 
flaws and no path to fixing them.


And by "fatal", I mean literally flaws that are likely to 
literally kill the language.


And the thing that brought them around is not my power of 
persuasion. The thing that brought them around was spending a 
couple of years working with the language on an every-day basis.


And you will notice this in the way Weka employees talk on this 
forum: except me, they all disappeared. You used to see Idan, 
Tomer and Eyal post here. Where are they?


This forum is hostile to criticism, and generally tries to keep 
everyone using D the same way. If you're cutting edge D, the 
forum is almost no help at all. Consensus among former posters 
here is that it is generally a waste of time, so almost 
everyone left, and those who didn't, stopped posting.


And it's not just Weka. I've had a chance to talk in private to 
some other developers. Quite a lot have serious, fundamental 
issues with the language. You will notice none of them speaks 
up on this thread.


They don't see the point.

No technical project is born great. If you want a technical 
project to be great, the people working on it have to focus on 
its *flaws*. The D community just doesn't do that.


To sum it up: fatal flaws + no path to fixing + no push from 
the community = inevitable eventual death.


Can you list what you or other Weka devs believe those fatal 
flaws to be? Because you've not listed any here, which makes you 
no better than some noob that comes in here, says D has to get 
better or it will die, then can't articulate what they mean by 
"better" or worse, mentions something trivial. Of course, you've 
actually used the language for years, so presumably you've got 
some real concerns, but do you really think the bug you just 
posted is "fatal" to the language?


If you think there are fatal flaws, you might as well list them, 
whether technical or the development process, or you will just be 
ignored like any other noob who talks big and can't back it up. 
You may be ignored anyway, ;) but at least you'll have made a 
case that shows you know what you're talking about.


Re: Dicebot on leaving D: It is anarchy driven development in all its glory.

2018-08-22 Thread Joakim via Digitalmars-d
On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
wrote:

Just found by chance, if someone is interested [1] [2].

/Paolo

[1] 
https://gitlab.com/mihails.strasuns/blog/blob/master/articles/on_leaving_d.md
[2] 
https://blog.mist.global/articles/My_concerns_about_D_programming_language.html


Pretty positive overall, and the negatives he mentions are fairly 
obvious to anyone paying attention. D would really benefit from a 
project manager, which I think Martin Nowak has tried to do, and 
which the companies using D and the community should get together 
and fund as a paid position. Maybe it could be one of the funding 
targets for the Foundation.


If the job were well-defined, so I knew exactly what we'd be 
getting by hiring that person, I'd contribute to that.


Re: [OT] Leverage Points

2018-08-20 Thread Joakim via Digitalmars-d

On Monday, 20 August 2018 at 04:46:35 UTC, Laeeth Isharc wrote:

On Sunday, 19 August 2018 at 18:49:53 UTC, Joakim wrote:
On Saturday, 18 August 2018 at 13:33:43 UTC, Andrei 
Alexandrescu wrote:

A friend recommended this article:

http://donellameadows.org/archives/leverage-points-places-to-intervene-in-a-system/

I found it awesome and would recommend to anyone in this 
community. Worth a close read - no skimming, no tl;rd etc. 
The question applicable to us - where are the best leverage 
points in making the D language more successful.


I read the whole thing, pretty much jibes with what I've 
already realized after decades of observation, but good to see 
it all laid out and prioritized, as Jonathan said.


I thought this paragraph was particularly relevant to D:

"So how do you change paradigms? Thomas Kuhn, who wrote the 
seminal book about the great paradigm shifts of science, has a 
lot to say about that. In a nutshell, you keep pointing at the 
anomalies and failures in the old paradigm, you keep speaking 
and acting, loudly and with assurance, from the new one, you 
insert people with the new paradigm in places of public 
visibility and power. You don’t waste time with reactionaries; 
rather you work with active change agents and with the vast 
middle ground of people who are open-minded."


This pretty much reflects what Laeeth always says about 
finding principals who can make their own decisions about 
using D. "Places of public visibility and power" for D are 
commercial or open-source projects that attract attention for 
being well done or at least popular.


Read Vilfredo Pareto on the circulation of the elites, Toynbee 
on the role of creative minorities, and Ibn Khaldun on 
civilisational cycles.


There's not much point focusing on the influential and powerful 
people and projects of today - they have too much else going 
on; powerful people tend to become a bit disconnected from 
reality, complacent and they and hangers-on have too much 
vested in the status quo to change.  When you have nothing, you 
have not much to lose, but after considerable success most 
people start to move to wanting to keep what they have.  This 
doesn't bring open-mindedness to new ideas or approaches.


Sure, and though I've not read any of those books, where did I 
suggest going after the "influential and powerful?" I simply 
echoed your statement about going after principals who are free 
to make their own path, who as you've stated before are usually 
at startups or small projects where everything doesn't have to 
get past a committee.


But we live in a dynamic economy and time and the winners of 
tomorrow might look unremarkable today.  Linus said it was just 
a hobby project, nothing big like Minix.  Would you have 
thought a few German PhDs had a chance with no capital, 
starting amidst a bad financial crisis and using a language 
that was then of questionable stability and commercial 
viability?


Yes, the next great kernel developer or Sociomantic is looking 
for the language to write their project with now. Hopefully, D 
will be the right choice for them.


I'm not sure we're doing a good job of publicizing those we 
have though, here's a comment from the proggit thread on 
BBasile's recent post about writing a language in D:


"I keep seeing articles telling me why D is so great, but 
nothing of note ever gets written in D."

https://www.reddit.com/r/programming/comments/97q9sq/comment/e4b36st



I don't think it matters a lot what people like that think.  In 
aggregate yes, but as Andrei says people are looking for an 
excuse not to learn a new language.  Somebody actually ready to 
try D will sooner or later come across the organisations using 
D page and see that the situation is a bit different.


Looking at his proggit comment history now, he seems exactly like 
the kind of intelligent, opinionated sort D should be attracting: 
I don't think he was looking to dismiss D. He could have looked 
harder, we could have marketed harder: there's blame to go around.


I'll put out an email to Don. Maybe Laeeth would be willing to 
do an interview.


Sounds a good idea.


Alright, I'll email you soon.

On the OSS front, I've sent several interview questions to 
Iain earlier this year about gdc, after he agreed to an 
interview, no responses yet. Tough to blame others for being 
ignorant of D's successes when we don't do enough to market it.


I think we are still in very early stages.  Lots of companies 
in orgs using D I don't know much about.  The Arabia weather 
channel have a YouTube on their use of D, but I don't speak 
Arabic.  Hunt the Chinese toy company is interesting.  Chinese 
tech scene is huge and very creative, possibly more so than the 
US in some ways.


You might ask EMSI and also AdRoll.

By early days I mean it's better to look for intere

Re: [OT] Leverage Points

2018-08-19 Thread Joakim via Digitalmars-d
On Saturday, 18 August 2018 at 13:33:43 UTC, Andrei Alexandrescu 
wrote:

A friend recommended this article:

http://donellameadows.org/archives/leverage-points-places-to-intervene-in-a-system/

I found it awesome and would recommend to anyone in this 
community. Worth a close read - no skimming, no tl;rd etc. The 
question applicable to us - where are the best leverage points 
in making the D language more successful.


I read the whole thing, pretty much jibes with what I've already 
realized after decades of observation, but good to see it all 
laid out and prioritized, as Jonathan said.


I thought this paragraph was particularly relevant to D:

"So how do you change paradigms? Thomas Kuhn, who wrote the 
seminal book about the great paradigm shifts of science, has a 
lot to say about that. In a nutshell, you keep pointing at the 
anomalies and failures in the old paradigm, you keep speaking 
and acting, loudly and with assurance, from the new one, you 
insert people with the new paradigm in places of public 
visibility and power. You don’t waste time with reactionaries; 
rather you work with active change agents and with the vast 
middle ground of people who are open-minded."


This pretty much reflects what Laeeth always says about finding 
principals who can make their own decisions about using D. 
"Places of public visibility and power" for D are commercial or 
open-source projects that attract attention for being well done 
or at least popular.


I'm not sure we're doing a good job of publicizing those we have 
though, here's a comment from the proggit thread on BBasile's 
recent post about writing a language in D:


"I keep seeing articles telling me why D is so great, but nothing 
of note ever gets written in D."

https://www.reddit.com/r/programming/comments/97q9sq/comment/e4b36st

Of course, all he has to do is go to the front page of dlang.org 
and follow those links others gave him, but maybe he means 
something really big like Google's search engine.


We could probably stand to publicize D's commercial successes 
more. I've been trying to put together an interview blog post 
with Weka about their use of D, got some answers this summer, but 
no response in months to a follow-up question about how they got 
their team trained up on D. We could stand to talk more about 
Sociomantic, D's biggest corporate success so far, I'll put out 
an email to Don. Maybe Laeeth would be willing to do an interview.


On the OSS front, I've sent several interview questions to Iain 
earlier this year about gdc, after he agreed to an interview, no 
responses yet. Tough to blame others for being ignorant of D's 
successes when we don't do enough to market it.


Finally, regarding leverage, I keep pointing out that mobile has 
seen a resurgence of AoT-compiled native languages, but nobody 
seems to be trying D out in that fertile terrain, other than me.


Re: Found on proggit: Nim receives funding from a company (D should be doing something like this)

2018-08-17 Thread Joakim via Digitalmars-d

On Friday, 17 August 2018 at 00:06:27 UTC, Laeeth Isharc wrote:

On Tuesday, 14 August 2018 at 07:05:12 UTC, Joakim wrote:

[...]


Have you read Peter Thiel's Zero to One and seen his YouTube 
talks on secrets etc?


[...]


Nothing here contradicts my stated goals and method: increasing 
the usage and adoption of the D language by the community paying 
for more work on the language and libraries.


I think there should be a marketplace in such paid work on D, one 
that largely doesn't exist right now. In your language of 
economics, that market isn't clearing. Why is this?


It could be there hasn't been enough sustained effort to create a 
market. It could be there are simply too few buyers. It could be 
there is a culture of doing everything for yourself. It could be 
some OSS people dislike money being involved. It could be people 
are happy to free-ride off major voluntary contributors like 
Walter, Andrei, Iain, and Martin and don't care or need to see D 
speed up its development.


It could be I'm an outlier in wanting D to get more resources and 
move faster. We shall see.


Found on proggit: Why D is a good choice for writing a language

2018-08-16 Thread Joakim via Digitalmars-d

By our very own BBasile of Coedit fame:

https://lambdahackers.com/@b4asile/why-d-is-a-good-choice-for-writing-a-toy-language-4vapyvas5a


Re: Found on proggit: Nim receives funding from a company (D should be doing something like this)

2018-08-16 Thread Joakim via Digitalmars-d
On Wednesday, 15 August 2018 at 20:45:34 UTC, Andrei Alexandrescu 
wrote:

On 8/13/18 5:50 AM, Joakim wrote:

[...]


Thanks for the info. That's good for Nim and something we could 
definitely benefit from as well. Currently, Sebastian Wilzbach 
and Razvan Nitu, both students, are working full time with the 
Foundation. Mike is our publishing and general PR person, 
working a reliable part time. We are in talks with a few more 
students from Romania and Brazil.


Good to know, didn't know Seb and Razvan were full-time.

Our early experiments with bountysource were sadly 
unsuccessful. I'm not writing it off but we'd probably need a 
new angle for a new round of experimentation. Suggestions are 
welcome.


I've never liked that bountysource website overall, but it does 
provide a way to fund specific issues.


Regarding corporate sponsorship, we have been public about 
being interested, but we haven't exactly beaten off offers with 
a stick. I have personally asked our top users in several 
instances for assistance. There has been some, but not to the 
extent of allocating one or more full-time engineers. Shout out 
to Laeeth Isharc whose enterprise has been far and away the 
most generous, and to Weka as well for sharing with us some 
time and resources at a crucial juncture for the company.


Good to hear those options are being explored.

We've always been glad to take suggestions from individual 
collaborators. Mike Parker would be the person to reach out to. 
What would be best is to get some concrete action; 
historically, the typical suggestion came in the form "here's 
this great idea, you go work on it". Even that is fine if the 
idea is fleshed out and argued convincingly. Better yet, 
there's no better proof that an idea is good than to actually 
execute it to demonstrable benefit.


OK, I'll try that then. The first step is to figure out what 
people want to pay for, so I will open a forum thread for that. 
Perhaps such feedback can also guide the funding targets on 
Opencollective, in addition to the survey data already collected.


Re: Found on proggit: Nim receives funding from a company (D should be doing something like this)

2018-08-14 Thread Joakim via Digitalmars-d

On Tuesday, 14 August 2018 at 02:49:58 UTC, Mike Franklin wrote:

On Monday, 13 August 2018 at 09:50:29 UTC, Joakim wrote:
Announced last week, the Nim team will be adding two full-time 
paid devs and setting up grants for needed projects with this 
new funding:


:jealous:

However, there are other ways to raise funds. Companies using 
D could use the existing bountysource page to put up bounties 
for features/fixes or projects they need, to which community 
members who need some particular feature/fix could also donate:


https://www.bountysource.com/teams/d


I think bountysource would work if the bounties were 
significantly higher, but there are also the funding options at 
https://opencollective.com/dlang


Yes, some of those bounties are too low for the amount of work, 
but nothing stops others who find them important from increasing 
the bounty incrementally.


Looking at the right column of the page, there are several D 
enthusiasts contributing their hard-earned money to D.  Maybe 
there's a better option for the masses, besides a T-shirt and a 
DConf discount, that might encourage more donors.  For example, 
I might contribute $100 or more if I could get some attention on 
some bugs/features that I care about (assuming I couldn't 
implement them myself).  Maybe I'll post a bounty in the near 
future and see how it goes.


A variation on that appears to be in the cards, as they've said 
there will be more funding targets:


https://forum.dlang.org/post/orvcznlvraunkksjd...@forum.dlang.org

I don't really care which website is used, bountysource or 
opencollective or whatever, but the community is unlikely to 
contribute unless they have a clear idea of where the money is 
going, which bountysource does a better job of showing right now.


Right now, I'm the only one I know of working on the #dbugfix 
stuff, but I'm finding the bugs submitted this round 
exceptionally difficult.  I don't know if I'll succeed with a 
fix this round (Sorry!), but contact me directly, or post an 
announcement on the forum, if you have a bug that you're 
willing to motivate with a financial contribution to the D 
Foundation, and I'll personally take a look at it.  I'm 
generally only capable of fixing some of the more simple bugs, 
as my skills and understanding of DMD are quite limited, but I 
promise I'll try.


This is not about me: I personally don't have any blocker bugs 
that I'm worried about. I'm concerned about the general pace of D 
development: I don't think we're as focused or organized on 
gathering resources as we should be. My preferred model is to 
turn D into a partially proprietary product, but I guess the core 
team doesn't like that approach:


https://forum.dlang.org/thread/okuzksqzczprvuklp...@forum.dlang.org

Back when I was a little kid decades ago, I had a neighbor who 
used to build model trains in his garage in his spare time. I 
remember seeing it then and being thrilled that it snaked all 
over his work area. 99% of open source projects are the "model 
trains" of software devs, something they work on for fun in 
their spare time, and they never get widely used.


To get into the 1% of OSS projects that are actually widely 
used, you need some way to gather resources to grow the project. 
There's the Linux model, where you get a bunch of consulting and 
support companies to use you. There's the llvm/clang model, 
where you become a product in a large company, part of their 
portfolio alongside proprietary products or modules that pay the 
bills. There's the Firefox model, where you sell ads alongside 
the OSS product. There are newer models where you use 
crowdfunding sites like Kickstarter or OpenCollective.


D has so far used very little of any of these models. This 
project can give off the impression that it is simply a big model 
train for Walter and Andrei, a hobby that they've retired to work 
on. Instead, I'd like to see D much more widely used, which means 
work needs to be done on gathering resources beyond what the 
project has now.


Re: dub is not able to install any package

2018-08-13 Thread Joakim via Digitalmars-d

On Monday, 13 August 2018 at 13:02:43 UTC, Adil wrote:

dub build
Package gelfd not found in registry at https://code.dlang.org/ 
(fallback ["registry at http://code.dlang.org/", "registry at 
https://code-mirror.dlang.io/", "registry at 
https://code-mirror2.dlang.io/", "registry at 
https://dub-registry.herokuapp.com/"]): Failed to load curl, 
tried "libcurl.so", "libcurl.so.4", "libcurl-gnutls.so.4", 
"libcurl-nss.so.4", "libcurl.so.3".


Looks like you need to install a libcurl package also, in some 
place where dub can find it.


Found on proggit: Nim receives funding from a company (D should be doing something like this)

2018-08-13 Thread Joakim via Digitalmars-d
Announced last week, the Nim team will be adding two full-time 
paid devs and setting up grants for needed projects with this new 
funding:


https://our.status.im/status-partners-with-the-team-behind-the-programming-language-nim/
https://nim-lang.org/blog/2018/08/07/nim-partners-with-status.html

D should also be trying to raise resources like this, though it 
doesn't have to be corporate funding from one source. This 
company funding Nim raised $100 million in an ICO last year to 
build some kind of cryptocurrency-oriented mobile apps platform:


https://www.inc.com/brian-d-evans/status-ico-raised-over-100-million-for-ethereum-powered-dapps-on-ios-and-androi.html

There are risks, of course. This company could flame out, like 
many of these new cryptocurrency companies do, leaving Nim 
without ongoing funding. Their priorities may not align with the 
Nim core team's.


However, there are other ways to raise funds. Companies using D 
could use the existing bountysource page to put up bounties for 
features/fixes or projects they need, to which community members 
who need some particular feature/fix could also donate:


https://www.bountysource.com/teams/d

There are two primary factors in the success of any project, 
design and resources. I'm reasonably happy with the design of D 
and how technical decisions are being made. I think this is a 
core strength of D.


However, it appears the D core team has so far been doing a 
horrible job of gathering resources for the project. I'm not 
privy to any internal discussions, or whether this is being 
discussed at all. But it needs to be a priority for the ongoing 
growth of this project.


Nice quote about D on twitter

2018-08-10 Thread Joakim via Digitalmars-d

Can we stick this in the testimonials webpage? ;)

"Barely figuring out the #dlang design choices, and I'm already 
perceiving C++ as a knapsack with stone picks of random sizes.."

https://mobile.twitter.com/nikos_maximus/status/1027519165937184768


Re: Automate the collection and publishing of data for monitoring D's progress as a software development project

2018-08-08 Thread Joakim via Digitalmars-d
On Tuesday, 7 August 2018 at 17:47:45 UTC, Venu Vardhan Reddy 
Tekula wrote:
Hello everyone, as I said before here, 
https://forum.dlang.org/post/dbmottqhsyxdizfkg...@forum.dlang.org, I am interested in "Automate the collection and publishing of data for monitoring D's progress as a software development project" which is part of SAOC.


I have been into Web Development and have worked on a few 
projects. Now I am learning data science, as I am quite 
interested in the field. I want to apply for this project as it 
covers both of my interests, Web Development and Data Science.


The main crux of this project lies in scraping the data using 
the required API's and generating a web app, which can actually 
show a clear graph of what all things are happening to 
everyone. The graph thing can be done using some libraries in 
python. The web app can also be made using Python and Django.


Yes, everything _could_ be done in Python or other languages, but 
SAoC is being run by the D foundation and a company using D, 
which is why the first question in the FAQ says,


"Q: Can I use other languages alongside the D Language in my 
project?


A: Yes. Make sure that the focus is the D Language. We are likely 
to select applicants that prioritize the D Language."


If you don't want to participate in SAoC, but simply want to 
volunteer to do this because you're interested in data science, 
you are free to use any language you want.


I had some questions regarding the project and also needed some 
pointers to get started with the project. Also, more it would 
be great if more description of the project statement can be 
provided.


This is not a "project," it is merely an idea to spur SAoC 
proposals, apparently added by Mike:


https://wiki.dlang.org/?title=SAOC_2018_ideas&type=revision&diff=9293&oldid=9292

The ideas are intended to provide a base upon which you build a 
full proposal. If you'd like to know more about what Mike had in 
mind, you can ask him more questions here, as he just responded 
to you, or over private email, which you can probably get from 
his github profile or commits:


https://github.com/JinShil


Re: Whence came UFCS?

2018-07-27 Thread Joakim via Digitalmars-d

On Friday, 27 July 2018 at 05:22:17 UTC, Joakim wrote:

On Friday, 27 July 2018 at 03:41:29 UTC, Sameer Pradhan wrote:
During our Boston D Meetup today, we went through and 
deconstructed Walter's wonderfully elegant blog post from 2012 
called "Component Programming in D"


[...]


Extension methods were added to C# 3.0 in 2007, UFCS was 
discussed as a generalization of the concept to free functions 
at that year's DConf, fully implemented years later:


https://forum.dlang.org/thread/htjuut$av2$1...@digitalmars.com


The DConf slides linked from that old thread list this 2000 Scott 
Meyers article as an inspiration for UFCS:


http://www.drdobbs.com/cpp/how-non-member-functions-improve-encapsu/184401197
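To make the feature concrete for readers who haven't used it, here's a minimal sketch of what UFCS enables (the names here are illustrative, not from the posts above):

```d
import std.algorithm : map;
import std.array : array;

// An ordinary free function...
int twice(int x) { return x * 2; }

void main()
{
    // ...which UFCS lets you call with member syntax,
    // much like a C# extension method:
    assert(5.twice == 10);

    // UFCS is also what makes range pipelines read left to right:
    assert([1, 2, 3].map!twice.array == [2, 4, 6]);
}
```

This is exactly the generalization to free functions described above: when no matching member exists, the compiler rewrites `5.twice` as the ordinary call `twice(5)`.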


Re: Moving druntime into the DMD repository

2018-07-27 Thread Joakim via Digitalmars-d

On Friday, 27 July 2018 at 11:03:50 UTC, Seb wrote:

This a thread to explore whether it would be feasible to do so.

Motivation
--

[...]


Not much.


- Do you have a better suggestion?


No.


- Would this break your workflow in a drastic way?


No, I don't really use the official repos anymore; I've been 
using the LDC forks instead, so it doesn't matter to me how the 
official git repos for dmd are organized.


LDC splits off the DMD testsuite as its own git repo too, so any 
merged directory can always be split off to another git repo if 
wanted:


https://github.com/ldc-developers/dmd-testsuite


Re: Whence came UFCS?

2018-07-26 Thread Joakim via Digitalmars-d

On Friday, 27 July 2018 at 03:41:29 UTC, Sameer Pradhan wrote:
During our Boston D Meetup today, we went through and 
deconstructed Walter's wonderfully elegant blog post from 2012 
called "Component Programming in D"


[...]


Extension methods were added to C# 3.0 in 2007, UFCS was 
discussed as a generalization of the concept to free functions at 
that year's DConf, fully implemented years later:


https://forum.dlang.org/thread/htjuut$av2$1...@digitalmars.com


Re: C's Biggest Mistake on Hacker News

2018-07-23 Thread Joakim via Digitalmars-d

On Monday, 23 July 2018 at 11:51:54 UTC, Jim Balter wrote:

On Sunday, 22 July 2018 at 20:10:27 UTC, Walter Bright wrote:

On 7/21/2018 11:53 PM, Walter Bright wrote:
My article C's Biggest Mistake on front page of 
https://news.ycombinator.com !


Direct link:
https://news.ycombinator.com/item?id=17585357


The responses are not encouraging, but I suppose they're useful 
for sociologists studying fallacious thinking.


In my experience, people never learn, even from the blatantly 
obvious, _particularly_ when they're invested in the outdated. 
What inevitably happens is the new tech gets good enough to put 
them out of business, then they finally pick it up or retire. 
Until most system software is written in 
D/Go/Rust/Swift/Zig/etc., they will keep mouthing platitudes 
about how C is here to stay.


Re: How to define syscall() in freebsd?

2018-07-12 Thread Joakim via Digitalmars-d

On Thursday, 12 July 2018 at 13:55:58 UTC, Brian wrote:

the code is error:
extern (C) nothrow @nogc size_t syscall(size_t ident);
extern (C) nothrow @nogc size_t syscall(size_t ident, size_t 
arg0);
extern (C) nothrow @nogc size_t syscall(size_t ident, long* 
arg0);


long tid;
syscall(SYS_thr_self, &tid);
writeln(tid);

Error: Function type does not match previously declared 
function with the same mangled name: syscall


Just like in C, you cannot declare multiple extern(C) functions 
with the same name.



Change to:
extern (C) nothrow @nogc size_t syscall(size_t ident);
extern (C) nothrow @nogc size_t syscall(size_t ident, size_t 
arg0);
extern (C) nothrow @nogc size_t syscall(size_t ident, long* 
arg0);


long tid;
syscall(SYS_thr_self, &tid);
writeln(tid);


What did you change? I see no difference.

Error: none of the overloads of syscall are callable using 
argument types (int, long*), candidates are:

source/app.d(3,33):kiss.sys.syscall.syscall(ulong ident)
source/app.d(4,33):kiss.sys.syscall.syscall(ulong 
ident, ulong arg0)


Look at the types: the error says you're passing an int and long* 
on 64-bit to functions with different types.



Change to:
// extern (C) nothrow @nogc size_t syscall(size_t ident);
// extern (C) nothrow @nogc size_t syscall(size_t ident, size_t 
arg0);
extern (C) nothrow @nogc size_t syscall(size_t ident, long* 
arg0);


long tid;
syscall(SYS_thr_self, &tid);
writeln(tid);

result:

100567


Probably related to the first issue mentioned, that you cannot 
overload C functions: maybe the compiler finds the third 
declaration now that you removed the disallowed overloads.
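A sketch of the approach both replies point toward: mirror C's single variadic prototype instead of declaring overloads. The syscall numbers below (FreeBSD's SYS_thr_self = 432, x86-64 Linux's SYS_gettid = 186) are my own illustrative values, so double-check them against your system's sys/syscall.h before relying on them:

```d
// C declares exactly one syscall symbol, with varargs; extern(C)
// symbols cannot be overloaded, so the D binding must do the same.
extern (C) nothrow @nogc long syscall(long number, ...);

void main()
{
    long tid;
    version (FreeBSD)
    {
        enum SYS_thr_self = 432;     // writes the thread id through the pointer
        syscall(SYS_thr_self, &tid);
    }
    else version (linux)
    {
        enum SYS_gettid = 186;       // x86-64; returns the thread id directly
        tid = syscall(SYS_gettid);
    }
    assert(tid > 0);
}
```

With a single variadic declaration, all three of the original call patterns go through the one symbol, matching how unistd.h declares it in C.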


Re: Sutter's ISO C++ Trip Report - The best compliment is when someone else steals your ideas....

2018-07-11 Thread Joakim via Digitalmars-d

On Wednesday, 11 July 2018 at 12:45:40 UTC, crimaniak wrote:
On Tuesday, 10 July 2018 at 22:59:08 UTC, Jonathan M Davis 
wrote:


Or aside from that strawman that RangeError shouldn't be an 
Error even...


I suspect that we're going to have to agree to disagree on 
that one. ...

...
continuing to execute the program is risky by definition. ...
This error handling policy makes D not applicable for creating 
WEB applications and generally long-running services. I think 
anyone who has worked in the enterprise sector will confirm 
that any complex WEB service contains some number of errors 
that were not detected during the tests. These errors are 
detected randomly during operation. And the greatest 
probability of their detection - during the peak traffic of the 
site. Do you kill the whole application even in the case of 
undisturbed memory, with one suspicion of a logical error? At 
the peak of attendance? To prevent a potential catastrophe, 
which could theoretically arise as a result of this error? 
Congratulations! The catastrophe is already here.
And in the case of services, the strategy for responding to 
errors must be exactly the opposite. The error should be 
maximally localized, and the programmer should be able to 
respond to any type of errors. The very nature of the work of 
WEB applications contributes to this. As a rule, queries are 
handled by short-lived tasks that work with thread-local 
memory, and killing only the task that caused the error, with 
the transfer of the exception to the calling task, would 
radically improve the situation. And yes, RangeError shouldn't 
be an Error.


Sounds like you're describing the "Let it crash" philosophy of 
Erlang:


https://ferd.ca/the-zen-of-erlang.html

The crucial point is whether you can depend on the error being 
isolated, as in Erlang's lightweight processes. I guess D assumes 
it isn't.


Re: Adding more projects to the Project Tester

2018-07-05 Thread Joakim via Digitalmars-d

On Friday, 6 July 2018 at 03:19:44 UTC, Seb wrote:
So learning from the recent Vibe.d regression fiasco (we 
temporarily disabled a subconfiguration in Vibe.d and promptly 
got a regression in 2.081), I think we should try to add more 
projects to the Project Tester.


The current list is here:
https://github.com/dlang/ci/blob/master/vars/runPipeline.groovy#L443

Any suggestions?

Why should I add my project to the Project Tester?
--

Once a project is added to the Project Tester, DMD can't 
regress on it anymore as for every PR at dmd, druntime, phobos, 
tools and dub the testsuite of the added projects are run.


How does the Project Tester work?
-

- By default, it will run the same commands as Travis would do. 
Although, if necessary, custom commands can be used too.

- It will check out the latest stable git tag

Requirements


- moderately popular or was prone to regressions in the past
- rather easy to build (i.e. you don't need to download and 
recompile clang)
- no flaky testsuite (random errors in the testsuite due to 
network connectivity shouldn't happen. Though there's 
`DETERMINISTIC_HINT=1` set, s.t. you could disable such parts 
of your testsuite)
- reachable author or development (if there's ever a case where 
we need to push changes via a trivial PR to the repo, it 
shouldn't sit in the queue for weeks)


The LDC compiler? kinke recently had an issue because of all the 
C++ integration changes upstream:


https://github.com/ldc-developers/ldc/pull/2752#issuecomment-398897813

As perhaps the largest consumer of extern(C++), it may make 
sense to add it for all the C++ work being done. It would 
require the llvm package to be pre-installed in the test 
environment.


The list of current projects looks great; it was tough to think 
of anything to add.


Re: dmd optimizer now converted to D!

2018-07-03 Thread Joakim via Digitalmars-d

On Tuesday, 3 July 2018 at 21:57:07 UTC, Walter Bright wrote:

A small, but important milestone has been achieved!

Many thanks for the help from Sebastian Wilzbach and Rainer 
Schuetze.


Fantastic, I see that 35 of 88 files in the backend have been 
translated or added in D, with more being done:


https://github.com/dlang/dmd/pulls?q=is%3Apr+is%3Aopen+label%3A"D+Conversion"

Hope we can get DMD 2.082 out as almost fully written in D. :)


Re: 64bit DMD on Windows

2018-06-30 Thread Joakim via Digitalmars-d

On Saturday, 30 June 2018 at 22:38:13 UTC, 0xEAB wrote:

On Friday, 29 June 2018 at 23:57:51 UTC, H. S. Teoh wrote:
This has been a showstopper for me on low-memory machines. If 
I hadn't been using a high-memory system when I first tried 
out D, I might have just walked away.


Confirmed. On a Raspberry Pi D compilers are completely useless.

For example, I can compile huge C projects like the PHP 
interpreter without a problem (it's horribly slow, though of 
course 😌), but I run out of memory with anything a bit bigger 
written in D.


Well, this applies to LDC, at least (I've got no idea whether 
the LLVM backend frees memory).


LDC uses the same GC-less D frontend as DMD; that's likely the 
issue, not the LLVM backend.


Re: Is it possible to set up DConf Asia?

2018-06-30 Thread Joakim via Digitalmars-d

On Saturday, 30 June 2018 at 08:27:30 UTC, 鲜卑拓跋枫 wrote:

On Friday, 29 June 2018 at 14:52:45 UTC, Joakim wrote:

On Friday, 29 June 2018 at 12:13:09 UTC, 鲜卑拓跋枫 wrote:

[...]


So do people in the US and Europe, the vast majority of whom 
watched the livestream or online videos and didn't attend DConf.


On Friday, 29 June 2018 at 12:30:49 UTC, Mike Parker wrote:

[...]


First off, I question whether there's much benefit to even the 
key devs beyond communicating through email and video 
conferencing to iron things out, as Andrei indicates he does 
with Walter.


And Jonathan only mentioned the key devs, so that list does 
exclude others. As for everybody else, see below.



[...]


Then spend all your time doing those things: why waste the 
majority of conference time sitting through talks that you 
don't bother defending?


Here's what a "conference" in Asia or Europe or wherever 
should probably look like in this day and age:


- Have most talks prerecorded by the speaker on their webcam 
or smartphone, which produce excellent video these days with 
not much fiddling, and have a couple organizers work with them 
to get those home-brewed videos up to a certain quality level, 
both in content and presentation, before posting them online.


- Once the videos are all up, set up weekend meetups in 
several cities in the region, such as Tokyo, Hong Kong, and 
Bangalore, where a few livestreamed talks may take place if 
some speakers don't want to spend more time producing a 
pre-recorded talk, but most time is spent like the hackathon, 
discussing various existing issues from bugzilla in smaller 
groups or brainstorming ideas, designs, and libraries for the 
future.


This is just off the top of my head; I'm sure I'm missing some 
small details here and there, as I was coming up with parts of 
this as I wrote it, but I estimate it'd be an order of 
magnitude more productive than the current conference format 
while being vastly cheaper in total cost to all involved. 
Since D is not exactly drowning in money, it makes no sense to 
waste it on the antiquated conference format. Some American D 
devs may complain that they no longer essentially get to go on 
a vacation to Berlin or Munich- a paid vacation if their 
company compensates for such tech conferences- but that's not 
our problem.


Thanks for further clarification.
But some limitations may still exist. E.g., you may note that 
the latest Linaro Connect, held in Hong Kong, added a special 
"China Access" option for sharing their conference resources:

http://connect.linaro.org/hkg18/resources/#1506759202543-a2113613-2111

I noted it because I am very interested in programming on ARM, 
so I hope LDC (https://github.com/ldc-developers/ldc) could add 
support for AArch64 as soon as possible. :)


Check out the ltsmaster branch of LDC from git and try it out; 
most tests passed for me on Ubuntu 16.04/AArch64:


https://github.com/ldc-developers/ldc/issues/2153#issuecomment-384264048

The few remaining exceptions are some math-related modules that 
would need to be patched to support 128-bit floating-point real 
numbers, such as CustomFloat from std.numeric, 
std.internal.math.gammafunction, and the floating-point parser 
from std.conv (but only if you really need that extra precision; 
most of that code still works at 80-bit accuracy), though all 
the tests from std.math now pass. The other big issue is that 
core.stdc.stdarg needs to be adapted for AArch64 varargs, which 
is what's holding back building the latest LDC 1.10 natively.


Re: Is it possible to set up DConf Asia?

2018-06-29 Thread Joakim via Digitalmars-d

On Saturday, 30 June 2018 at 05:36:52 UTC, Jonathan M Davis wrote:
On Saturday, June 30, 2018 02:34:00 Joakim via Digitalmars-d 
wrote:
On Saturday, 30 June 2018 at 02:23:57 UTC, Jonathan M Davis 
wrote:

> [...]

That's nice, but since you present no arguments other than 
simply stating that it's "valuable" or "a very good idea" whose 
popularity has "gone up" (why? who knows? That would require 
actually supplying an argument), the 99.9% of D users who've 
never attended DConf are unlikely to be persuaded that it's 
ever worth attending DConf, or worth wasting any more time on a 
language that is more focused on blowing time and money on that 
outdated conference format than on getting work done on the 
language.


As I stated previously, having people meet in person can be a 
game changer. It gives you a different perspective on people 
and allows for much more efficient communication in many cases. 
Some stuff does work best when communicated online, but a lot 
of stuff works better when you have people in the same place 
discussing things.


It could certainly be argued that we should do more with less 
traditional stuff like birds of a feather sessions or other 
activities that are geared specifically towards folks 
interacting, but the talks convey lots of useful information 
and ideas, and there's a lot of discussions that go on about 
the talks and other topics during the time that talks aren't 
happening. It would be a real loss to the D community if we 
lost that.


As I stated previously and Adam reiterates, then do the actual 
in-person stuff that you find worthwhile and cut out the stuff 
that's "best when communicated online." I completely disagree 
that talks are in the former category and not the latter, 
particularly when a large majority of the scheduled time is spent 
on them.


Re: Is it possible to set up DConf Asia?

2018-06-29 Thread Joakim via Digitalmars-d

On Saturday, 30 June 2018 at 02:23:57 UTC, Jonathan M Davis wrote:
On Saturday, June 30, 2018 02:08:08 Joakim via Digitalmars-d 
wrote:

[...]


The response is that those of us who have gone to dconf have 
found it to be valuable. It's not just that we're doing what 
others have done or that we think that it might be a good idea. 
It's actually been valuable in practice.


Honestly, this is this first time that I've ever seen anyone 
try to argue that conferences like this are a bad idea. My 
experience has been that it has been a very good idea, and 
there are plenty of people out there who attend conferences 
regularly and try to get others to go because of how much value 
they see in it (and not just for dconf). If anything, the 
number of conferences that I've been hearing about has gone up, 
not down, and plenty of new conferences have started up in 
recent years (e.g. BSD Taiwan started up last year, the OpenZFS 
guys have started up a at least a couple of related conferences 
in the last few years, and RustConf is quite new). If you think 
that it's a bad sign that we have dconf, then that's certainly 
your choice, but the arguments that you've presented are 
unlikely to be persuasive to those of us who have actually 
attended dconf.


That's nice, but since you present no arguments other than 
simply stating that it's "valuable" or "a very good idea" whose 
popularity has "gone up" (why? who knows? That would require 
actually supplying an argument), the 99.9% of D users who've 
never attended DConf are unlikely to be persuaded that it's 
ever worth attending DConf, or worth wasting any more time on a 
language that is more focused on blowing time and money on that 
outdated conference format than on getting work done on the 
language.


Re: Is it possible to set up DConf Asia?

2018-06-29 Thread Joakim via Digitalmars-d

On Saturday, 30 June 2018 at 01:52:15 UTC, Jonathan M Davis wrote:
On Saturday, June 30, 2018 01:43:32 Joakim via Digitalmars-d 
wrote:
On Saturday, 30 June 2018 at 01:33:34 UTC, Jonathan M Davis 
wrote:

> On Saturday, June 30, 2018 01:12:10 Joakim via Digitalmars-d
>
> wrote:
>> Yes, this is about those people, who as that blog post 
>> notes, are wasting a ton of money on an outdated ritual 
>> that no longer makes sense. If you believe the core team 
>> and a few key devs like you need to get together once a 
>> year in person and hash things out, then do that as an 
>> offline retreat somewhere, just don't sucker in a bunch of 
>> other paying DConf attendees to help defray your costs.

>>
>> The ultimate question here is what is the best use of the 
>> money that's being expended every year at DConf? Is that 
>> money best spent mostly on hotel/conference rooms and 
>> airline tickets for marginal benefit to most or on actually 
>> getting shit done? I think it's obvious that the model I've 
>> sketched out to Mike above would get a _lot_ more done.

>
> A lot of people would disagree with you. If you don't want 
> to go, then don't go. If others don't want to go, then they 
> don't have to go. No one is being forced to go. There are 
> clearly plenty of folks interested in going to dconf, and I 
> expect that it will continue to happen so long as there is 
> such interest. If folks aren't interested, then they won't 
> show up, and if attendance is too low, then presumably, 
> dconf won't be held anymore. However, the interest is 
> clearly there even if you aren't interested, and I don't 
> understand why you would be trying to get folks to stop 
> going when they're very much interested in going and see 
> value in doing so. If all you care about is being able to 
> get online content, then just watch the videos online.


My point is obvious from the arguments I've made, including 
the one you just responded to while ignoring its substance. 
And not that many people are actually interested in attending 
DConf as presently run: I counted, what, maybe 100-150 people 
at the one in Munich last month?


If you're going to keep ignoring Marco's and my arguments and 
simply repeatedly state that it's worth it for those who 
attend despite all the flaws, then there's no point in 
discussing it. Clearly the current conference format is like a 
religious ritual for you then, something that must be blindly 
done regardless of any considerations of value.


Those of us who take the time and spend the money to go to 
dconf consider it worth the expenditure, or we wouldn't take 
the time or spend the money to go. It's our money to spend, and 
we see real value in what we get out of it, or we wouldn't keep 
going. If you don't agree with us, fine, but I don't see how it 
makes sense to try and talk us out of doing what we see value 
in doing. If you want to spend your time and money on something 
else, then do so.


Simple: D is a collective effort. If the core team wants to waste 
one of its key funding sources on getting a bunch of hobbyists 
together in a room to show off to each other before going on a 
European vacation, completely ignoring how the world and tech have 
changed since the days when that could actually be worthwhile, that 
signals to me and others that D is not a serious effort to build 
a viable programming language. Such an egregious waste of 
resources signals that this is just a bunch of boys having fun 
with their toys, only now out on the town in Europe.


I'm not saying that was the intent all along: I suspect that, like 
most people and institutions, DConf simply blindly aped what was 
done in the past, which is why conferences still happen. However, 
I'm now presenting arguments for why that no longer makes sense and 
why that outdated ritual is dying off, as Marco notes, and if the 
response is merely, "That's the way things have been done and 
we'll just keep doing it regardless," well, congrats, you just 
explained the thinking for why C and C++ will never be displaced 
by D.


Re: Is it possible to set up DConf Asia?

2018-06-29 Thread Joakim via Digitalmars-d

On Saturday, 30 June 2018 at 01:33:34 UTC, Jonathan M Davis wrote:
On Saturday, June 30, 2018 01:12:10 Joakim via Digitalmars-d 
wrote:
Yes, this is about those people, who as that blog post notes, 
are wasting a ton of money on an outdated ritual that no 
longer makes sense. If you believe the core team and a few key 
devs like you need to get together once a year in person and 
hash things out, then do that as an offline retreat somewhere, 
just don't sucker in a bunch of other paying DConf attendees 
to help defray your costs.


The ultimate question here is what is the best use of the 
money that's being expended every year at DConf? Is that money 
best spent mostly on hotel/conference rooms and airline 
tickets for marginal benefit to most or on actually getting 
shit done? I think it's obvious that the model I've sketched 
out to Mike above would get a _lot_ more done.


A lot of people would disagree with you. If you don't want to 
go, then don't go. If others don't want to go, then they don't 
have to go. No one is being forced to go. There are clearly 
plenty of folks interested in going to dconf, and I expect that 
it will continue to happen so long as there is such interest. 
If folks aren't interested, then they won't show up, and if 
attendance is too low, then presumably, dconf won't be held 
anymore. However, the interest is clearly there even if you 
aren't interested, and I don't understand why you would be 
trying to get folks to stop going when they're very much 
interested in going and see value in doing so. If all you care 
about is being able to get online content, then just watch the 
videos online.


My point is obvious from the arguments I've made, including the 
one you just responded to while ignoring its substance. And not 
that many people are actually interested in attending DConf as 
presently run: I counted, what, maybe 100-150 people at the one 
in Munich last month?


If you're going to keep ignoring Marco's and my arguments and 
simply repeatedly state that it's worth it for those who attend 
despite all the flaws, then there's no point in discussing it. 
Clearly the current conference format is like a religious ritual 
for you then, something that must be blindly done regardless of 
any considerations of value.
