Re: D is crap

2016-07-12 Thread Alex via Digitalmars-d

On Tuesday, 12 July 2016 at 07:01:05 UTC, Paulo Pinto wrote:

Also, contrary to what Microsoft tried to push with C++/CX on 
WinRT, not many besides game developers decided to embrace it.



We didn't embrace it at all, we just have no choice but to use it 
for a lot of XBoxOne SDK calls. Any files with CX enabled compile 
much slower and we try to encapsulate them as much as possible.




Re: D is crap

2016-07-12 Thread Paulo Pinto via Digitalmars-d
On Tuesday, 12 July 2016 at 03:25:38 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 18:14:11 UTC, Paulo Pinto wrote:


I don't do Android programming, but NDK is actually fairly rich 
in comparison to Apple OSes without Objective-C bindings AFAIK. 
The problem seems to be more in the varying hardware 
configurations / quality of implementation.


Not really, it is a real pain to use and feels like a half-baked
solution developed by people who were forced by their manager to
support anything other than Java.

The iOS and WP SDKs have much better support for C++, especially 
the integration with native APIs via Objective-C++ and C++/CX, and 
the debugging tools.



Not using Java on Android sounds like a PITA to be honest.


Yes, the Android team goes to great lengths to make it feel like 
that.



I don't know much about .NET Native, does it apply to or will 
they bring it to .NET Core?


Yes, it is called CoreRT.



A change in recent years is that Microsoft appears to invest 
more in their C++ offering, so apparently they no longer see C# 
as a wholesale replacement.


Not really, the big loser is C.

After the OS Dev team won the political war against the DevTools 
team, thanks to the Longhorn debacle, the wind shifted toward the 
whole "going native" theme.

In parallel, the whole Midori effort was ramped down and its 
learnings were brought back to the production side of Microsoft.

Also, contrary to what Microsoft tried to push with C++/CX on 
WinRT, not many besides game developers decided to embrace it.

So the result is C# getting the nice features from System C#, and 
AOT compilation to native code via the Visual C++ backend.

At the same time, the internal efforts to clean up C++ code were 
taken outside and the C++ Core Guidelines were born.


Also, Kenny Kerr, a very vocal C++ MVP (and MSDN Magazine 
collaborator) who was against C++/CX, was hired and is now driving 
the effort to create a WinRT projection using plain standard modern 
C++.




The WinRT, User Driver Framework, the new container model and 
Linux subsystem, the Checked C, input to the C++ Core


I haven't paid much attention to WinRT lately, they have a 
Linux subsystem?


Yes, it will be available in the upcoming Windows 10 Anniversary 
edition.


It is built on top of the Drawbridge picoprocesses that are now a 
Windows 10 feature.


Basically it only supports x64 ELF binaries and makes use of the 
pico-processes infrastructure to redirect Linux syscalls into NT 
ones.


It is a collaboration between Microsoft and Ubuntu and there are 
quite a few Channel 9 videos describing how the whole stack works.




Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 18:14:11 UTC, Paulo Pinto wrote:

Actually NeXTStep drivers were written in Objective-C.



NeXT was a cool concept, but it was sad that they picked such an 
annoying language to build it.


They are not alone; as of Android N, Google is making it pretty 
clear that if one tries to circumvent the constrained set of NDK 
APIs and work around the JNI to access existing shared objects, the 
application will simply be killed.


I don't do Android programming, but NDK is actually fairly rich 
in comparison to Apple OSes without Objective-C bindings AFAIK. 
The problem seems to be more in the varying hardware 
configurations / quality of implementation.


Not using Java on Android sounds like a PITA to be honest.

If you check the latest BUILD, the current approach being 
evangelised is .NET Native for 90% of the code, C++/CX or plain 
C++ with WRL for gluing to low level code until C# gets the 
missing features from System C#, and C++ for everything else.


I don't know much about .NET Native, does it apply to or will 
they bring it to .NET Core?


A change in recent years is that Microsoft appears to invest more 
in their C++ offering, so apparently they no longer see C# as a 
wholesale replacement.


The WinRT, User Driver Framework, the new container model and 
Linux subsystem, the Checked C, input to the C++ Core


I haven't paid much attention to WinRT lately, they have a Linux 
subsystem?




Re: D is crap

2016-07-11 Thread Charles Hixson via Digitalmars-d
Garbage collection allows many syntax "liberalizations" that lack of 
garbage collection renders either impossible or highly dangerous.  (In 
this definition of "garbage collection" I'm including variations like 
reference counting.)  For an example of this consider the dynamic array 
type.  You MUST have garbage collection to use that safely...unless you 
require the freeing of memory with every change in size.  C++ does that 
with the STL, but if you want the dynamic types built into the language, 
then you need garbage collection built into the language.  (This is 
different from saying it needs to be active everywhere, but once you've 
got it, good places to use it keep showing up.)


One of the many advantages of the dynamic array type being built into 
the language is that arrays of different sizes are reasonably comparable 
by methods built into the language.  This is used all over the place.  
In D I almost never need to use "unchecked conversion".
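
For illustration, a minimal sketch of the kind of thing this buys you (the helper name `grow` is made up for the example): built-in slices are GC-owned, so they can be grown and returned freely, and arrays of different origins compare by value.

=== slices.d ===
import std.stdio;

// The GC owns the array memory, so a slice can be appended to and
// returned without tracking who frees the old block on reallocation.
int[] grow(int[] xs, int upTo)
{
    foreach (i; 0 .. upTo)
        xs ~= i;      // may reallocate; the old block is left to the GC
    return xs;        // safe to return: no dangling-ownership questions
}

void main()
{
    int[] a = [1, 2, 3];
    int[] b = grow(a, 5);
    writeln(a);              // [1, 2, 3]
    writeln(b);              // [1, 2, 3, 0, 1, 2, 3, 4]
    writeln(a == b[0 .. 3]); // true: arrays compare element-wise
}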


On 07/11/2016 02:30 AM, Chris via Digitalmars-d wrote:

On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad wrote:


Just like there is no C++ book that does not rant about how great 
RAII is... What do you expect from a language evangelist? The first 
Java implementation, HotSpot, inherited its technology from StrongTalk, 
a Smalltalk successor. It was not a Java phenomenon, and FWIW 
Lisp, Simula and Algol68 were all garbage collected.


Please stop intentionally missing the point. I don't care if Leonardo 
Da Vinci had already invented GC - which wouldn't surprise me - but 
this is not the point. My point is that GC became a big thing in the 
late '90s and early 2000s, which is in part owed to Java having become 
the religion of the day (not Lisp or Smalltalk)[1]. D couldn't have 
afforded not to have GC when it first came out. It was expected of a 
(new) language to provide GC by then - and GC had become a selling 
point for new languages.


[1] And of course computers had become more powerful and could handle 
the overhead of GC better than in the '80s.


What was "new" with Java was compile-once-run-everywhere. Although, 
that wasn't new either, but it was at least marketable as new.


Java was the main catalyst for GC - or at least for people demanding 
it. Practically everybody who had gone through IT courses, college 
etc. with Java (and there were loads) wanted GC. It was a given for 
many people.


Well, yes, of course Java being used in universities created a demand 
for Java and similar languages. But GC languages were extensively 
used in universities before Java.


Yes, it didn't last long. But the fact that they bothered to 
introduce it, shows you how big GC was/is.


No, it shows how demanding manual reference counting in Objective-C 
was on regular programmers. GC is the first go-to solution for easy 
memory management, and has been so since the '60s. Most high level 
languages use garbage collection.


It wasn't demanding. I wrote a lot of code in Objective-C and it was 
perfectly doable. You even have features like `autorelease` for return 
values. The thing is that Apple had become an increasingly popular 
platform and more and more programmers were writing code for OS X. So 
they thought they'd make it easier and reduce potential memory leaks 
(introduced by not so experienced Objective-C coders) by adding GC, 
especially because a lot of programmers expected GC "in this day and 
age".






Re: D is crap

2016-07-11 Thread Jacob Carlborg via Digitalmars-d

On 2016-07-11 14:23, Luís Marques wrote:


Doesn't seem to work for me on 10.11.5. Maybe you need to enable that on
the latest OSes?


It works for me. I don't recall specifically enabling crash reports. Are 
you looking at "All Messages"? You can also look at 
~/Library/Logs/DiagnosticReports to see if a new file shows up.



In any case, that will probably get you a mangled stack
trace, right?


Well, OS X doesn't know anything about D mangling ;). But it will demangle 
C++ symbols.



It would still be useful (especially if the stack trace is correct; in
LLDB I get some crappy ones sometimes) but it would not be as convenient
as the stack trace on Windows generated by the druntime.


Yes, of course.

--
/Jacob Carlborg


Re: D is crap

2016-07-11 Thread Paulo Pinto via Digitalmars-d
On Monday, 11 July 2016 at 16:44:27 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 16:26:11 UTC, Paulo Pinto wrote:

Happy not to disappoint.  :)


You never disappoint in the GC department ;-)

OS vendors are the ones that eventually decided what is a 
systems programming language on their OSes.


To a large extent on Apple and Microsoft OSes. Not so much on 
open source OSes as you are not tied down to binary blobs.


And if they say so, like Apple is nowadays doing with Swift, 
developers will have no option other than to accept it or move to 
another platform, regardless of their opinion on what features a 
systems programming language should offer.


It is true that there have been policy changes which make it 
difficult to access features like GPU and Audio on OS X/iOS 
without touching Objective-C or Swift. You don't have to use it 
much, but you need some binding stubs in Objective-C or 
Objective-C++ if you want to be forward compatible (i.e. link 
on future versions of the OS without recompiling).


But I _have_ noticed that Apple increasingly is making low 
level setup only available through Objective-C/Swift. It is 
probably a lock-in strategy to raise porting costs to Android.


Actually NeXTStep drivers were written in Objective-C.

http://www.cilinder.be/docs/next/NeXTStep/3.3/nd/OperatingSystem/Part3_DriverKit/Concepts/1_Overview/Overview.htmld/

They are not alone; as of Android N, Google is making it pretty 
clear that if one tries to circumvent the constrained set of NDK 
APIs and work around the JNI to access existing shared objects, the 
application will simply be killed.


http://android-developers.blogspot.de/2016/06/android-changes-for-ndk-developers.html

Which basically boils down to OEMs, 3D rendering and low



Just like C developers that used to bash C++ now have to 
accept that the two biggest C compilers are written in the language 
they love to hate.


There was a thread on reddit recently where some Microsoft 
employees admitted that parts of Windows are now implemented in 
C++ and C#, IIRC. I believe it is the parts that run in user mode 
as separate processes, but still...


Yes, the trend started with Windows 8 and the new application 
model based on the initial design of COM+ Runtime, which was the 
genesis of .NET before they decided to ditch it for the CLR.


If you check the latest BUILD, the current approach being 
evangelised is .NET Native for 90% of the code, C++/CX or plain 
C++ with WRL for gluing to low level code until C# gets the 
missing features from System C#, and C++ for everything else.


On the UWP model, DirectX is probably the only user space API 
that doesn't have a WinRT projection fully available, but they 
have been slowly surfacing it in each release.


The WinRT, User Driver Framework, the new container model and 
Linux subsystem, the Checked C, input to the C++ Core Guidelines 
and new C# features all trace back to the MSR work in 
Singularity, Midori and Drawbridge.





Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 16:26:11 UTC, Paulo Pinto wrote:

Happy not to disappoint.  :)


You never disappoint in the GC department ;-)

OS vendors are the ones that eventually decided what is a 
systems programming language on their OSes.


To a large extent on Apple and Microsoft OSes. Not so much on 
open source OSes as you are not tied down to binary blobs.


And if they say so, like Apple is nowadays doing with Swift, 
developers will have no option other than to accept it or move to 
another platform, regardless of their opinion on what features a 
systems programming language should offer.


It is true that there have been policy changes which make it 
difficult to access features like GPU and Audio on OS X/iOS 
without touching Objective-C or Swift. You don't have to use it 
much, but you need some binding stubs in Objective-C or 
Objective-C++ if you want to be forward compatible (i.e. link on 
future versions of the OS without recompiling).


But I _have_ noticed that Apple increasingly is making low level 
setup only available through Objective-C/Swift. It is probably a 
lock-in strategy to raise porting costs to Android.


Just like C developers that used to bash C++ now have to 
accept that the two biggest C compilers are written in the language 
they love to hate.


There was a thread on reddit recently where some Microsoft 
employees admitted that parts of Windows are now implemented in 
C++ and C#, IIRC. I believe it is the parts that run in user mode as 
separate processes, but still...




Re: D is crap

2016-07-11 Thread Guillaume Piolat via Digitalmars-d

On Monday, 11 July 2016 at 14:12:35 UTC, Chris wrote:


You focus on a small niche where people use all kinds of 
performance tricks even in C and C++. A lot of software doesn't 
care about GC overheads, however, and without GC a lot of 
people wouldn't even have considered it.




+1
A large majority of performance-heavy software can live with the 
GC.
GC is a blocker for people using micro-controllers with little 
memory, that usually don't get to choose a compiler.





Re: D is crap

2016-07-11 Thread Paulo Pinto via Digitalmars-d
On Monday, 11 July 2016 at 14:58:16 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 14:45:56 UTC, Paulo Pinto wrote:
The biggest problem with D isn't the GC, it's the lack of focus to 
make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, 
Java, C++17.


I knew you would chime in... Neither .NET, Swift nor Java should 
be considered system level tools. Ada/SPARK has a very narrow 
use case. Rust is still in its infancy. C++17 is not yet 
finished. But yes, C++ currently owns system level programming, 
C is losing terrain and Rust has an uncertain future.


The biggest problem with D is not GC, because we now have @nogc. 
But D is still lacking in memory management.


Happy not to disappoint.  :)

OS vendors are the ones that eventually decided what is a systems 
programming language on their OSes.


And if they say so, like Apple is nowadays doing with Swift, 
developers will have no option other than to accept it or move to 
another platform, regardless of their opinion on what features a 
systems programming language should offer.


Just like C developers that used to bash C++ now have to accept 
that the two biggest C compilers are written in the language they 
love to hate.





Re: D is crap

2016-07-11 Thread Paolo Invernizzi via Digitalmars-d

On Monday, 11 July 2016 at 14:45:56 UTC, Paulo Pinto wrote:


The biggest problem with D isn't the GC, it's the lack of focus to 
make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, 
Java, C++17.


How true!
That's the only real problem with this beautiful language!

/P




Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 14:12:35 UTC, Chris wrote:
Most certainly from a multi-purpose language. GC would have 
been demanded sooner or later. The mistake was not to make it 
optional from the beginning.


If D was designed as a high level language then it would be a 
mistake not to provide a GC in most scenarios. Yes.


care about GC overheads, however, and without GC a lot of 
people wouldn't even have considered it.


Lots of people were happy with Perl and Python before they 
added GC to catch cycles... Most applications don't leak a lot of 
memory to cyclic references, and they usually have to run for a 
while before it matters. (But constructing a worst case is easy, of course.)


(Btw, didn't mean to say that autorelease pools are the same as a 
region allocator, but they are similar in spirit.)



Go ahead, I'm sure it's fun. ;)


Oh, I didn't mean to say I have designed a language. I have many 
ideas and sketches, but far too many to implement and polish ;-).


I have started extending my knowledge of type systems, though; 
quite interesting. I think the change in computing power we now 
have is opening up many new interesting opportunities.





Re: D is crap

2016-07-11 Thread Chris via Digitalmars-d

On Monday, 11 July 2016 at 14:03:36 UTC, Infiltrator wrote:

On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:

...
To have GC was definitely a good decision. What was not so
good was that it was not optional with a simple on/off switch.
...


I know that I'm missing something here, but what's wrong with 
the functions provided in core.memory?  Specifically, 
GC.disable()?


I was thinking of a compiler switch (as they did in Objective-C), 
and had D been designed with `-nogc` in mind from the start, 
Phobos would be GC-free too. Going without the GC is still a bit 
rough around the edges.


Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 14:45:56 UTC, Paulo Pinto wrote:
The biggest problem with D isn't the GC, it's the lack of focus to 
make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, 
Java, C++17.


I knew you would chime in... Neither .NET, Swift nor Java should 
be considered system level tools. Ada/SPARK has a very narrow use 
case. Rust is still in its infancy. C++17 is not yet finished. 
But yes, C++ currently owns system level programming, C is 
losing terrain and Rust has an uncertain future.


The biggest problem with D is not GC, because we now have @nogc. 
But D is still lacking in memory management.




Re: D is crap

2016-07-11 Thread Paulo Pinto via Digitalmars-d
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
I bet you that if D hadn't had GC when it first came out, 
people would've mentioned manual memory management as a reason 
not to use GC. I never claimed that D was _propelled_ by GC, 
but that it was a feature that most users would expect. Not 
having it would probably have done more harm than having it.


Actually, I am certain that GC is a feature that _nobody_ would 
expect from a system level language, outside the Go-crowd.




I am no longer dabbling in D, but could not resist:

- UK Royal Navy with Algol 68 RS

- Xerox PARC with Mesa/Cedar

- DEC/Olivetti/Compaq with Modula-3

- ETHZ with Oberon, Oberon-2, Active Oberon, Component Pascal

- Microsoft with Spec#, System C# and the upcoming .NET Native C# 
7.0+ features
 (http://joeduffyblog.com/2015/12/19/safe-native-code/, 
https://www.infoq.com/news/2016/06/systems-programming-qcon)


- Astrobe with Oberon for micro-controllers (ARM Cortex-M4, 
Cortex-M3 and Xilinx FPGA Systems)

- PTC Perc Ultra with Java

- IS2T with their MicroEJ OS Java/C platform


The biggest problem with D isn't the GC, it's the lack of focus to 
make it stand out versus .NET Native, Swift, Rust, Ada, SPARK, Java, 
C++17.






Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 14:19:07 UTC, ketmar wrote:
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:
Actually, I am certain that GC is a feature that _nobody_ 
would expect from a system level language, outside the 
Go-crowd.


hello. i am the man born to ruin your world.


Of course, you are the extra 1% that comes on top of the other 
100%.




Re: D is crap

2016-07-11 Thread ketmar via Digitalmars-d
On Monday, 11 July 2016 at 13:56:30 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 12:18:26 UTC, ketmar wrote:
and most of those people never even started to use D. took a 
brief look, maybe wrote "helloworld", and that's all. it


Where do you get this from?


from reading this NG and other parts of teh internets.


Quite a few D programmers have gone to C++ and Rust.


quite a few people who tried D... and anyway, the reasons were 
more complex, and GC usually just a nice excuse.


 C is primarily used for portability/system support/interfacing 
or because you have an existing codebase.


this is mostly what "i can't stand GC" people want to do.

C is increasingly becoming a marginal language (narrow 
application area).


'cause manual memory management is PITA. not only due to this, of 
course, but this is still something.


Re: D is crap

2016-07-11 Thread ketmar via Digitalmars-d
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:
Actually, I am certain that GC is a feature that _nobody_ would 
expect from a system level language, outside the Go-crowd.


hello. i am the man born to ruin your world.


Re: D is crap

2016-07-11 Thread Chris via Digitalmars-d
On Monday, 11 July 2016 at 14:02:09 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
I bet you that if D hadn't had GC when it first came out, 
people would've mentioned manual memory management as a reason 
not to use GC. I never claimed that D was _propelled_ by GC, 
but that it was a feature that most users would expect. Not 
having it would probably have done more harm than having it.


Actually, I am certain that GC is a feature that _nobody_ would 
expect from a system level language, outside the Go-crowd.


Most certainly from a multi-purpose language. GC would have been 
demanded sooner or later. The mistake was not to make it optional 
from the beginning.


You focus on a small niche where people use all kinds of 
performance tricks even in C and C++. A lot of software doesn't 
care about GC overheads, however, and without GC a lot of people 
wouldn't even have considered it.


By the way, have you ever designed a language? I'd love to see 
what it would look like ;)


Most programmers have designed a DSL, so yes, obviously. If you 
are talking about a general purpose language then I wouldn't 
want to announce it until I was certain I got the basics right, 
like memory management.


Go ahead, I'm sure it's fun. ;)


Re: D is crap

2016-07-11 Thread Infiltrator via Digitalmars-d

On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:

...
To have GC was definitely a good decision. What was not so
good was that it was not optional with a simple on/off switch.
...


I know that I'm missing something here, but what's wrong with the 
functions provided in core.memory?  Specifically, GC.disable()?
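
For reference, a minimal sketch of what that looks like (assuming only that GC.disable pauses automatic collections while allocations still come from the GC heap):

=== nogc_switch.d ===
import core.memory : GC;

void main()
{
    GC.disable();               // suspend automatic collection cycles
    scope (exit) GC.enable();   // re-enable on the way out

    // Allocation still goes through the GC heap; it just won't be
    // collected automatically while collections are disabled.
    auto buf = new ubyte[](1024);

    GC.collect();               // an explicit collection can still be requested
}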


Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 13:24:14 UTC, Chris wrote:
I bet you that if D hadn't had GC when it first came out, 
people would've mentioned manual memory management as a reason 
not to use GC. I never claimed that D was _propelled_ by GC, 
but that it was a feature that most users would expect. Not 
having it would probably have done more harm than having it.


Actually, I am certain that GC is a feature that _nobody_ would 
expect from a system level language, outside the Go-crowd.


By the way, have you ever designed a language? I'd love to see 
what it would look like ;)


Most programmers have designed a DSL, so yes, obviously. If you are 
talking about a general purpose language then I wouldn't want to 
announce it until I was certain I got the basics right, like 
memory management.




Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 12:18:26 UTC, ketmar wrote:
and most of those people never even started to use D. took a 
brief look, maybe wrote "helloworld", and that's all. it


Where do you get this from? Quite a few D programmers have gone 
to C++ and Rust.


D *can* be used without GC. and it will still be "better C". it 
still will be less painful than C, but this is the price of 
doing "low-level things".


 C is primarily used for portability/system support/interfacing 
or because you have an existing codebase. Even Microsoft is now 
using higher level languages than C in parts of their system 
level code (operating system).


Btw, C has changed quite a bit, it is at C11 now and even has 
"generics"... but I doubt many will use it. C is increasingly 
becoming a marginal language (narrow application area).


Re: D is crap

2016-07-11 Thread Chris via Digitalmars-d
I bet you that if D hadn't had GC when it first came out, 
people would've mentioned manual memory management as a reason 
not to use GC. I never claimed that D was _propelled_ by GC, 
but that it was a feature that most users would expect. Not 
having it would probably have done more harm than having it.


By the way, have you ever designed a language? I'd love to see 
what it would look like ;)


[snip]


s/not to use GC/not to use D


Re: D is crap

2016-07-11 Thread Chris via Digitalmars-d
On Monday, 11 July 2016 at 11:59:51 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC 
when it first came out. It was expected of a (new) language to 
provide GC by then - and GC had become a selling point for new 
languages.


This is not true, it is just wishful thinking. D was harmed by 
the GC, not propelled by it. I am not missing any point, sorry. 
Just go look at what people who gave up on D claim to be a 
major reason, the GC scores high...



No. Having GC attracts more users, because they either explicitly 
want it or they don't care about the overhead. To have GC was 
definitely a good decision. What was not so good was that it was 
not optional with a simple on/off switch. Neither was it a good 
idea not to spend more time on ways to optimize the GC, so it was 
comparatively slow.


Keep in mind that the no GC crowd has very specialized needs 
(games, real time systems). Then again, to win this crowd over 
from C/C++ is not easy, regardless. And ... let's not forget that 
GC is often used as a handy excuse not to use D. "You don't use D 
because of a, b, c or because of GC?" - "Yeah, that one."


I bet you that if D hadn't had GC when it first came out, people 
would've mentioned manual memory management as a reason not to 
use GC. I never claimed that D was _propelled_ by GC, but that it 
was a feature that most users would expect. Not having it would 
probably have done more harm than having it.


By the way, have you ever designed a language? I'd love to see 
what it would look like ;)


[snip]


Re: D is crap

2016-07-11 Thread Luís Marques via Digitalmars-d

On Sunday, 10 July 2016 at 18:53:52 UTC, Jacob Carlborg wrote:
On OS X when an application segfaults a crash report will be 
generated. It's available in the Console application.


Doesn't seem to work for me on 10.11.5. Maybe you need to enable 
that on the latest OSes? In any case, that will probably get you 
a mangled stack trace, right? It would still be useful 
(especially if the stack trace is correct; in LLDB I get some 
crappy ones sometimes) but it would not be as convenient as the 
stack trace on Windows generated by the druntime.


Re: D is crap

2016-07-11 Thread ketmar via Digitalmars-d
On Monday, 11 July 2016 at 11:59:51 UTC, Ola Fosheim Grøstad 
wrote:
Just go look at what people who gave up on D claim to be a 
major reason, the GC scores high...


and most of those people never even started to use D. took a 
brief look, maybe wrote "helloworld", and that's all. it doesn't 
matter in this case which reason made 'em "turn away". if not GC, 
it would be something else: they just wanted their Ideal 
Language, and found that D is not it. those people just can't be 
satisfied, 'cause they are looking for something D isn't at all.


D *can* be used without GC. and it will still be "better C". it 
still will be less painful than C, but this is the price of doing 
"low-level things". or it can be used on a much higher level, 
where GC doesn't really matter anymore (and is actually desirable).


Re: D is crap

2016-07-11 Thread Luís Marques via Digitalmars-d

On Saturday, 9 July 2016 at 08:40:00 UTC, Walter Bright wrote:

On 7/8/2016 2:36 PM, Luís Marques wrote:

On Friday, 8 July 2016 at 21:26:19 UTC, Walter Bright wrote:
Only on Windows, and that's a common source of frustration 
for me :(


Linux too.


Not by default, right?


-g


Well, it doesn't work for me on Linux with the latest DMD, even 
with -g.
To be clear, the whole context was "Not by default, right? Only 
with the magic import and call."


Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 08:55:06 UTC, ketmar wrote:
On Monday, 11 July 2016 at 08:45:21 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 08:43:03 UTC, ketmar wrote:
On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
wrote:

There aren't many people you trust then...

exactly. 99% of people are idiots.


100%


it depends on the rounding mode.


101%


Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC 
when it first came out. It was expected of a (new) language to 
provide GC by then - and GC had become a selling point for new 
languages.


This is not true, it is just wishful thinking. D was harmed by 
the GC, not propelled by it. I am not missing any point, sorry. 
Just go look at what people who gave up on D claim to be a major 
reason, the GC scores high...



It wasn't demanding. I wrote a lot of code in Objective-C and 
it was perfectly doable.


Of course it was doable, but developers had trouble getting it 
right. In Objective-C Foundation you have to memorize what kind 
of ownership functions return, a responsibility which ARC relieves 
the developer of. Autorelease pools do not change that, and you 
have to take special measures to avoid running out of memory with 
autorelease pools, as an autorelease pool is a very simple 
region allocator (what Walter calls a bump allocator), so 
autorelease pools are not a generic solution.
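
To make the comparison concrete, here is a toy bump/region allocator in D; it is purely illustrative of the general idea (the names are made up), not of Apple's implementation. Allocation is just an offset increment, and the only way to reclaim memory is to reset the whole region, which is why it is not a general-purpose solution.

=== region.d ===
// A toy bump (region) allocator: allocate by advancing an offset,
// free everything at once by resetting it.
struct Region
{
    ubyte[] buffer;
    size_t used;

    void[] alloc(size_t size)
    {
        if (used + size > buffer.length)
            return null;                 // region exhausted; no individual frees
        auto p = buffer[used .. used + size];
        used += size;                    // "bump" the offset
        return p;
    }

    void reset() { used = 0; }           // drops everything allocated from the region
}

void main()
{
    auto region = Region(new ubyte[](4096));
    auto a = region.alloc(128);          // cheap: just an offset increment
    auto b = region.alloc(256);
    // ... use a and b ...
    region.reset();                      // the only "free": reset the whole region
}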


Objective-C had a very primitive manual RC solution that relied 
on conventions. They added a GC and ARC and only kept ARC. As 
simple as that.


C++ actually has much more robust memory management than what 
Objective-C had.




Re: D is crap

2016-07-11 Thread Chris via Digitalmars-d
On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad 
wrote:


Just like there is no C++ book that does not rant about how 
great RAII is... What do you expect from a language evangelist? 
The first Java implementation, HotSpot, inherited its technology 
from StrongTalk, a Smalltalk successor. It was not a Java 
phenomenon, and FWIW Lisp, Simula and Algol68 were all garbage 
collected.


Please stop intentionally missing the point. I don't care if 
Leonardo Da Vinci had already invented GC - which wouldn't 
surprise me - but this is not the point. My point is that GC 
became a big thing in the late '90s and early 2000s, which is in part 
owed to Java having become the religion of the day (not Lisp or 
Smalltalk)[1]. D couldn't have afforded not to have GC when it 
first came out. It was expected of a (new) language to provide GC 
by then - and GC had become a selling point for new languages.


[1] And of course computers had become more powerful and could 
handle the overhead of GC better than in the '80s.


What was "new" with Java was compile-once-run-everywhere. 
Although, that wasn't new either, but it was at least 
marketable as new.


Java was the main catalyst for GC - or at least for people 
demanding it. Practically everybody who had gone through IT 
courses, college etc. with Java (and there were loads) wanted 
GC. It was a given for many people.


Well, yes, of course Java being used in universities created a 
demand for Java and similar languages. But GC languages were 
extensively used in universities before Java.


Yes, it didn't last long. But the fact that they bothered to 
introduce it, shows you how big GC was/is.


No, it shows how demanding manual reference counting in 
Objective-C was on regular programmers. GC is the first go-to 
solution for easy memory management, and has been so since the 
'60s. Most high level languages use garbage collection.


It wasn't demanding. I wrote a lot of code in Objective-C and it 
was perfectly doable. You even have features like `autorelease` 
for return values. The thing is that Apple had become an 
increasingly popular platform and more and more programmers were 
writing code for OS X. So they thought they'd make it easier and 
reduce potential memory leaks (introduced by not so experienced 
Objective-C coders) by adding GC, especially because a lot of 
programmers expected GC "in this day and age".


Re: D is crap

2016-07-11 Thread ketmar via Digitalmars-d
On Monday, 11 July 2016 at 08:45:21 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 11 July 2016 at 08:43:03 UTC, ketmar wrote:
On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
wrote:

There aren't many people you trust then...

exactly. 99% of people are idiots.


100%


it depends on the rounding mode.


Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Monday, 11 July 2016 at 08:43:03 UTC, ketmar wrote:
On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
wrote:

There aren't many people you trust then...

exactly. 99% of people are idiots.


100%



Re: D is crap

2016-07-11 Thread ketmar via Digitalmars-d
On Monday, 11 July 2016 at 07:16:57 UTC, Ola Fosheim Grøstad 
wrote:

There aren't many people you trust then...

exactly. 99% of people are idiots.


Re: D is crap

2016-07-11 Thread Ola Fosheim Grøstad via Digitalmars-d

On Sunday, 10 July 2016 at 19:12:46 UTC, ketmar wrote:

then i won't trust a word they said.


There aren't many people you trust then... Seriously, in academic 
contexts a statement like «X is a garbage collected language» 
always means tracing. It would be very odd to assume that X used 
reference counting.





Re: D is crap

2016-07-10 Thread ketmar via Digitalmars-d
On Sunday, 10 July 2016 at 17:06:20 UTC, Ola Fosheim Grøstad 
wrote:

On Sunday, 10 July 2016 at 17:03:26 UTC, ketmar wrote:
On Sunday, 10 July 2016 at 16:58:49 UTC, Ola Fosheim Grøstad 
wrote:
I've never been to a lecture/presentation where "garbage 
collection" did not mean "tracing garbage collection".

then you probably watched some... wrong lectures. ;-)


Nah, they were experienced language designers and researchers.


then i won't trust a word they said.


Re: D is crap

2016-07-10 Thread Jacob Carlborg via Digitalmars-d

On 2016-07-08 19:07, Luís Marques wrote:


I was referring to the stack trace on segfault, but regarding the use
of debuggers on a Mac with D, most of the time it doesn't work very well
for me. I think the last time I used lldb (maybe last week), when I tried to
print something in a D program nothing would happen, not even an error.
Now that ldc is more up-to-date I'll check if that helps lldb get less
confused.


On OS X when an application segfaults a crash report will be generated. 
It's available in the Console application.


--
/Jacob Carlborg


Re: D is crap

2016-07-10 Thread Ola Fosheim Grøstad via Digitalmars-d

On Sunday, 10 July 2016 at 17:03:26 UTC, ketmar wrote:
On Sunday, 10 July 2016 at 16:58:49 UTC, Ola Fosheim Grøstad 
wrote:
I've never been to a lecture/presentation where "garbage 
collection" did not mean "tracing garbage collection".

then you probably watched some... wrong lectures. ;-)


Nah, they were experienced language designers and researchers.




Re: D is crap

2016-07-10 Thread ketmar via Digitalmars-d
On Sunday, 10 July 2016 at 16:58:49 UTC, Ola Fosheim Grøstad 
wrote:
I've never been to a lecture/presentation where "garbage 
collection" did not mean "tracing garbage collection".

then you probably watched some... wrong lectures. ;-)


Re: D is crap

2016-07-10 Thread Ola Fosheim Grøstad via Digitalmars-d

On Sunday, 10 July 2016 at 09:05:47 UTC, ketmar wrote:
On Sunday, 10 July 2016 at 09:04:25 UTC, Ola Fosheim Grøstad 
wrote:
Nothing to do with hipsters. The common interpretation for 
«garbage collection» in informal context has always been a 
tracing collector. I've never heard anything else in any 
informal CS context.


i always heard that "garbage collection" is garbage collection, 
and it is irrelevant which algorithms are used.


I've never been to a lecture/presentation where "garbage 
collection" did not mean "tracing garbage collection".  Attribute 
this to culture if you don't like it...




Re: D is crap

2016-07-10 Thread ketmar via Digitalmars-d
On Sunday, 10 July 2016 at 09:04:25 UTC, Ola Fosheim Grøstad 
wrote:
Nothing to do with hipsters. The common interpretation for 
«garbage collection» in informal context has always been a 
tracing collector. I've never heard anything else in any 
informal CS context.


i always heard that "garbage collection" is garbage collection, 
and it is irrelevant which algorithms are used.


Re: D is crap

2016-07-10 Thread Ola Fosheim Grøstad via Digitalmars-d

On Sunday, 10 July 2016 at 06:19:28 UTC, ketmar wrote:
On Sunday, 10 July 2016 at 02:02:23 UTC, Ola Fosheim Grøstad 
wrote:
Reference counting is a technique for collecting garbage, but 
the term «garbage collection» is typically used for techniques 
that catch cycles by tracing down chains of pointers:


i don't care about hipsters redefining the terms for arbitrary 
reasons. refcounting IS GC.


Nothing to do with hipsters. The common interpretation for 
«garbage collection» in informal context has always been a 
tracing collector. I've never heard anything else in any informal 
CS context.




Re: D is crap

2016-07-09 Thread ketmar via Digitalmars-d
On Sunday, 10 July 2016 at 02:02:23 UTC, Ola Fosheim Grøstad 
wrote:
Reference counting is a technique for collecting garbage, but 
the term «garbage collection» is typically used for techniques 
that catch cycles by tracing down chains of pointers:


i don't care about hipsters redefining the terms for arbitrary 
reasons. refcounting IS GC.


Re: D is crap

2016-07-09 Thread ketmar via Digitalmars-d
On Sunday, 10 July 2016 at 02:28:58 UTC, Ola Fosheim Grøstad 
wrote:

So what D needs is:

1. local garbage collection (for a single fiber or a facade to 
a graph).


2. solid global ownership management (for both resources and 
memory).


ketmar doesn't need that. even for his real-time audio engine and 
videogame engines. not a high priority then, and it adds A LOT of 
complexity (think about the complexity of passing values out of a 
thread/fiber). no, thanks.


Re: D is crap

2016-07-09 Thread ketmar via Digitalmars-d
On Sunday, 10 July 2016 at 02:08:36 UTC, Ola Fosheim Grøstad 
wrote:
No, manual reference counting is not particularly slow. 
Automatic reference counting is also not considered to be 
slower than GC.


i keep insisting that refcounting IS GC. please, stop call it 
something else.


Re: D is crap

2016-07-09 Thread Ola Fosheim Grøstad via Digitalmars-d

On Saturday, 9 July 2016 at 09:15:19 UTC, Chris wrote:
Yes, of course the "write-once-run-everywhere" fairy tale 
helped to spread Java, but while it was gaining traction GC 
became a feature everybody wanted. Sorry, but there is not a 
single book or introduction to Java that doesn't go on about 
how great GC is.


Just like there is no C++ book that does not rant about how great 
RAII is... What do you expect from a language evangelist? The 
first Java implementation, HotSpot, inherited its technology from 
StrongTalk, a Smalltalk successor. It was not a Java phenomenon, 
and FWIW Lisp, Simula and Algol68 were all garbage collected.


What was "new" with Java was compile-once-run-everywhere. 
Although, that wasn't new either, but it was at least marketable 
as new.


Java was the main catalyst for GC - or at least for people 
demanding it. Practically everybody who had gone through IT 
courses, college etc. with Java (and there were loads) wanted 
GC. It was a given for many people.


Well, yes, of course Java being used in universities created a 
demand for Java and similar languages. But GC languages were 
extensively used in universities before Java.


Yes, it didn't last long. But the fact that they bothered to 
introduce it, shows you how big GC was/is.


No, it shows how demanding manual reference counting in 
Objective-C was on regular programmers. GC is the first go-to 
solution for easy memory management, and has been so since the 
'60s. Most high level languages use garbage collection.




Re: D is crap

2016-07-09 Thread Ola Fosheim Grøstad via Digitalmars-d

On Saturday, 9 July 2016 at 11:27:13 UTC, ketmar wrote:
and with refcounting i have to *explicitly* mark all the code 
as "no refcounting here", or accept refcounting overhead for 
nothing.


That would be automatic reference counting ;-)... Reference 
counting is ok for shared ownership, but in most cases overkill. 
Garbage collection is also useful in some settings, e.g. in some 
types of graph manipulation. Where things go wrong for D is to 
use primitive global garbage collection. It would have worked out 
ok if it provided only primitive local garbage collection.


So what D needs is:

1. local garbage collection (for a single fiber or a facade to a 
graph).


2. solid global ownership management (for both resources and 
memory).


Most newbies can then happily write single-threaded code as 
usual. More advanced programmers need to deal with shared 
ownership. Which they might have to do anyway, since garbage 
collection does not handle resources.




Re: D is crap

2016-07-09 Thread Ola Fosheim Grøstad via Digitalmars-d

On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:

On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:

removed the GC

...

replaced it with automatic reference counting.


you *do* know that refcounting *is* GC, do you? ;-)


And that's a very important point, because the choice of RC vs 
other types of GC ignores the fact that they're both GC, and 
old school programmers didn't want anything to do with a 
"feature" that would slow down their code. RC would have been 
an even worse choice when D started because it is [claimed to 
be] slower than other types of GC.


No, manual reference counting is not particularly slow. Automatic 
reference counting is also not considered to be slower than GC.


Reference counting is not capable of catching cyclic references, 
which is why garbage collection is considered to be a more 
general solution to the problem.


This is pretty much memory management 101.
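
A minimal D sketch of the classic failure case (the hand-rolled counts and type names here are made up for illustration): once two objects hold counted references to each other, neither count reaches zero, so pure reference counting never reclaims them, while a tracing collector would see they are unreachable from the roots.

=== cycle.d ===
// Toy hand-rolled reference counting, just to show the cycle problem.
class Node
{
    Node other;     // counted reference to a peer
    int refs = 1;   // starts out owned by its creator
}

void retain(Node n)  { if (n !is null) n.refs++; }
void release(Node n) { if (n !is null && --n.refs == 0) { /* would free here */ } }

void main()
{
    auto a = new Node;
    auto b = new Node;

    a.other = b; retain(b);   // a holds b
    b.other = a; retain(a);   // b holds a -> cycle

    // The creator drops its references...
    release(a);               // a.refs: 2 -> 1
    release(b);               // b.refs: 2 -> 1

    // ...but both counts stay at 1 because of the cycle, so pure
    // reference counting never frees them; a tracing GC would.
}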



Re: D is crap

2016-07-09 Thread Ola Fosheim Grøstad via Digitalmars-d

On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:

removed the GC

...

replaced it with automatic reference counting.


you *do* know that refcounting *is* GC, do you? ;-)


Reference counting is a technique for collecting garbage, but the 
term «garbage collection» is typically used for techniques that 
catch cycles by tracing down chains of pointers:


https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)#Tracing_garbage_collectors



Re: D is crap

2016-07-09 Thread Chris via Digitalmars-d

On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:

On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:

removed the GC

...

replaced it with automatic reference counting.


you *do* know that refcounting *is* GC, do you? ;-)


And that's a very important point, because the choice of RC vs 
other types of GC ignores the fact that they're both GC, and 
old school programmers didn't want anything to do with a 
"feature" that would slow down their code. RC would have been 
an even worse choice when D started because it is [claimed to 
be] slower than other types of GC. It's been a long time now, 
but I don't recall many arguments against Java's GC because of 
pauses. The objection was always that it would make the code 
run more slowly.


I remember reading an article by Apple about their GC in 
Objective-C and they said that it was a generational GC and that 
some objects would not be collected at all (if too old), if I 
remember correctly. Apparently it wasn't good enough, but that's 
about 7 years ago, so my memory might have been freed of some 
details :-)


Re: D is crap

2016-07-09 Thread ketmar via Digitalmars-d

On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:
p.s. also, it is funny that D's GC is actually *better* if one wants to 
avoid GC completely, yet people continue to ask for refcounting.


i meant: if i don't want to use GC in D, it is as easy as avoiding 
`new` (and delegates with closures). any code that processes 
allocated objects, but never allocates itself, doesn't need to be 
changed at all.


and with refcounting i have to *explicitly* mark all the code as 
"no refcounting here", or accept refcounting overhead for nothing.


Re: D is crap

2016-07-09 Thread ketmar via Digitalmars-d

On Saturday, 9 July 2016 at 11:10:22 UTC, bachmeier wrote:
The objection was always that it would make the code run more 
slowly.


i tend to ignore such persons completely after such a claim: they 
are obviously incompetent as programmers.


i also tend to ignore the whole "@nogc" movement: it is just a failed 
marketing strategy, which (sadly) tends to consume a lot of 
resources even today.


Re: D is crap

2016-07-09 Thread bachmeier via Digitalmars-d

On Saturday, 9 July 2016 at 08:06:54 UTC, ketmar wrote:
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:

removed the GC

...

replaced it with automatic reference counting.


you *do* know that refcounting *is* GC, do you? ;-)


And that's a very important point, because the choice of RC vs 
other types of GC ignores the fact that they're both GC, and old 
school programmers didn't want anything to do with a "feature" 
that would slow down their code. RC would have been an even worse 
choice when D started because it is [claimed to be] slower than 
other types of GC. It's been a long time now, but I don't recall 
many arguments against Java's GC because of pauses. The objection 
was always that it would make the code run more slowly.


Re: D is crap

2016-07-09 Thread Chris via Digitalmars-d
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:

On Friday, 8 July 2016 at 22:25:37 UTC, Chris wrote:
after Java. And D was invented when GC was expected by many 
people.


The GC was by far the most criticised feature of D...



GC was a big selling point. Every Java book went on about how


Err... no, the big selling point that gave Java traction was 
portability and Java being marketed as designed for the 
internet and web. GC languages were already available and in 
use, but the JVM/.NET made it difficult for commercial 
development platforms. Portability and Microsoft's dominance 
was a big issue back then.




Yes, of course the "write-once-run-everywhere" fairy tale helped 
to spread Java, but while it was gaining traction GC became a 
feature everybody wanted. Sorry, but there is not a single book 
or introduction to Java that doesn't go on about how great GC is. 
Java was the main catalyst for GC - or at least for people 
demanding it. Practically everybody who had gone through IT 
courses, college etc. with Java (and there were loads) wanted GC. 
It was a given for many people.


blah ... Apple even added GC to Objective-C to appease the GC 
crowd.


Apple removed the GC rather quickly for the same reasons that 
makes GC a bad choice for D. And replaced it with automatic 
reference counting.


Yes, it didn't last long. But the fact that they bothered to 
introduce it, shows you how big GC was/is.


Re: D is crap

2016-07-09 Thread Walter Bright via Digitalmars-d

On 7/8/2016 2:36 PM, Luís Marques wrote:

On Friday, 8 July 2016 at 21:26:19 UTC, Walter Bright wrote:

Only on Windows, and that's a common source of frustration for me :(


Linux too.


Not by default, right?


-g



Re: D is crap

2016-07-09 Thread ketmar via Digitalmars-d
On Saturday, 9 July 2016 at 07:52:57 UTC, Ola Fosheim Grøstad 
wrote:

removed the GC

...

replaced it with automatic reference counting.


you *do* know that refcounting *is* GC, do you? ;-)


Re: D is crap

2016-07-09 Thread Ola Fosheim Grøstad via Digitalmars-d

On Friday, 8 July 2016 at 22:25:37 UTC, Chris wrote:
after Java. And D was invented when GC was expected by many 
people.


The GC was by far the most criticised feature of D...



GC was a big selling point. Every Java book went on about how


Err... no, the big selling point that gave Java traction was 
portability and Java being marketed as designed for the internet 
and web. GC languages were already available and in use, but the 
JVM/.NET made it difficult for commercial development platforms. 
Portability and Microsoft's dominance was a big issue back then.



blah ... Apple even added GC to Objective-C to appease the GC 
crowd.


Apple removed the GC rather quickly for the same reasons that 
makes GC a bad choice for D. And replaced it with automatic 
reference counting.




Re: D is crap

2016-07-08 Thread Chris via Digitalmars-d

On Friday, 8 July 2016 at 21:53:58 UTC, Ola Fosheim Grøstad wrote:

On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:
As for GC, it's hard to tell. When D was actually (not 
hypothetically) created, GC was _the_ big thing. Java had just 
taken off, people were pissed off with C/C++, programming and 
coding was becoming more and more common.


Errr... Garbage collection was common since the 60s.


Which is not the point. My point was that everybody wanted GC 
after Java. And D was invented when GC was expected by many 
people.


One problem with GC in the late 80s and early 90s is that it 
requires twice as much memory and memory was scarce so 
reference counting was/is the better option. You could make the 
same argument about templates, memory...


Which is why D wouldn't have taken off in the '80s (see my post 
above).


I also don't recall anyone being in awe of Java having GC. The 
big selling point was portability and the very hyped up idea 
that Java would run well in the browser, which did not 
materialize. Another selling point was that it wasn't 
Microsoft...


GC was a big selling point. Every Java book went on about how 
much safer it is, that you have more time for productive code, 
blah ... Apple even added GC to Objective-C to appease the GC 
crowd.





Re: D is crap

2016-07-08 Thread Ola Fosheim Grøstad via Digitalmars-d

On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:
As for GC, it's hard to tell. When D was actually (not 
hypothetically) created, GC was _the_ big thing. Java had just 
taken off, people were pissed off with C/C++, programming and 
coding was becoming more and more common.


Errr... Garbage collection was common since the 60s. One problem 
with GC in the late 80s and early 90s is that it requires twice 
as much memory and memory was scarce so reference counting was/is 
the better option. You could make the same argument about 
templates, memory...


I also don't recall anyone being in awe of Java having GC. The 
big selling point was portability and the very hyped up idea that 
Java would run well in the browser, which did not materialize. 
Another selling point was that it wasn't Microsoft...





Re: D is crap

2016-07-08 Thread Luís Marques via Digitalmars-d

On Friday, 8 July 2016 at 21:26:19 UTC, Walter Bright wrote:
Only on Windows, and that's a common source of frustration for 
me :(


Linux too.


Not by default, right? Only with the magic import and call. 
That's certainly better than on OS X, where there's no segfault 
handler at all (I don't think there's anything wrong with using 
it for a debug build), but it's something a bit obscure that is 
often not enabled when a segfault crash appears by surprise.


Re: D is crap

2016-07-08 Thread Walter Bright via Digitalmars-d

On 7/8/2016 8:17 AM, Luís Marques wrote:

On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:

If the program is compiled with -g and it crashes (seg faults) you'll usually
at least get a stack trace. Running it under a debugger will get you much more
information.


Only on Windows, and that's a common source of frustration for me :(


Linux too.


Re: D is crap

2016-07-08 Thread ketmar via Digitalmars-d
p.s. it's not something specific to D. any program that 
"catches" segfaults by itself should be burnt with fire.


Re: D is crap

2016-07-08 Thread ketmar via Digitalmars-d

On Friday, 8 July 2016 at 17:04:04 UTC, Luís Marques wrote:

On Friday, 8 July 2016 at 15:31:53 UTC, ketmar wrote:

core.exception.AssertError@z00.d(2): BOOM!


what am i doing wrong? O_O


That's an exception, not a segfault.
Try something like int* x; *x = 42;


segfault is impossible to catch outside of debugger. any hackish 
"solution" to this is WRONG.


Re: D is crap

2016-07-08 Thread Luís Marques via Digitalmars-d

On Friday, 8 July 2016 at 16:08:42 UTC, bachmeier wrote:
Yep. If you're going to pick any feature to use to sell a new 
language, lack of GC is the worst. The only ones that care (and 
it's a small percentage) are the ones that are least likely to 
switch due to their existing tools, libraries, and knowledge.


I said strong support for nogc (e.g. easy to do things with the 
stdlib without allocating), not that GC would not be available.


Re: D is crap

2016-07-08 Thread Luís Marques via Digitalmars-d

On Friday, 8 July 2016 at 15:30:12 UTC, Schrom, Brian T wrote:

I've had reasonable success using lldb on mac.


I was referring to the stack trace on segfault, but regarding the 
use of debuggers on a Mac with D, most of the time it doesn't 
work very well for me. I think the last time I used lldb (maybe last 
week), when I tried to print something in a D program nothing 
would happen, not even an error. Now that ldc is more up-to-date 
I'll check if that helps lldb get less confused.


Re: D is crap

2016-07-08 Thread Chris via Digitalmars-d

On Friday, 8 July 2016 at 16:08:42 UTC, bachmeier wrote:

On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:

As for GC, it's hard to tell. When D was actually (not 
hypothetically) created, GC was _the_ big thing. Java had just 
taken off, people were pissed off with C/C++, programming and 
coding was becoming more and more common. Not having GC might 
actually have been a drawback back in the day. People would 
have complained that "Ah, D is like C++, no automatic memory 
management, I might as well stick to C++ or go for Java!" So 
no, I think D is where it is, because things are like they 
are, and "what if" discussions are useless. D has to keep on 
keeping on, there's no magic.


Yep. If you're going to pick any feature to use to sell a new 
language, lack of GC is the worst. The only ones that care (and 
it's a small percentage) are the ones that are least likely to 
switch due to their existing tools, libraries, and knowledge.


True. The last sentence is something to bear in mind whenever we 
discuss attracting more people. If someone is really into C++ 
bare metal micro-optimization kinda stuff, we won't win him/her 
over with "no GC". As you said, they're the least likely to 
switch for said reasons. To be able to opt out of GC is still 
important, but it's not that we will attract thousands and 
thousands of new users because of that.


Re: D is crap

2016-07-08 Thread Luís Marques via Digitalmars-d

On Friday, 8 July 2016 at 15:31:53 UTC, ketmar wrote:

core.exception.AssertError@z00.d(2): BOOM!


what am i doing wrong? O_O


That's an exception, not a segfault.
Try something like int* x; *x = 42;
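
A minimal complete version of that suggestion (it dereferences a null pointer, so the process dies with a plain segfault and druntime prints no trace by default):

=== segv.d ===
void main()
{
    int* x;     // null pointer
    *x = 42;    // SIGSEGV: the OS kills the process; no druntime stack trace
}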



Re: D is crap

2016-07-08 Thread Schrom, Brian T via Digitalmars-d


On 7/8/16 8:22 AM, Luís Marques via Digitalmars-d wrote:
> On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
>> If the program is compiled with -g and it crashes (seg faults)
>> you'll usually at least get a stack trace. Running it under a
>> debugger will get you much more information.
>
> Only on Windows, and that's a common source of frustration for me
> :(
>
I've had reasonable success using lldb on mac.



Re: D is crap

2016-07-08 Thread bachmeier via Digitalmars-d

On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:

As for GC, it's hard to tell. When D was actually (not 
hypothetically) created, GC was _the_ big thing. Java had just 
taken off, people were pissed off with C/C++, programming and 
coding was becoming more and more common. Not having GC might 
actually have been a drawback back in the day. People would 
have complained that "Ah, D is like C++, no automatic memory 
management, I might as well stick to C++ or go for Java!" So 
no, I think D is where it is, because things are like they are, 
and "what if" discussions are useless. D has to keep on keeping 
on, there's no magic.


Yep. If you're going to pick any feature to use to sell a new 
language, lack of GC is the worst. The only ones that care (and 
it's a small percentage) are the ones that are least likely to 
switch due to their existing tools, libraries, and knowledge.




Re: D is crap

2016-07-08 Thread ketmar via Digitalmars-d

On Friday, 8 July 2016 at 15:17:33 UTC, Luís Marques wrote:

On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
If the program is compiled with -g and it crashes (seg faults) 
you'll usually at least get a stack trace. Running it under a 
debugger will get you much more information.


Only on Windows, and that's a common source of frustration for 
me :(


=== z00.d ===
void func () {
  assert(0, "BOOM!");
}

void main () {
  func();
}

# dmd -g z00.d
# ./z00

core.exception.AssertError@z00.d(2): BOOM!

??:? _d_assert_msg [0xb7534687]
z00.d:2 void z00.func() [0x80489f2]
z00.d:6 _Dmain [0x80489ff]
??:? rt.dmain2._d_run_main(int, char**, extern (C) int 
function(char[][])*).runAll().__lambda1() [0xb7566326]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int 
function(char[][])*).tryExec(scope void delegate()) [0xb75661a0]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int 
function(char[][])*).runAll() [0xb75662d3]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int 
function(char[][])*).tryExec(scope void delegate()) [0xb75661a0]

??:? _d_run_main [0xb75660ff]
??:? main [0x8048a83]
??:? __libc_start_main [0xb6f3f696]

what am i doing wrong? O_O


Re: D is crap

2016-07-08 Thread Luís Marques via Digitalmars-d

On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
If the program is compiled with -g and it crashes (seg faults) 
you'll usually at least get a stack trace. Running it under a 
debugger will get you much more information.


Only on Windows, and that's a common source of frustration for me 
:(


Re: D is crap

2016-07-08 Thread Chris via Digitalmars-d

On Friday, 8 July 2016 at 01:17:55 UTC, Luís Marques wrote:

Sometimes I idly wonder what would have happened if D were 
available in the 80's. Sort of like if you put a modern car 
for sale in the 1960's.


I've also thought about that from time to time. I think D would 
have been very "mainstream-successful". Starting from where it 
actually started, I think things have worked out well for D, 
despite its still limited success. Looking back all of these 
years I think that D's marketing mistake was the garbage 
collection. Given its target audience and design trade-offs, I 
believe adoption of the language was disproportionally affected 
by that choice. If D had started with stronger support for 
nogc, even at the cost of delaying some other nice features, I 
believe adoption would have been quite stronger (and more 
easily snowballed) -- irrespective of the actual engineering 
merit of that D variant vs the true D. (it would also have 
avoided all the current piecemeal work of trying to remove GC 
allocation from Phobos, etc.; also, notice that nogc marketing 
would probably have been even more important in the 80s).


This is a futile discussion. D is in many respects a "hindsight 
language" with regard to C/C++.[1] People naturally lacked that hindsight 
back in the '80s, and a lot of D's features would have been 
frowned upon as "Don't need it!" (templates), "Waste of memory!" 
(e.g. `array.length`) etc. And remember, computers and computing 
power were not as common as they are today. You were also dealing 
with a different crowd: there are far more programmers around 
now than there were in the '80s, with different 
expectations. In the '80s most programmers were either hard-core 
nerds (hence the nerdy image programmers have) or people who had 
lost their jobs elsewhere and had gone through retraining 
programs to become programmers, and thus were not really 
interested in the matter.


As for GC, it's hard to tell. When D was actually (not 
hypothetically) created, GC was _the_ big thing. Java had just 
taken off, people were pissed off with C/C++, programming and 
coding was becoming more and more common. Not having GC might 
actually have been a drawback back in the day. People would have 
complained that "Ah, D is like C++, no automatic memory 
management, I might as well stick to C++ or go for Java!" So no, 
I think D is where it is, because things are like they are, and 
"what if" discussions are useless. D has to keep on keeping on, 
there's no magic.


[1] Sometimes I think that D should be careful not to become a 
language looked back on by yet another "hindsight language".


Re: D is crap

2016-07-07 Thread deadalnix via Digitalmars-d

On Wednesday, 6 July 2016 at 04:56:07 UTC, Walter Bright wrote:
It's certainly doable, but in an age of priorities I suspect 
the time is better spent on


\o/


improving 64 bit code generation.


/o\



Re: D is crap

2016-07-07 Thread Luís Marques via Digitalmars-d

On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
Thanks for taking the time to write this. Let me see if I can 
help.


Wow, this was very well handled. Thanks for keeping your head 
cool and answering in a constructive, friendly and informative 
manner. It's even more admirable coming from someone who says he 
used not to be exactly the most "people person" / good boss / 
group leader, or whatever the expression was.


Sometimes I idly wonder what would have happened if D were 
available in the 80's. Sort of like if you put a modern car for 
sale in the 1960's.


I've also thought about that from time to time. I think D would 
have been very "mainstream-successful". Starting from where it 
actually started, I think things have worked out well for D, 
despite its still limited success. Looking back all of these 
years I think that D's marketing mistake was the garbage 
collection. Given its target audience and design trade-offs, I 
believe adoption of the language was disproportionally affected 
by that choice. If D had started with stronger support for nogc, 
even at the cost of delaying some other nice features, I believe 
adoption would have been quite stronger (and more easily 
snowballed) -- irrespective of the actual engineering merit of 
that D variant vs the true D. (it would also have avoided all the 
current piecemeal work of trying to remove GC allocation from 
Phobos, etc.; also, notice that nogc marketing would probably 
have been even more important in the 80s).


Re: D is crap

2016-07-07 Thread Jacob Carlborg via Digitalmars-d

On 06/07/16 07:01, Walter Bright wrote:


Apple has dropped all 32 bit support


No. For ARM, 32-bit is still relevant. On OS X the Simulator (used to test 
iOS applications) runs the iOS applications as x86 (both 32- and 
64-bit) even though the iOS devices themselves run ARM.


Apparently some users are still running 32-bit applications on OS X 
because they have plugins that are only available as 32-bit; think audio 
and music software.


--
/Jacob Carlborg


Re: D is crap

2016-07-06 Thread ketmar via Digitalmars-d

On Wednesday, 6 July 2016 at 10:26:27 UTC, qznc wrote:

If you want to distribute a binary


gods save me! why should i do that? i am a GPL fanatic. and if i'm 
doing contract work, i know what machines my contractor will have.


for x86 you only have the 386 instructions. Ok, 686 is probably 
common enough today. For more special instructions, you could 
guard them and provide a fallback.


nope. just write in the readme: "you need at least a Nehalem-grade 
CPU to run this". maybe check it at startup time and fail if the CPU 
is too old. that's all, several lines of code. no different 
from demanding a 64-bit system. also note that some 64-bit systems 
can run 32-bit apps, but not vice versa.
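
Such a startup check really can be just a few lines; a minimal sketch, 
assuming druntime's core.cpuid exposes the SSE4.2 flag (used here only 
as a rough proxy for "Nehalem-grade"):

import core.cpuid : sse42;
import core.stdc.stdio : printf;
import core.stdc.stdlib : exit;

void main () {
  if (!sse42) {              // SSE4.2 first appeared with Nehalem
    printf("sorry, this program needs a Nehalem-grade CPU or newer\n");
    exit(1);
  }
  // ... the rest of the program can use the newer instructions freely ...
}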


GCC has a switch (-mx32) to store pointers as 32bit on a 64bit 
system. That is probably very close to what you want.


except that i should either build everything with that flag and 
hope for the best, or pay for the things i don't need.


Re: D is crap

2016-07-06 Thread ketmar via Digitalmars-d

$subj.


Re: D is crap

2016-07-06 Thread Guillaume Piolat via Digitalmars-d

On Wednesday, 6 July 2016 at 04:56:07 UTC, Walter Bright wrote:


It's certainly doable, but in an age of priorities I suspect 
the time is better spent on improving 64 bit code generation.


It's not like it is a blocker for anyone; there are:
- assembly
- auto-vectorization
- LDC

Not worth your time (and the backend risk!) imho.


Re: D is crap

2016-07-06 Thread qznc via Digitalmars-d

On Wednesday, 6 July 2016 at 01:30:46 UTC, ketmar wrote:
and i'm curious why everyone is so amazed by 64-bit systems. 
none of my software is using more than 2GB of RAM. why should i 
pay for something i don't need? like, all pointers are 
magically twice bigger. hello, cache lines, i have a present 
for you!


The advantage of compiling for AMD64 is that the compiler can 
assume a lot of extensions like the SSE bunch. If you want to 
distribute a binary for x86 you only have the 386 instructions. 
Ok, 686 is probably common enough today. For more special 
instructions, you could guard them and provide a fallback.


One example: to convert a floating point value to an integer on the 386, 
you need to store it to memory and load it again. That makes sense if 
your floating point stuff is handled by a co-processor, but today 
this is completely integrated. SSE added an extra instruction to 
avoid the memory detour.
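
Roughly, the two code paths look like this (a sketch using DMD-style 
inline asm; extern(C) and the function names are illustrative, and the 
32-bit x86 build is assumed so the float parameter is addressable on the 
stack; the SSE variant of course needs an SSE-capable CPU):

version (D_InlineAsm_X86)
{
    extern(C) int toIntViaX87 (float f) {
        int r;
        asm {
            fld f;       // value lives on the x87 register stack...
            fistp r;     // ...and the only way down is a store to memory
        }
        return r;
    }

    extern(C) int toIntViaSSE (float f) {
        int r;
        asm {
            cvttss2si EAX, f;   // one instruction, no memory round-trip
            mov r, EAX;
        }
        return r;
    }

    void main () {
        import core.stdc.stdio : printf;
        // note: fistp honours the current x87 rounding mode (default:
        // round-to-nearest), while cvttss2si always truncates
        printf("%d %d\n", toIntViaX87(3.25f), toIntViaSSE(3.25f));
    }
}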


GCC has a switch (-mx32) to store pointers as 32bit on a 64bit 
system. That is probably very close to what you want.


Re: D is crap

2016-07-05 Thread Walter Bright via Digitalmars-d

On 7/5/2016 6:30 PM, ketmar wrote:

I'm curious about why you require 32 bits.

and i'm curious why everyone is so amazed by 64-bit systems. none of my software
is using more than 2GB of RAM. why should i pay for something i don't need?
like, all pointers are magically twice bigger. hello, cache lines, i have a
present for you!


I agree, 64 bits does have its drawbacks.

Apple has dropped all 32 bit support, Ubuntu is preparing to. I wonder when 
Windows will drop it like they did 16 bit support - they'll probably be the last 
to do so, as they value legacy compatibility more than anyone else.


But it's coming. We should be investing in the future with our limited 
resources.


Re: D is crap

2016-07-05 Thread Walter Bright via Digitalmars-d

On 7/5/2016 6:06 PM, deadalnix wrote:

On Tuesday, 5 July 2016 at 23:56:48 UTC, Walter Bright wrote:

On 7/5/2016 2:44 PM, ketmar wrote:

anyway, fixing the long-standing bug with `align()` being ignored on stack variables
will allow SIMD to be used on x86.


Not really. The alignment requirement has to be done by all functions, whether
they use SIMD or not.



The Intel performance optimization manual has some nice tricks for mixing code with
different stack alignments.

You may want to check that out. Sadly, I can't find the article right now, but
mostly it boils down to:
 - as the stack grows down, you can mask the stack pointer at function entry to
get it aligned.
 - if both the caller and the callee need alignment, the caller can call/jump past
the masking instructions, directly into the meat of the callee.



The trouble with that is you lose the ability to use EBP as a general purpose 
register, as you'll need EBP to point to the parameters and ESP to point to the 
locals.


It's a complex and disruptive change to the code generator.

It's certainly doable, but in an age of priorities I suspect the time is better 
spent on improving 64 bit code generation.


Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

p.s. *heat. ;-)

p.p.s. and i can use SIMD with DMD built-in asm, of course. 
that's what i did in Follin, and it works like a charm. but, of 
course, the code is completely unportable -- and this is 
something i wanted to avoid...


Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Wednesday, 6 July 2016 at 03:23:18 UTC, Basile B. wrote:
while using a 64 bit linux will help you, dancing naked in the 
street won't.


it will really help me to head my house, yes. so you're proposing that 
i rebuild the world, and all my custom-built software (a lot!), 
for... for nothing, as (i said it before) none of the apps i'm 
using require more than 2GB of RAM. and SIMD instructions are 
perfectly usable in 32-bit mode too -- with anything except DMD. 
so you're proposing that i fuck up my whole system to work around a DMD 
bug/limitation. great. this is even more stupid than naked dances.


Re: D is crap

2016-07-05 Thread Basile B. via Digitalmars-d

On Wednesday, 6 July 2016 at 02:34:04 UTC, ketmar wrote:

On Wednesday, 6 July 2016 at 02:10:09 UTC, Basile B. wrote:
ok, bad bet, but why do you insist on DMD 32-bit SIMD support? 
can't you use a 64-bit Linux distribution?


i can even dance naked on the street, no problems. but i just 
can't see a reason to do that.


That's a bad analogy. You pair two unrelated possibilities in order 
to make the first one look stupid. But it's the other way around: 
it's the analogy that's stupid, not the proposition you refer to, 
because while using a 64-bit Linux will help you, dancing naked in 
the street won't.


Maybe this kind of reasoning will impress a child or someone a bit 
gullible, but seriously, you can't win when the solution is so 
obvious.


Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Wednesday, 6 July 2016 at 02:10:09 UTC, Basile B. wrote:
ok, bad bet, but why do you insist on DMD 32-bit SIMD support? 
can't you use a 64-bit Linux distribution?


i can even dance naked on the street, no problems. but i just 
can't see a reason to do that.


Re: D is crap

2016-07-05 Thread Basile B. via Digitalmars-d

On Wednesday, 6 July 2016 at 01:27:11 UTC, ketmar wrote:

On Tuesday, 5 July 2016 at 23:50:35 UTC, Basile B. wrote:

Major Linux distributions...

...

Are you on Windows, Ketmar?


no. GNU/Linux here. and i don't care what shitheads from "major 
linux distributions" may think.


ok, bad bet, but why do you insist on DMD 32-bit SIMD support? 
can't you use a 64-bit Linux distribution?


Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Tuesday, 5 July 2016 at 23:56:48 UTC, Walter Bright wrote:

On 7/5/2016 2:44 PM, ketmar wrote:
anyway, fixing the long-standing bug with `align()` being ignored 
on stack variables will allow SIMD to be used on x86.


Not really. The alignment requirement has to be done by all 
functions, whether they use SIMD or not.


nope. it should be done only for the data that participates in 
SIMD. and this can be perfectly well solved by fixing the `align()` 
issues. some bit operations to adjust ESP won't hurt if people 
want SIMD: SIMD will give a much bigger win.


4. people wanting high performance are going to be using 64 
bits anyway

so i'm not in a set of "people". ok.


I'm curious about why you require 32 bits.


and i'm curious why everyone is so amazed by 64-bit systems. none 
of my software is using more than 2GB of RAM. why should i pay 
for something i don't need? like, all pointers are magically 
twice bigger. hello, cache lines, i have a present for you!


Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Tuesday, 5 July 2016 at 23:50:35 UTC, Basile B. wrote:

Major Linux distributions...

...

Are you on Windows, Ketmar?


no. GNU/Linux here. and i don't care what shitheads from "major 
linux distributions" may think.


Re: D is crap

2016-07-05 Thread deadalnix via Digitalmars-d

On Tuesday, 5 July 2016 at 23:56:48 UTC, Walter Bright wrote:

On 7/5/2016 2:44 PM, ketmar wrote:
anyway, fixing the long-standing bug with `align()` being ignored 
on stack variables will allow SIMD to be used on x86.


Not really. The alignment requirement has to be done by all 
functions, whether they use SIMD or not.




The Intel performance optimization manual has some nice tricks for 
mixing code with different stack alignments.


You may want to check that out. Sadly, I can't find the article 
right now, but mostly it boils down to this (see the sketch below):
 - as the stack grows down, you can mask the stack pointer at 
function entry to get it aligned.
 - if both the caller and the callee need alignment, the caller can 
call/jump past the masking instructions, directly into the meat of 
the callee.
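
The same masking arithmetic is easy to demonstrate on a stack buffer in 
plain D (masking ESP itself needs code-generator support, which is 
exactly what is being discussed); a small sketch with illustrative names:

void useAlignedScratch () {
    ubyte[64 + 15] raw = void;   // over-allocate a scratch buffer on the stack
    // round the address up to the next 16-byte boundary: the same mask
    // trick, just applied to a buffer address instead of to ESP
    auto p = cast(ubyte*)((cast(size_t)raw.ptr + 15) & ~cast(size_t)15);
    assert((cast(size_t)p & 15) == 0);
    // p .. p + 64 can now be used as 16-byte-aligned SIMD scratch space
}

void main () { useAlignedScratch(); }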




Re: D is crap

2016-07-05 Thread Walter Bright via Digitalmars-d

On 7/5/2016 2:44 PM, ketmar wrote:

anyway, fixing the long-standing bug with `align()` being ignored on stack variables
will allow SIMD to be used on x86.


Not really. The alignment requirement has to be done by all functions, whether 
they use SIMD or not.




4. people wanting high performance are going to be using 64 bits anyway

so i'm not in a set of "people". ok.


I'm curious about why you require 32 bits.


Re: D is crap

2016-07-05 Thread Basile B. via Digitalmars-d

On Tuesday, 5 July 2016 at 22:38:29 UTC, Seb wrote:

On Tuesday, 5 July 2016 at 21:44:17 UTC, ketmar wrote:

On Tuesday, 5 July 2016 at 20:27:58 UTC, Walter Bright wrote:
4. people wanting high performance are going to be using 64 
bits anyway


so i'm not in a set of "people". ok.


It might be a good time to think about your hardware. Btw there 
is a recent announcement that Ubuntu and others will drop 
32-bit support quite soon.

http://slashdot.org/story/313313

Here is a copy - the same arguments apply also for performance 
features.


Major Linux distributions are in agreement: it's time to stop 
developing new versions for 32-bit processors. Simply: it's a 
waste of time, both to create the 32-bit port, and to keep 
32-bit hardware around to test it on. At the end of June, 
Ubuntu developer Dimitri Ledkov chipped into the debate with 
this mailing list post, saying bluntly that 32-bit ports are a 
waste of resources. "Building i386 images is not 'for free', 
it comes at the cost of utilising our build farm, QA and 
validation time. Whilst we have scalable build-farms, i386 
still requires all packages, autopackage tests, and ISOs to be 
revalidated across our infrastructure." His proposal is that 
Ubuntu version 18.10 would be 64-bit-only, and if users 
desperately need to run 32-bit legacy applications, they'll 
have to do so in containers or virtual machines. [...] In a 
forum thread, the OpenSUSE Chairman account says 32-bit 
support "doubles our testing burden (actually, more so, do you 
know how hard it is to find 32-bit hardware these days?). It 
also doubles our build load on OBS".


I bet it's not a hardware thing but rather an OS thing. People on 
Windows mostly use 32-bit DMD because the 64-bit version requires 
the MS VS environment.


Are you on Windows, Ketmar?


Re: D is crap

2016-07-05 Thread Seb via Digitalmars-d

On Tuesday, 5 July 2016 at 21:44:17 UTC, ketmar wrote:

On Tuesday, 5 July 2016 at 20:27:58 UTC, Walter Bright wrote:
4. people wanting high performance are going to be using 64 
bits anyway


so i'm not in a set of "people". ok.


It might be a good time to think about your hardware. Btw there 
is a recent announcement that Ubuntu and others will drop 32-bit 
support quite soon.

http://slashdot.org/story/313313

Here is a copy - the same arguments apply also for performance 
features.


Major Linux distributions are in agreement: it's time to stop 
developing new versions for 32-bit processors. Simply: it's a 
waste of time, both to create the 32-bit port, and to keep 
32-bit hardware around to test it on. At the end of June, 
Ubuntu developer Dimitri Ledkov chipped into the debate with 
this mailing list post, saying bluntly that 32-bit ports are a 
waste of resources. "Building i386 images is not 'for free', it 
comes at the cost of utilising our build farm, QA and 
validation time. Whilst we have scalable build-farms, i386 
still requires all packages, autopackage tests, and ISOs to be 
revalidated across our infrastructure." His proposal is that 
Ubuntu version 18.10 would be 64-bit-only, and if users 
desperately need to run 32-bit legacy applications, they'll have 
to do so in containers or virtual machines. [...] In a forum 
thread, the OpenSUSE Chairman account says 32-bit support 
"doubles our testing burden (actually, more so, do you know how 
hard it is to find 32-bit hardware these days?). It also 
doubles our build load on OBS".


Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Tuesday, 5 July 2016 at 20:27:58 UTC, Walter Bright wrote:

1. so D can run on earlier 32 bit processors without SIMD


this is something the programmer should check at runtime if they are 
using SIMD.


2. SIMD support requires the stack be aligned everywhere to 128 
bits. This can be a bit burdensome for 32 bit targets.


but...

3. (1) and (2) are not issues on OSX 32, because their memory 
model requires it


so the code is already there, but only for osx. fun.

anyway, fixing the long-standing bug with `align()` being ignored on 
stack variables will allow SIMD to be used on x86.
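
The complaint is easy to reproduce; a minimal check (whether the assert 
holds depends on the compiler, target and version, which is the whole 
point of the bug report):

void main () {
    align(16) ubyte[16] buf;
    // if align() is honoured for stack variables, buf sits on a
    // 16-byte boundary and this assert passes
    assert((cast(size_t)buf.ptr & 15) == 0);
}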


4. people wanting high performance are going to be using 64 
bits anyway


so i'm not in a set of "people". ok.


Re: D is crap

2016-07-05 Thread Walter Bright via Digitalmars-d

On 7/5/2016 9:11 AM, ketmar wrote:

'cause even documentation says so: "The vector extensions are currently
implemented for the OS X 32 bit target, and all 64 bit targets.". and this time
documentation is correct.


This is because:

1. so D can run on earlier 32 bit processors without SIMD

2. SIMD support requires the stack be aligned everywhere to 128 bits. This can 
be a bit burdensome for 32 bit targets.


3. (1) and (2) are not issues on OSX 32, because their memory model requires it

4. people wanting high performance are going to be using 64 bits anyway


Re: D is crap

2016-07-05 Thread Icecream Bob via Digitalmars-d

On Sunday, 3 July 2016 at 07:16:17 UTC, Bauss wrote:

On Sunday, 3 July 2016 at 04:37:02 UTC, D is crap wrote:

[...]


Say what? I have used it for multiple big projects of my own 
ranging from 4-10 lines of code?


[...]



[...]


That's adorable. You think that's a big project :D


Re: D is crap

2016-07-05 Thread Ola Fosheim Grøstad via Digitalmars-d

On Tuesday, 5 July 2016 at 14:38:07 UTC, ZombineDev wrote:
The fact core.simd exists (regardless how well it works) 
contradicts your statement.


Of course not. "core.simd" has been an excuse for not doing 
better.



The floats problem you talk about does not affect SIMD, so to


Of course it does. You need to take the same approach to floats for 
both scalars and vectors.


I think you are mixing up SIMD with machine language.





Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Tuesday, 5 July 2016 at 14:52:33 UTC, ZombineDev wrote:

On Tuesday, 5 July 2016 at 12:40:57 UTC, ketmar wrote:

On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:

https://dlang.org/spec/simd.html and the list of intrinsics


core.simd is completely unusable on any 32-bit targets except 
hipsteros.


Why?


'cause even documentation says so: "The vector extensions are 
currently implemented for the OS X 32 bit target, and all 64 bit 
targets.". and this time documentation is correct.


Re: D is crap

2016-07-05 Thread ZombineDev via Digitalmars-d

On Tuesday, 5 July 2016 at 12:40:57 UTC, ketmar wrote:

On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:

https://dlang.org/spec/simd.html and the list of intrinsics


core.simd is completely unusable on any 32-bit targets except 
hipsteros.


Why? I only found this issue:
https://issues.dlang.org/show_bug.cgi?id=16092


Re: D is crap

2016-07-05 Thread ZombineDev via Digitalmars-d
On Tuesday, 5 July 2016 at 12:59:27 UTC, Ola Fosheim Grøstad 
wrote:

On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
Have you put any enhancement request on 
https://issues.dlang.org or written a DIP? If not, I can 
guarantee with almost 100% certainty that it will not get worked on, 
because no one knows what you need.


SIMD support has been discussed and shot down before.


The fact core.simd exists (regardless how well it works) 
contradicts your statement. I still can't see what *you* find 
missing in the current implementation. Do you have any particular 
SIMD enhancement request that was declined?


The floats problem you talk about does not affect SIMD, so to me 
it seems that you're just looking for excuses for not working on a 
solid proposal.






Re: D is crap

2016-07-05 Thread Ola Fosheim Grøstad via Digitalmars-d

On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:
Have you put any enhancement request on 
https://issues.dlang.org or written a DIP? If not, I can 
guarantee with almost 100% certainty that it will not get worked on, 
because no one knows what you need.


SIMD support has been discussed and shot down before. I don't 
need anything and see no point in a SIMD DIP before getting 
floats fixed.


But it would make the language more attractive.



Re: D is crap

2016-07-05 Thread ketmar via Digitalmars-d

On Tuesday, 5 July 2016 at 11:27:33 UTC, ZombineDev wrote:

https://dlang.org/spec/simd.html and the list of intrinsics


core.simd is completely unusable on any 32-bit targets except 
hipsteros.


Re: D is crap

2016-07-05 Thread ZombineDev via Digitalmars-d
On Tuesday, 5 July 2016 at 09:51:01 UTC, Ola Fosheim Grøstad 
wrote:

On Tuesday, 5 July 2016 at 09:23:42 UTC, ZombineDev wrote:

https://gist.github.com/9il/a167e56d7923185f6ce253ee14969b7f
https://gist.github.com/9il/58c1b80110de2db5f2eff6999346a928

available today with LDC ;)


I meant good manual SIMD support in the language, not 
vectorization.


Have you put any enhancement request on https://issues.dlang.org 
or written a DIP? If not, I can guarantee with almost 100% certainty 
that it will not get worked on, because no one knows what you need. If you 
really want good SIMD support in D, you should look at 
https://dlang.org/spec/simd.html and the list of intrinsics that 
GDC and LDC provide, and write a list of things that you find 
missing in terms of language support. Otherwise your claims are 
vague and non-actionable.

