Re: Accurately serializing and deserializing a SysTime in binary format

2020-07-21 Thread Ecstatic Coder via Digitalmars-d-learn
On Tuesday, 21 July 2020 at 12:21:16 UTC, Steven Schveighoffer 
wrote:

On 7/21/20 7:44 AM, Ecstatic Coder wrote:

On Tuesday, 21 July 2020 at 11:01:20 UTC, drug wrote:

On 7/20/20 10:04 PM, Ecstatic Coder wrote:
I'm currently implementing a small open source backup tool 
(dub), and therefore I need to accurately store the file 
modification SysTime in binary format, so that I can later 
load this SysTime from the snapshot file to compare it with 
the current file modification SysTime.


Having unfortunately not understood how to do this from the 
SysTime documentation, in despair, I've tried to directly 
serialize the 16 bytes of the SysTime value. This worked 
fine until I call the ".toISOString()" on the deserialized 
SysTime, which inevitably crashes the executable ;)


That is probably a bug. I serialize SysTime as long by means 
msgpack for exchanging between C++ client and D server and it 
works pretty nice.




Ah thanks for telling me :)

The loaded byte array in the union type was indeed the same as 
the saved one, so I immediately thought it was crashing 
because of some hidden pointer for timezone or something which 
was then pointing to garbage at reloading, causing the crash 
of the ".toISOString" call.


Not a bug.

8 of those 16 bytes is a pointer to the timezone, which is 
going to be different on different processes.


What you should do I think is serialize the stdTime [1], and 
set the time zone to whatever you want:


long serialize(SysTime st) { return st.stdTime; }
SysTime deserialize(long st) { return SysTime(st, UTC()); }

The stdTime is always stored as UTC to make math a lot easier. 
The time zone is only used for display.


-Steve

[1] 
https://dlang.org/phobos/std_datetime_systime.html#.SysTime.stdTime


Smart :)

Now I understand my mistake was to try to directly serialize a 
SysTime as provided by the "getTimes" function, instead of 
converting it to a StdTime, which is more versatile...
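In code, the whole round trip fits in a few lines (a sketch; the
`isUnchanged` helper name is illustrative, not from Phobos):

```d
import std.datetime.systime : SysTime;
import std.datetime.timezone : UTC;
import std.file : getTimes;

// Compare a stored stdTime against a file's current modification time.
// stdTime is hnsecs since midnight, January 1st, 1 A.D., always in UTC,
// so the time zone passed when rebuilding only affects display.
bool isUnchanged(string path, long storedStdTime)
{
    SysTime accessTime, modificationTime;
    path.getTimes(accessTime, modificationTime);
    return SysTime(storedStdTime, UTC()) == modificationTime;
}
```

SysTime.opEquals compares the underlying stdTime only, so the UTC()
passed at deserialization never causes a spurious mismatch.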


Re: Accurately serializing and deserializing a SysTime in binary format

2020-07-21 Thread Ecstatic Coder via Digitalmars-d-learn

On Tuesday, 21 July 2020 at 11:01:20 UTC, drug wrote:

On 7/20/20 10:04 PM, Ecstatic Coder wrote:
I'm currently implementing a small open source backup tool 
(dub), and therefore I need to accurately store the file 
modification SysTime in binary format, so that I can later 
load this SysTime from the snapshot file to compare it with 
the current file modification SysTime.


Having unfortunately not understood how to do this from the 
SysTime documentation, in despair, I've tried to directly 
serialize the 16 bytes of the SysTime value. This worked fine 
until I call the ".toISOString()" on the deserialized SysTime, 
which inevitably crashes the executable ;)


That is probably a bug. I serialize SysTime as long by means 
msgpack for exchanging between C++ client and D server and it 
works pretty nice.




Anyway, that's not really what I intended to do, as in
practice a "ulong" already has enough resolution for that
purpose.


So sorry for my ignorance, but I would definitely need some
help on how to:
- convert a file modification SysTime to a serializable 
number, for instance the number of hectonanoseconds since 
1/1/1970 in UTC;
- convert that number back into a SysTime that I can compare 
to the modification SysTime of the same file.


Eric


Ah thanks for telling me :)

The loaded byte array in the union type was indeed the same as 
the saved one, so I immediately thought it was crashing because 
of some hidden pointer for timezone or something which was then 
pointing to garbage at reloading, causing the crash of the 
".toISOString" call.





Re: Accurately serializing and deserializing a SysTime in binary format

2020-07-21 Thread Ecstatic Coder via Digitalmars-d-learn
As my question obviously didn't interest any expert, I took 
advantage of my lunch break to do some more research ;)


Maybe I'm wrong, but to my knowledge, there is no function to get 
the number of hectonanoseconds since January 1, 1970.


Fortunately I can get the number of seconds since the same date, 
and the number of remaining hectonanoseconds, and then use them 
in conjunction to create a new "SysTime".


With that I've got everything needed to fix my problem, and as I 
can store those values as two independent "uint", it's easy to 
compress them in the snapshot file, so no regrets :)
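For reference, the split described above can be sketched like this
(the splitTime/mergeTime names are illustrative; note that a "uint"
count of Unix seconds only works until 2106, which is fine for file
timestamps):

```d
import core.time : hnsecs;
import std.datetime.systime : SysTime;
import std.datetime.timezone : UTC;

// Split a SysTime into whole seconds since the Unix epoch and the
// remaining fraction in hnsecs (1 hnsec = 100 ns), both as uints.
void splitTime(SysTime time, out uint seconds, out uint fraction)
{
    seconds = cast(uint) time.toUnixTime();
    fraction = cast(uint) time.fracSecs.total!"hnsecs";
}

// Rebuild the SysTime from the two stored values, in UTC.
SysTime mergeTime(uint seconds, uint fraction)
{
    auto time = SysTime.fromUnixTime(seconds, UTC());
    time.fracSecs = fraction.hnsecs;
    return time;
}
```

The fraction is always below 10_000_000 hnsecs (one second), so both
values compress well in a snapshot file.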




Accurately serializing and deserializing a SysTime in binary format

2020-07-20 Thread Ecstatic Coder via Digitalmars-d-learn
I'm currently implementing a small open source backup tool (dub), 
and therefore I need to accurately store the file modification 
SysTime in binary format, so that I can later load this SysTime 
from the snapshot file to compare it with the current file 
modification SysTime.


Having unfortunately not understood how to do this from the 
SysTime documentation, in despair, I've tried to directly 
serialize the 16 bytes of the SysTime value. This worked fine 
until I call the ".toISOString()" on the deserialized SysTime, 
which inevitably crashes the executable ;)


Anyway, that's not really what I intended to do, as in practice a
"ulong" already has enough resolution for that purpose.


So sorry for my ignorance, but I would definitely need some help
on how to:
- convert a file modification SysTime to a serializable number, 
for instance the number of hectonanoseconds since 1/1/1970 in UTC;
- convert that number back into a SysTime that I can compare to 
the modification SysTime of the same file.


Eric



Re: Full precision double to string conversion

2018-11-03 Thread Ecstatic Coder via Digitalmars-d-learn
On Saturday, 3 November 2018 at 18:04:07 UTC, Stanislav Blinov 
wrote:
On Saturday, 3 November 2018 at 17:26:19 UTC, Ecstatic Coder 
wrote:



void main() {
double value = -12.000123456;
int precision = 50;

import std.stdio;
writefln("%.*g", precision, value);

import std.format;
string str = format("%.*g", precision, value);
writeln(str);
}

Prints:

-12.00012345600743415512260980904102325439453125
-12.00012345600743415512260980904102325439453125

That's not quite the -12.000123456 that you'd get from C#'s 
ToString().


Unfortunately, but that's still better though, thanks :)


I don't think you understood what I meant. Neither C# nor D 
attempt to exhaust the precision when converting, given default 
arguments. It's merely a matter of those defaults. The snippet 
above obviously provides *more* digits than the default
.ToString() in C# would.


But indeed what I really need is a D function which gives a 
better decimal approximation to the provided double constant, 
exactly in the same way those in Dart and C# do.


Is there really no such function in D ?


When you call .ToString() in C# with no arguments, it assumes 
the "G" format specifier.


https://docs.microsoft.com/en-us/dotnet/standard/base-types/standard-numeric-format-strings?view=netframework-4.7.2#the-general-g-format-specifier

So for a double, it will use 15-digit precision. D's to!string
simply uses a lower default. If you want the exact same behavior
as in C#, you can do this:


string toStringLikeInCSharp(double value) {
import std.format : format;
return format("%.15G", value);
}

void main() {
double value = -12.000123456;
import std.stdio;
writeln(value.toStringLikeInCSharp); // prints: 
-12.000123456

}


This version perfectly gets the job done!

Thanks a lot for your help :)



Re: Full precision double to string conversion

2018-11-03 Thread Ecstatic Coder via Digitalmars-d-learn
Actually, what I need is the D equivalent of the default 
ToString() function we have in Dart and C#.


I don't think it means what you think it means:

void main() {
double value = -12.000123456;
int precision = 50;

import std.stdio;
writefln("%.*g", precision, value);

import std.format;
string str = format("%.*g", precision, value);
writeln(str);
}

Prints:

-12.00012345600743415512260980904102325439453125
-12.00012345600743415512260980904102325439453125

That's not quite the -12.000123456 that you'd get from C#'s 
ToString().


Unfortunately, but that's still better though, thanks :)

All of them? Most implementations of conversion algorithms 
actually stop when it's "good enough". AFAIR, D doesn't even
have its own implementation and forwards to C, unless that
changed in recent years.


What I meant was that getting too many significant digits would 
still be a better solution than not having them.


But indeed what I really need is a D function which gives a 
better decimal approximation to the provided double constant, 
exactly in the same way those in Dart and C# do.


Is there really no such function in D ?





Re: Full precision double to string conversion

2018-11-03 Thread Ecstatic Coder via Digitalmars-d-learn

On Saturday, 3 November 2018 at 12:45:03 UTC, Danny Arends wrote:
On Saturday, 3 November 2018 at 12:27:19 UTC, Ecstatic Coder 
wrote:

import std.conv;
import std.stdio;
void main()
{
double value = -12.000123456;
writeln( value.sizeof );
writeln( value );
writeln( value.to!string() );
writeln( value.to!dstring() );
}

/*
8
-12.0001
-12.0001
-12.0001
*/

In Dart, value.toString() returns "-12.000123456".

In C#, value.ToString() returns "-12.000123456".

In D, value.to!string() returns "-12.0001" :(

How can I convert a double value -12.000123456 to its string
value "-12.000123456", i.e. without losing double-precision
digits ?


Specify how many digits you want with writefln:

writefln("%.8f", value);


Actually, what I need is the D equivalent of the default 
ToString() function we have in Dart and C#.


I mean a dumb double-to-string standard library conversion 
function which returns a string including all the double 
precision digits stored in the 52 significant bits of the value, 
preferably with the trailing zeroes removed.


For an unknown reason, D's default double-to-string conversion
function only exposes the single-precision significant digits :(




Full precision double to string conversion

2018-11-03 Thread Ecstatic Coder via Digitalmars-d-learn

import std.conv;
import std.stdio;
void main()
{
double value = -12.000123456;
writeln( value.sizeof );
writeln( value );
writeln( value.to!string() );
writeln( value.to!dstring() );
}

/*
8
-12.0001
-12.0001
-12.0001
*/

In Dart, value.toString() returns "-12.000123456".

In C#, value.ToString() returns "-12.000123456".

In D, value.to!string() returns "-12.0001" :(

How can I convert a double value -12.000123456 to its string
value "-12.000123456", i.e. without losing double-precision
digits ?





Re: phobo's std.file is completely broke!

2018-09-21 Thread Ecstatic Coder via Digitalmars-d
On Thursday, 20 September 2018 at 19:49:01 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/19/2018 11:45 PM, Vladimir Panteleev wrote:
On Thursday, 20 September 2018 at 03:23:36 UTC, Nick 
Sabalausky (Abscissa) wrote:

(Not on a Win box at the moment.)


I added the output of my test program to the gist:
https://gist.github.com/CyberShadow/049cf06f4ec31b205dde4b0e3c12a986#file-output-txt



assert( dir.toAbsolutePath.length > MAX_LENGTH-12 );


Actually it's crazier than that. The concatenation of the 
current directory plus the relative path must be < MAX_PATH 
(approx.). Meaning, if you are 50 directories deep, a relative 
path starting with 50 `..\` still won't allow you to access 
C:\file.txt.




Ouch. Ok, yea, this is pretty solid evidence that ALL usage of 
non-`\\?\` paths on Windows needs to be killed dead, dead, dead.


If it were decided (not that I'm in favor of it) that we should 
be protecting developers from files named " a ", "a." and 
"COM1", then that really needs to be done on our end on top of 
mandatory `\\?\`-based access. Anyone masochistic enough to 
really WANT to deal with MAX_PATH and such is free to access 
the Win32 APIs directly.


+1

On Windows, every logical path provided to the std.file functions
should be properly converted to a physical path starting with 
that prefix.


Obviously this won't solve ALL Windows-specific problems, but 
that will AT LEAST remove a whole class of them.




Re: phobo's std.file is completely broke!

2018-09-19 Thread Ecstatic Coder via Digitalmars-d
On Thursday, 20 September 2018 at 03:15:20 UTC, Vladimir 
Panteleev wrote:
On Wednesday, 19 September 2018 at 06:11:22 UTC, Vladimir 
Panteleev wrote:
One point of view is that the expected behavior is that the 
functions succeed. Another point of view is that Phobos should 
not allow programs to create files and directories with 
invalid paths. Consider, e.g. that a user writes a program 
that creates a large tree of deeply nested filesystem objects. 
When they are done and wish to delete them, their file manager 
fails and displays an error. The user's conclusion? D sucks 
because it corrupts the filesystem and creates objects they 
can't operate with.


You don't even need to use crazy third-party software.

Try this program:

mkdir(`\\?\C:\ a \`);
write(`\\?\C:\ a \a.txt`, "Hello");

Then, try doing the following:

- Double-click the created text file.

- Try deleting the directory from Explorer (by sending it to 
the recycle bin).


- Try permanently deleting it (Shift+Delete).

- Try renaming it.

All of these fail for me. Deleting the directory doesn't even 
show an error - nothing at all happens.


When the OS itself fails to properly deal with such files, I 
don't think D has any business in *facilitating* their creation 
by default.


*Windows Explorer* prevents you from creating a folder or file 
whose name STARTS with spaces. It trims them automatically, 
whether you want it or not.


So it's NOT a surprise that *Windows Explorer* (!) has problems 
if you use it on such files which were created manually.


But obviously, the *Windows* OS doesn't prevent you from creating
them through scripts and applications...







Re: phobo's std.file is completely broke!

2018-09-19 Thread Ecstatic Coder via Digitalmars-d
On Thursday, 20 September 2018 at 02:48:06 UTC, Nick Sabalausky 
(Abscissa) wrote:

On 09/19/2018 02:33 AM, Jonathan Marler wrote:


What drives me mad is when you have library writers who try to 
"protect" you from the underlying system by translating 
everything you do into what they "think" you're trying to do.


What drives me mad is when allegedly cross-platform tools 
deliberately propagate non-cross-platform quirks that could 
easily be abstracted away and pretend that's somehow "helping" 
me instead of making a complete wreck of the whole point of 
cross-platform. Bonus points if they're doing it mainly to help 
with my C++-standard premature optimizations.


If I actually want to deal with platform-specific quirks, then 
I'll use the platform's API directly. (And then I'll beat 
myself with a brick, just for fun.)


+1

A cross-platform library has to be designed to operate in the 
same way on each supported platform, even if this means that it's 
harder to implement on some platform, or that some platforms will 
need more complicated implementations.


That's the whole point of this "HAL" approach.


Re: phobo's std.file is completely broke!

2018-09-19 Thread Ecstatic Coder via Digitalmars-d
They are certainly going to be less expensive than actual
filesystem operations that hit the physical disk, but it will
still be an unwanted overhead in 99.9% of cases.


In any case, the overhead is only one issue.


Seriously, checking whether the file path string *length* is above
260 characters, to see if it needs to be fixed, is not what I call
overhead.


And IF the path is indeed too long, IN THOSE CASES personally I'd 
prefer that the D standard library fixes the path in order to 
make the disk/file operation succeed, rather than having my application
crash, because I didn't know I had to put a "version ( Windows )" 
fix somewhere in my code.


But hey, I may be wrong, software robustness and stability is 
often much overrated... ;)




Re: phobo's std.file is completely broke!

2018-09-19 Thread Ecstatic Coder via Digitalmars-d
On Wednesday, 19 September 2018 at 05:32:47 UTC, Vladimir 
Panteleev wrote:
On Wednesday, 19 September 2018 at 05:24:24 UTC, Ecstatic Coder 
wrote:
None would ever be, considering you obviously have decided to 
ignore such a simple solution to the 260 character limit...


Add "ad hominem" to your pile of fallacies, I guess.


Now I will, thanks :)

Once again, this forum proves to be very effective at removing 
any motivation from D users to get involved and contribute to the 
D language.


That's probably one of the keys of its success...


Re: phobo's std.file is completely broke!

2018-09-18 Thread Ecstatic Coder via Digitalmars-d
Do the PS2, GameCube and Xbox filesystems all have identical 
file path limits?


Guess ;)

And, did any of the paths in your game exceed 260 characters in 
length?


No. But the suggested GetPhysicalPath() solution would also work 
equally well in this case.



These comparisons are not helpful.


None would ever be, considering you obviously have decided to 
ignore such a simple solution to the 260 character limit...




Re: phobo's std.file is completely broke!

2018-09-18 Thread Ecstatic Coder via Digitalmars-d
There will always be inherent differences between platforms, 
because they are wildly different.


Right.

Technically the PS2 console, the GameCube and the Xbox console 
were very different from each other, so I had no choice but to 
implement low-level abstraction function (GetPhysicalPath() etc) 
to make the file system classes work similarly across all four 
systems.


That wasn't an easy task, but it made life so much easier for
the game programmers that it was obvious this was "the right 
thing" to do.


The fact that D's standard library has already bitten me several
times with its platform-specific problems clearly shows that you
have chosen another path.


That's your right, but don't expect those who develop 
cross-platform tools in D to be happy to HAVE to put ugly 
"version ( ... )" stuff in their code when their software 
suddenly break on some platforms for unknown (= undocumented) 
reasons...





Re: phobo's std.file is completely broke!

2018-09-18 Thread Ecstatic Coder via Digitalmars-d

On Monday, 17 September 2018 at 22:58:46 UTC, tide wrote:
On Sunday, 16 September 2018 at 22:40:45 UTC, Vladimir 
Panteleev wrote:

On Sunday, 16 September 2018 at 16:17:21 UTC, tide wrote:
Nothing is "locked behind management". If you feel that some 
issue important to you is stalled, you can create a forum 
thread, or email Walter/Andrei to ask for a resolution.


Funny the other guy was saying to create a bugzilla issue.


Do that *first*.


That's already been done.

The path needs to be normalized, which means that \.\ and 
\..\ fragments need to be removed away first. Depending on 
your interpretation of symlinks/junctions/etc., 
"foo/bar/../" might mean something else than "foo/" if "bar" 
is a reparse point.


All these issues yet for some reason this function was 
included in the lot: 
https://dlang.org/phobos/std_path.html#absolutePath

[...]
This issue exists anyways, you'd only expand the path when it 
need to be used. If the file changes within milliseconds, I 
don't see that happening often and if it does there's a flaw 
in your design that'd happen even if the path didn't have to 
be constructed first.


You've missed the point. Complexity breeds bugs and unexpected 
behavior. The expectation is that D's function to delete a 
file should do little else than call the OS function.


If *YOU* are OK with the consequences of complexity, implement 
this in YOUR code, but do not enforce it upon others.


version(Windows)
{
if(path.length >= MAX_PATH)
{
// throw Exception(...) // effectively what happens now

// do workaround for
}
}

The complexity would only exist for those that need it. It'd be 
the difference between their code not working and code working. 
I'm sure people would rather their code work than not work in 
this case.


So you pass a valid path (selected by a user through a UI) to 
rmDir, and it doesn't remove the directory. You think this is 
acceptable behavior?


It is absolutely not acceptable behavior. Complain to 
Microsoft. The OS should not allow users to create or select 
paths that programs cannot operate on without jumping through 
crazy hoops.


Not that crazy, you can get the actual absolutePath with one of 
the OS functions. It isn't that difficult of a workaround.


"Workaround" ;)

That's the problem actually.

As suggested previously, the std.file functions should call a 
GetPhysicalPath function which just returns the path unchanged on 
Linux and MacOS, and on Windows simply checks whether the file
path exceeds the 260-character MAX_PATH limit, and if needed
makes it absolute and prefixes it.


This has no performance impact, and brings a consistent behavior 
across platforms.


THAT would be a nice solution for the cross-platform developers 
who erroneously think that the standard library is already Doing 
The Right Thing (TM) so that their code doesn't need 
platform-specific "workarounds"...
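The suggested helper could look roughly like this (a sketch of the
GetPhysicalPath idea from the post above, not an existing Phobos
function; the exact prefixing rules for \\?\ paths have more corner
cases, e.g. UNC paths):

```d
import std.algorithm.searching : startsWith;
import std.path : absolutePath, buildNormalizedPath;

// A no-op on POSIX; on Windows, a cheap length check that switches
// to a \\?\ extended-length path (which must be absolute and
// normalized) only when the path would exceed MAX_PATH.
string getPhysicalPath(string path)
{
    version (Windows)
    {
        enum maxPathLength = 260; // Win32 MAX_PATH
        if (path.length >= maxPathLength && !path.startsWith(`\\?\`))
            return `\\?\` ~ path.absolutePath.buildNormalizedPath;
    }
    return path;
}
```

Short paths pass through untouched, so the common case costs a
single length comparison.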


Re: phobo's std.file is completely broke!

2018-09-18 Thread Ecstatic Coder via Digitalmars-d
On Saturday, 15 September 2018 at 23:06:57 UTC, Jonathan M Davis 
wrote:
On Saturday, September 15, 2018 6:54:50 AM MDT Josphe Brigmo 
via Digitalmars-d wrote:

On Saturday, 15 September 2018 at 12:38:41 UTC, Adam D. Ruppe

wrote:
> On Saturday, 15 September 2018 at 10:57:56 UTC, Josphe Brigmo
>
> wrote:
>> Phobos *NEEDS* to be modified to work with these newer OS's.
>
> You need to look at the source code before posting. The code 
> for remove is literally

>
> DeleteFileW(name);
>
> it is a one-line wrapper, and obviously uses the unicode 
> version.

>
> https://github.com/dlang/phobos/blob/master/std/file.d#L1047

It doesn't matter, the fact is that something in phobos is 
broke. Do you really expect me to do all the work? The fact 
that using executeShell or "\\?\" solves 90% of the 
problems (maybe all of them) proves that phobos is not up to
par.


Using std.file should be on par with using the Windows API from 
C or C++. It doesn't try to fix the arguably broken behavior of 
the Windows API with regards to long paths but requires that 
the programmer deal with them just like they would in C/C++. 
The main differences are that the std.file functions in 
question use D strings rather than C strings, and they 
translate them to the UTF-16 C strings for you rather than 
requiring you to do it. But they don't do anything like add 
"\\?\" for you any more than the Windows API itself does that.


If you consider that to be broken, then sorry. For better or 
worse, it was decided that it was better to let the programmer 
deal with those intricacies rather than trying to tweak the 
input to make it work based on the idea that that could have 
undesirable consequences in some circumstances. On some level, 
that does suck, but the Windows API does not make it easy to 
make this work like it would on a *nix system without 
introducing subtle bugs.


If you find that the std.file functions don't work whereas 
using the same input to the Windows API functions in C/C++ 
would have, then there's a bug in the D code, and it needs to 
be fixed, but if it acts the same as the C/C++ code, then it's 
working as intended.


- Jonathan M Davis


This attitude is unfortunately the cause of a lot of frustration
among cross-platform developers like me.


I chose D for my file scripting needs because it's a 
cross-platform language.


I expect that calling the function F on system X will work the 
same as calling that same function on system Y.


That's the contract in cross-platform programming.

Unfortunately D fails at being consistent.

I recently learned this lesson with my Resync tool.

Not everybody wants a cross-platform library to behave inconsistently.

For example, in the past I've implemented a proprietary 
cross-platform C++ game engine for Windows, PS2, Xbox and 
GameCube.


The games needed some tuning for the graphics, etc.

But code-wise, the engine made the games behave consistently 
across the different platforms.


This was all about making each method of each class behaving the 
same. As simple as that.


Indeed, on some platforms, the game engine also provided extra 
classes and/or methods to add some functionalities specific to 
these platforms.


But the common trunk was implemented (!) to behave the same. That
was what our game developers expected...




Re: D IDE

2018-09-05 Thread Ecstatic Coder via Digitalmars-d
Except that you don't have projects or solutions with something 
like vim or emacs. There is no structure specific to them. You 
can set them up to do the build from inside them, and with 
emacs, you can run gdb inside it if you're on an appropriate 
platform, but you're not going to have a "vim" project or an 
"emacs" project. That whole concept is an IDE thing. They edit 
files, and they can do that perfectly fine regardless of what's 
being used to run the build or whatever other tools are 
necessary for the development process.


If I'm in a situation like you describe, then I usually set it 
up so that I can just run the build and tests from the command 
line and not even bother opening up Visual Studio. VS projects 
actually have a way to do that. You don't actually have to open 
up VS to do any building. And if I really need to open up VS to 
run the debugger, then I'll do that, but I won't use VS for 
anything that I don't have to use it for. And in my experience, 
the debugger is pretty much the only thing that would typically 
require actually opening up VS.


There is no reason to muck with the build process or source 
control stuff in order to use vim or emacs. That stuff can 
pretty much always be done from the command-line using all of 
the standard tools that everyone else is using. Just because 
most developers would use the IDE to run the build doesn't mean 
that it's actually required for it. If it were, then stuff like 
automated builds wouldn't be possible.


Regardless, I use vim for editing code. And if I'm actually 
forced to have an IDE like VS or Eclipse open because of some 
tool that has to be run from inside for some reason (which 
aside from the debugger is rarely the case), then I'll have the 
IDE open for whatever it has to be open for. But I don't use 
the IDE for editing code, because that would be a horribly 
inefficient way to go about it.


- Jonathan M Davis


+1

What must be absolutely standardized is what is *shared* across 
the members of the team (code presentation, tabs, naming 
conventions, build process, versioning, test and deployment 
procedures, etc etc).


But as long as the coding standard is followed, obviously any 
code editor should be fine if it makes you more productive.


For instance, even for contract work, I use Geany for all my 
developments.


And a portable IDE like Geany is especially useful when
developing *cross-platform* C++ multimedia applications which
must be edited and tested on Windows, MacOS and Linux.


It is the perfect companion to cmake, behaving exactly the same 
whatever the platform (editing, find and replace, find in files, 
macros, settings, etc).


And indeed you can still open your project in Visual Studio when 
you need to use a Windows debugger.


Personally I use Geany even for Unity game development, as Unity
allows you to define which editor should be used to show the
erroneous line of C# code when double-clicking on an error
message.


Geany is great for that too, as it opens often much faster than 
other IDE...


So my point is, as long as all the shared team standard
procedures are respected, I don't see why any company should
decide which code editor *must* be used by all its developers...





Re: D is dead

2018-09-05 Thread Ecstatic Coder via Digitalmars-d

Hang on a second.

assert(preserve == Yes.preserveAttributes);

Something is smelling an awful lot here.

Up to Windows 7, CopyFileW, which is used on Windows, didn't copy
the attributes over [0], but it does now.


This is a bug on our end, which should include a fallback to
manually copying the file contents over.


[0] 
https://docs.microsoft.com/en-us/windows/desktop/api/winbase/nf-winbase-copyfilew


Yeah, keeping exactly the same behavior on every supported 
platform is never easy.


And when you need to support Windows too, by experience I know it 
can quickly become a pain in the *ss...


Re: D is dead

2018-09-04 Thread Ecstatic Coder via Digitalmars-d
On Tuesday, 4 September 2018 at 09:56:13 UTC, rikki cattermole 
wrote:

On 04/09/2018 9:40 PM, Ecstatic Coder wrote:
But it seems that the latest version of "std.file.copy" now 
completely ignores the "PreserveAttributes.no" argument on 
Windows, which made recent Windows builds of Resync fail on 
read-only files.


What???

There is nothing in the changelog between 2.080.0 and 2.082.0 
for changes to std.file.


Version from July 2017[0]. Version from 2.082.0[1]. They look 
the same to me.


[0] 
https://github.com/dlang/phobos/blob/d8959320e0c47a1861e32bbbf6a3ba30a019798e/std/file.d#L3430
[1] 
https://github.com/dlang/phobos/blob/v2.082.0/std/file.d#L4216


Maybe I'm wrong, but what I can say is that I've recently updated
DMD and compiled a Windows build of Resync, and that I *HAD* to
add Windows-specific code that removes the "read-only"
attribute only on Windows.


attributes = source_file_path.getAttributes();
source_file_path.getTimes( access_time, modification_time );

version ( Windows )
{
    if ( target_file_path.exists() )
    {
        // Clear FILE_ATTRIBUTE_READONLY (bit 0) so the copy can overwrite.
        target_file_path.setAttributes( attributes & ~1 );
    }

    source_file_path.copy( target_file_path, PreserveAttributes.no );

    // Keep the file writable while restoring its timestamps,
    // then put the original attributes back.
    target_file_path.setAttributes( attributes & ~1 );
    target_file_path.setTimes( access_time, modification_time );
    target_file_path.setAttributes( attributes );
}
else
{
    if ( target_file_path.exists() )
    {
        // 511 = octal 777: make the target writable before overwriting it.
        target_file_path.setAttributes( 511 );
    }

    source_file_path.copy( target_file_path, PreserveAttributes.no );

    target_file_path.setAttributes( attributes );
    target_file_path.setTimes( access_time, modification_time );
}

Honestly I don't see why I have to make this ugly fix on Windows, 
while the Linux version has always worked fine on read-only files.




Re: D is dead

2018-09-04 Thread Ecstatic Coder via Digitalmars-d

On Thursday, 23 August 2018 at 06:34:01 UTC, nkm1 wrote:
On Thursday, 23 August 2018 at 05:37:12 UTC, Shachar Shemesh 
wrote:

Let's start with this one:
https://issues.dlang.org/show_bug.cgi?id=14246#c6

The problems I'm talking about are not easily fixable. They 
stem from features not playing well together.


One that hurt me lately was a way to pass a scoped lazy 
argument (i.e. - to specify that the implicit delegate need 
not allocate its frame, because it is not used outside the 
function call).


The only real problem with D is that it's a language designed
with GC in mind, yet there are numerous attempts to use it
without GC. Also, supporting GC-less programming gets in the way
of improving D's GC (which is pretty damn bad by modern
standards). That's the only real technical problem.

For example, the "bug" above just means that D doesn't support
RAII (in the C++ sense). That's hardly a *fatal flaw*. Lots of
languages don't support RAII. Python, Java, C# - tons of code
were written in those. And yes, most of those just use GC to
dispose of memory - other resources are rarely used (compared to
memory) and it's not a problem to manage them manually.

You also mentioned lazy parameters allocating... GC thing again.
Just allocate then? No?

IMO, if getting the maximum number of users is the main goal, D
is indeed going the wrong way. It would be better to get rid of
@nogc, betterC, dip1000, implement write barriers and use them
to improve GC. Martin Nowak (I think) mentioned that write
barriers will decrease performance of D programs by 1-5%. Seems
like a small price to pay for better GC with shorter pauses. It
would also probably be simpler technically than stuff like
dip1000 and rewriting Phobos.

Of course, maximizing the number of users is not the only goal,
or even the main one. My understanding is that Walter wants a
"systems language" with "zero cost abstractions". Well, it's
very well possible that D's design precludes that.

Other than memory management, I don't see any real fundamental
problems.


+1

Making D a "true" C++ competitor is not going to happen soon.

Even Rust, which IS by definition a true C++ competitor (no GC, 
etc), will still find it very hard to replace C++ in its current 
niche markets, like embedded and game development.


While putting all the "funded" efforts in making D a *direct* 
competitor to GC languages (like Go, Crystal, C#, Java, etc) 
would be an achievable goal, IMHO...







Re: D is dead (was: Dicebot on leaving D: It is anarchy driven development in all its glory.)

2018-09-04 Thread Ecstatic Coder via Digitalmars-d
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:

On 22/08/18 21:34, Ali wrote:

On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
Pretty positive overall, and the negatives he mentions are 
fairly obvious to anyone paying attention.


Yea, I agree, the negatives are not really negative

Walter, no matter how smart he is, is one man who can only 
work on so many things at the same time

It's a chicken and egg situation, D needs more core 
contributors, and to get more contributors it needs more 
users, and to get more users it needs more core contributors




No, no and no.

I was holding out on replying to this thread to see how the 
community would react. The vibe I'm getting, however, is that 
the people who are seeing D's problems have given up on 
effecting change.


It is no secret that when I joined Weka, I was a sole D 
detractor among a company quite enamored with the language. I 
used to have quite heated water cooler debates about that point 
of view.


Every single one of the people rushing to defend D at the time 
has since come around. There is still some debate on whether, 
points vs. counter points, choosing D was a good idea, but the 
overwhelming consensus inside Weka today is that D has *fatal* 
flaws and no path to fixing them.


And by "fatal", I mean literally flaws that are likely to 
literally kill the language.


And the thing that brought them around is not my power of 
persuasion. The thing that brought them around was spending a 
couple of years working with the language on an every-day basis.


And you will notice this in the way Weka employees talk on this 
forum: except me, they all disappeared. You used to see Idan, 
Tomer and Eyal post here. Where are they?


This forum is hostile to criticism, and generally tries to keep 
everyone using D the same way. If you're cutting edge D, the 
forum is almost no help at all. Consensus among former posters 
here is that it is generally a waste of time, so almost 
everyone left, and those who didn't, stopped posting.


And it's not just Weka. I've had a chance to talk in private to 
some other developers. Quite a lot have serious, fundamental 
issues with the language. You will notice none of them speaks 
up on this thread.


They don't see the point.

No technical project is born great. If you want a technical 
project to be great, the people working on it have to focus on 
its *flaws*. The D community just doesn't do that.


To sum it up: fatal flaws + no path to fixing + no push from 
the community = inevitable eventual death.


With great regrets,
Shachar


Same feeling here btw.

I regularly have to face strange bugs while updating the compiler 
or its libraries.


For instance, my Resync tool used to work fine both on Windows 
and Linux.


But it seems that the latest version of "std.file.copy" now 
completely ignores the "PreserveAttributes.no" argument on 
Windows, which made recent Windows builds of Resync fail on 
read-only files.


Very typical...

While D remains my favorite file scripting language, I must admit 
that this is quite disappointing for such an old language, 
compared to similar languages like Crystal.




Re: Symmetry Autumn of Code

2018-08-05 Thread Ecstatic Coder via Digitalmars-d-announce

On Sunday, 5 August 2018 at 05:22:44 UTC, Mike Franklin wrote:

On Sunday, 5 August 2018 at 04:47:42 UTC, tanner00 wrote:

Hi, I’m interested in working on this project and just wanted 
to touch base. Is there any word on who will be mentoring this 
project? I’m entering college this fall but I’ve been 
programming since a very young age and enjoy systems 
programming.


The project is mostly about creating a high-performance, 
resource-efficient 2D software rasterizer, something like this 
(http://nothings.org/gamedev/rasterize/) or 
(https://medium.com/@raphlinus/inside-the-fastest-font-renderer-in-the-world-75ae5270c445). 
If that isn't enough work for the event you can build on it by 
creating path objects with clipping and offsetting 
(http://angusj.com/delphi/clipper.php), rasterizing TrueType 
or OpenType fonts, creating drawing primitives, and even 
potentially creating 2D widgets (buttons, text labels, etc.). 
I think it's up to you how much of it you want to take on.


I proposed the idea, but I don't think I'd be a very good 
mentor for the project because I've never created a 2D 
rasterizer myself.  However, I'd be happy to help anyone 
working on the project in an unofficial capacity, and can 
probably articulate the intended use case for it.


Mike


I was 14 and a half when I implemented my first depth buffer 
based rasterizer, in 6502 assembly on a C64, for a hi-res mode 3D 
renderer.


The algorithm, despite being "naive", is actually still an 
efficient one even now.


I stored an array of x/depth ranges (one per raster line), and 
updated them while drawing the wireframe points of the 3 clipped 
edges, while updating the y range of the triangle.


Then I simply iterated on the triangle y range and drew the inner 
points (between minimum_x+1 and maximum_x-1), using the filling 
color and interpolating depth.


Clearly not realtime like my character-based wireframe 
renderer, I admit it.


But this was more than fast enough to quickly render a hi-res 
3D scene in memory in *filled* mode.


So this "dumb" algorithm may still be worth investigating in 
your case, as this C64 implementation was meant to run on 
rather similar hardware (very limited memory and CPU, only 
fixed point operations, etc).


Just add antialiasing on the wireframe edges and you're done...



Re: Symmetry Autumn of Code

2018-08-05 Thread Ecstatic Coder via Digitalmars-d-announce

On Sunday, 5 August 2018 at 05:16:50 UTC, Mike Parker wrote:

On Sunday, 5 August 2018 at 04:47:42 UTC, tanner00 wrote:



[...]
Hi, I’m interested in working on this project and just wanted 
to touch base. Is there any word on who will be mentoring this 
project? I’m entering college this fall but I’ve been 
programming since a very young age and enjoy systems 
programming.


No one has volunteered to mentor this project yet, but if you'd 
like to write a proposal for it we can find a mentor if you are 
selected.


Btw I *had* volunteered ("And I'd be glad to mentor you on 
this :)", here on July 24th).


Thanks for reminding me why I now enjoy the Crystal community 
more...




Re: C's Biggest Mistake on Hacker News

2018-07-26 Thread Ecstatic Coder via Digitalmars-d
It's the same story as always, just from complaining, things 
won't get magically better... (though it would be great if the 
world worked that way because then maybe my relationships would 
be more successful :O)


You can choose whatever priorities you prefer for your 
scholarship and funded projects.


Sorry to have showed my disagreement with some of your choices 
and strategies.


That was silly indeed, a waste of time for both of us.



Re: C's Biggest Mistake on Hacker News

2018-07-26 Thread Ecstatic Coder via Digitalmars-d

On Thursday, 26 July 2018 at 06:04:33 UTC, Paulo Pinto wrote:

On Wednesday, 25 July 2018 at 21:16:40 UTC, Walter Bright wrote:

On 7/24/2018 4:53 AM, Ecstatic Coder wrote:

     str = str1 + " " + str2;


But you have to be careful how it is written:

str = "hello" + "world";
str = "hello" + "world" + str1;

don't work, etc.


Well, like everything in C++, there is always a way.

 str = "hello"s + "world";
 str = "hello"s + "world" + str1;

Spot the difference. :)


It's just syntactic sugar for a constructed string. You can't 
even use C++14 string constants to initialize a string view, 
or you get a dangling pointer, as it's NOT a true constant. 
Ridiculous...


Re: C's Biggest Mistake on Hacker News

2018-07-25 Thread Ecstatic Coder via Digitalmars-d

On Wednesday, 25 July 2018 at 21:16:40 UTC, Walter Bright wrote:

On 7/24/2018 4:53 AM, Ecstatic Coder wrote:

     str = str1 + " " + str2;


But you have to be careful how it is written:

str = "hello" + "world";
str = "hello" + "world" + str1;

don't work, etc.


Yeah. That's exactly where D shines, and C++ s*cks...

C++ string constants are stupid pointers, no size etc. Indeed 
one big C++ silly thing that Walter fixed perfectly. He is the 
only language designer who found and applied the perfect 
solution for strings, arrays and slices. Big respect to him...


Re: C's Biggest Mistake on Hacker News

2018-07-25 Thread Ecstatic Coder via Digitalmars-d

On Wednesday, 25 July 2018 at 20:24:39 UTC, bpr wrote:
On Wednesday, 25 July 2018 at 17:23:40 UTC, Ecstatic Coder 
wrote:

But don't be too optimistic about BetterC...


I'm too old to get optimistic about these things. In the very 
best case, D has quite an uphill battle for market share. Any 
non mainstream language does. If I were a betting man, I'd bet 
on Rust.


Honestly, considering D leadership's current priorities, I 
don't see how it could soon become a true C++ or Go 
competitor, even with the half-baked BetterC initiative...


There are a few ways I can see, and doubtless others can see 
different ones. Here's one: use Mir and BetterC to write a 
TensorFlow competitor for use in developing and deploying ML 
models. I'm sure you can shoot holes in that idea, but you get 
the point. Try lots of things and see what works, and keep 
doing more of those things. Worked for Python.


For instance, I've suggested they consider using reference 
counting as an alternative default memory management scheme, 
and add it to the lists of scholarship and crowdsourced 
projects, and of course they have added all the other 
suggestions, but not this one. What a surprise ;)


I'm pretty sure D leadership is pursuing such things. In fact,

https://wiki.dlang.org/Vision/2018H1

rather prominently mentions it.

Even though this is probably one of the most used allocation 
management schemes in typical C++ development, as this 
drastically reduces the risks of memory leaks and dangling 
pointers...


Anyway, meanwhile D remains a fantastic strongly-typed 
scripting language for file processing and data analysis, and 
its recent adoption at Netflix has once again clearly proved 
it...


For this and similar uses, tracing GC is fine, better in fact 
than the alternatives. I'm only making noise about betterC for 
the cases where C++ dominates and tracing GC is a showstopper.


In an alternative timeline, DasBetterC would have been released 
before D with GC, and the main libraries would have been nogc, 
and maybe there'd be a split between raw pointers and traced 
refs (like Nim and Modula-3) and then maybe there'd have been 
no strong desire for Rust since D could have filled that niche.


+1


Re: C's Biggest Mistake on Hacker News

2018-07-25 Thread Ecstatic Coder via Digitalmars-d

On Wednesday, 25 July 2018 at 16:39:51 UTC, bpr wrote:

On Tuesday, 24 July 2018 at 17:24:41 UTC, Seb wrote:

On Tuesday, 24 July 2018 at 17:14:53 UTC, Chris M. wrote:

On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:
On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder 
wrote:

[...]


No. For many C++ users, tracing GC is absolutely not an 
option. And, if it were, D's GC is not a shining example of 
a good GC. It's not even precise, and I would bet that it 
never will be. If I'm able to tolerate a GC, there are 
languages with much better GCs than the D one, like Go and 
Java.


[...]


There was a precise GC in the works at one point, no clue 
what happened to it.


The newest PR is:

https://github.com/dlang/druntime/pull/1977

Though there's already a bit of precise scanning on Windows, 
e.g. https://github.com/dlang/druntime/pull/1798 and IIRC 
Visual D uses a precise GC too.


Well, this is a big problem with D IMO. There are a lot of 
unfinished, half baked features which linger in development for 
years. How long for precise GC now, over 5 years? I don't think 
D was really designed to be friendly to GC, and it just isn't 
realistic to expect that there will *ever* be a production 
quality precise GC for all of D. Maybe giving up on some things 
and finishing/fixing others would be a better strategy? I think 
so, which is why I think DasBetterC is the most appealing thing 
I've seen in D lately.


+1

But don't be too optimistic about BetterC...

Honestly, considering D leadership's current priorities, I 
don't see how it could soon become a true C++ or Go 
competitor, even with the half-baked BetterC initiative...


For instance, I've suggested they consider using reference 
counting as an alternative default memory management scheme, 
and add it to the lists of scholarship and crowdsourced 
projects, and of course they have added all the other 
suggestions, but not this one. What a surprise ;)


Even though this is probably one of the most used allocation 
management schemes in typical C++ development, as this 
drastically reduces the risks of memory leaks and dangling 
pointers...


Anyway, meanwhile D remains a fantastic strongly-typed scripting 
language for file processing and data analysis, and its recent 
adoption at Netflix has once again clearly proved it...


Re: C's Biggest Mistake on Hacker News

2018-07-25 Thread Ecstatic Coder via Digitalmars-d

On Wednesday, 25 July 2018 at 08:23:40 UTC, Paulo Pinto wrote:

On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:

On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:

[...]


+1

IMO, D in its current state, and with its current ecosystem, 
even after more than a decade of existence, is still NOT the 
best alternative to C/C++ where they HAVE to be used 
(microcontrollers, game engines, etc), even though D has 
always had this objective in mind. And even though C++ is an 
unsafe language which makes it easy to have memory leaks, 
dangling pointers, etc.


[...]


I might add that with the C# 7.x improvements for low level 
memory management, and the effort that Unity is doing with 
their C# subset (HPC# with Burst compiler toolchain) to 
migrate core subsystems from C++ to C#, it gets even harder 
for adoption in the games industry.


https://unity3d.com/unity/features/job-system-ECS

Mike Acton and Andreas Fredriksson left Insomniac Games to 
help drive this effort.


Mike's opinions regarding performance and C vs C++ are very 
well known across the gaming industry, and here he is 
improving C# performance at Unity.


--
Paulo


Yop :)

Orthodox C++ and data-oriented designs have been the basis of 
most new game engines for several years now.


I'm glad that the Unity management has finally decided to 
switch its engine to a more modern architecture, so we can now 
develop our games as everybody else in the industry...




Re: Comparing D vs C++ (wierd behaviour of C++)

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 21:03:00 UTC, Patrick Schluter wrote:

On Tuesday, 24 July 2018 at 19:39:10 UTC, Ecstatic Coder wrote:
He gets different results with and without optimization 
because without optimization the result of the calculation is 
spilled to the i unsigned int and then reloaded for the print 
call. This save and reload truncated the value to its real 
value. In the optimized version, the compiler removed the 
spill and the overflowed value contained in the register is 
printed as is.


Btw you are actually confirming what I said.

if (i != 0xFFFFFFFF) ...

In the optimized version, when the 64 bits "i" value is 
compared to a 32 bits constant, the test fails...


Proof that the value is stored in a **64** bits register, not 
32...


We're nitpicking over vocabulary. For me buffer != register. 
Buffer is something in memory in my mental model (or is 
hardware like the store buffer between register and the cache) 
but never would I denominate a register as a buffer.


Pick the word you prefer, the i value is stored in a 64 bits 
"place", hence the weird behavior.


Re: Comparing D vs C++ (wierd behaviour of C++)

2018-07-24 Thread Ecstatic Coder via Digitalmars-d
He gets different results with and without optimization because 
without optimization the result of the calculation is spilled 
to the i unsigned int and then reloaded for the print call. 
This save and reload truncated the value to its real value. In 
the optimized version, the compiler removed the spill and the 
overflowed value contained in the register is printed as is.


Btw you are actually confirming what I said.

if (i != 0xFFFFFFFF) ...

In the optimized version, when the 64 bits "i" value is compared 
to a 32 bits constant, the test fails...


Proof that the value is stored in a **64** bits register, not 
32...




Re: Comparing D vs C++ (wierd behaviour of C++)

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 15:08:35 UTC, Patrick Schluter wrote:

On Tuesday, 24 July 2018 at 14:41:17 UTC, Ecstatic Coder wrote:

On Tuesday, 24 July 2018 at 14:08:26 UTC, Daniel Kozak wrote:

I am not C++ expert so this seems wierd to me:

#include <iostream>
#include <string>

using namespace std;

int main(int argc, char **argv)
{
char c = 0xFF;
std::string sData = {c,c,c,c};
unsigned int i = ((((sData[0]&0xFF)*256
+ (sData[1]&0xFF))*256
+ (sData[2]&0xFF))*256
+ (sData[3]&0xFF));

if (i != 0xFFFFFFFF) { // it is true why?
// this print 18446744073709551615 wow
std::cout << "WTF: " << i  << std::endl;
}   
return 0;
}

compiled with:
g++ -O2 -Wall  -o "test" "test.cxx"
when compiled with -O0 it works as expected

Vs. D:

import std.stdio;

void main(string[] args)
{
char c = 0xFF;
string sData = [c,c,c,c];
uint i = ((((sData[0]&0xFF)*256
+ (sData[1]&0xFF))*256
+ (sData[2]&0xFF))*256
+ (sData[3]&0xFF));
if (i != 0xFFFFFFFF) { // is false - make sense
writefln("WTF: %d", i);
}   
}

compiled with:
dmd -release -inline -boundscheck=off -w -of"test" "test.d"

So it is code gen bug on c++ side, or there is something 
wrong with that code.


As the C++ char are signed by default, when you accumulate 
several shifted 8 bit -1 into a char result and then store it 
in a 64 bit unsigned buffer, you get -1 in 64 bits : 
18446744073709551615.


That's not exactly what happens here. There's no 64 bit buffer.


Sure about that ? ;)

As "i" is printed as 18446744073709551615 when put into cout, 
I don't see how it couldn't be stored as a uint64...


It's actually -1 stored as a uint64.

This kind of optimizer problem is classic when mixing signed 
and unsigned values into such bit shifting expressions.


This is why you should always cast the signed input values to the 
unsigned result type right from the start before starting to 
mix/shift them.




Re: C's Biggest Mistake on Hacker News

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 16:15:52 UTC, bpr wrote:

On Tuesday, 24 July 2018 at 14:07:43 UTC, Ecstatic Coder wrote:

On Tuesday, 24 July 2018 at 13:23:32 UTC, 12345swordy wrote:
On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder 
wrote:
So, at the moment, I don't see how you can EASILY convince 
people to use BetterC for C/C++ use cases, like programming 
games, microcontrollers, etc.


*Extremely powerful meta programming that blows c++ meta 
programming out of the water

*Clean readable syntax
*No header file nonsense
*Standard keyword for ASM if you really need the performance 
boost.

*Compiler enforce memory safety.

-Alex


I know.

And D's builtin strings/arrays/slices/maps/etc and automatic 
memory deallocation are part of what makes D a better 
alternative to C++ too.


No. For many C++ users, tracing GC is absolutely not an option. 
And, if it were, D's GC is not a shining example of a good GC. 
It's not even precise, and I would bet that it never will be. 
If I'm able to tolerate a GC, there are languages with much 
better GCs than the D one, like Go and Java.


I work in a mostly C++ shop where exceptions are intolerable in 
C++ code, and in many places we use CRTP to eliminate dispatch 
overhead. DasBetterC would be usable here but it's too late 
given the existing investment in C++. Obviously there's no CRTP 
in DasBetterC without struct inheritance, but there are other 
designs to address this issue.


Besides having more betterC libraries, I'd like to see some 
kind of restricted approach to exception handling, like the 
ones being investigated in 
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf. 
If you want a better C++, look at what people who have to use 
C++ use it for, and where the pain points are.


I agree.

What I learnt is that, after having built several realtime 3D 
engines simply using strong/weak references to transparently 
release unused objects, I don't see why such features couldn't 
be integrated in a language like D as a core feature (T,T^,T*), 
instead of being a template library.


This gets the job done, and while not perfect, this remains very 
handy. A cycle detector is only required as a debugging tool.


All you need is 3 kinds of pointers:
- strong reference
- weak reference
- raw pointer

And, unfortunately, more discipline to manage mutual references 
yourself, instead of letting the GC manage that for you. So in 
some cases, having an optional cycle collector can be very 
useful when using D in a Go-like way...




Re: Comparing D vs C++ (wierd behaviour of C++)

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 14:08:26 UTC, Daniel Kozak wrote:

I am not C++ expert so this seems wierd to me:

#include <iostream>
#include <string>

using namespace std;

int main(int argc, char **argv)
{
char c = 0xFF;
std::string sData = {c,c,c,c};
unsigned int i = ((((sData[0]&0xFF)*256
+ (sData[1]&0xFF))*256
+ (sData[2]&0xFF))*256
+ (sData[3]&0xFF));

if (i != 0xFFFFFFFF) { // it is true why?
// this print 18446744073709551615 wow
std::cout << "WTF: " << i  << std::endl;
}   
return 0;
}

compiled with:
g++ -O2 -Wall  -o "test" "test.cxx"
when compiled with -O0 it works as expected

Vs. D:

import std.stdio;

void main(string[] args)
{
char c = 0xFF;
string sData = [c,c,c,c];
uint i = ((((sData[0]&0xFF)*256
+ (sData[1]&0xFF))*256
+ (sData[2]&0xFF))*256
+ (sData[3]&0xFF));
if (i != 0xFFFFFFFF) { // is false - make sense
writefln("WTF: %d", i);
}   
}

compiled with:
dmd -release -inline -boundscheck=off -w -of"test" "test.d"

So it is code gen bug on c++ side, or there is something wrong 
with that code.


As the C++ char are signed by default, when you accumulate 
several shifted 8 bit -1 into a char result and then store it in 
a 64 bit unsigned buffer, you get -1 in 64 bits : 
18446744073709551615.


Re: C's Biggest Mistake on Hacker News

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 13:23:32 UTC, 12345swordy wrote:

On Tuesday, 24 July 2018 at 09:54:37 UTC, Ecstatic Coder wrote:
So, at the moment, I don't see how you can EASILY convince 
people to use BetterC for C/C++ use cases, like programming 
games, microcontrollers, etc.


*Extremely powerful meta programming that blows c++ meta 
programming out of the water

*Clean readable syntax
*No header file nonsense
*Standard keyword for ASM if you really need the performance 
boost.

*Compiler enforce memory safety.

-Alex


I know.

And D's builtin strings/arrays/slices/maps/etc and automatic 
memory deallocation are part of what makes D a better alternative 
to C++ too.


I'm just saying: Kotlin Native automates memory management 
through reference counting with cycle detection.


That solution may have its own drawbacks over a "true" 
traditional garbage collector, but its main advantage is that 
it's transparent. Business as usual...


And IF you need to disable the cycle collector, you can still 
have a TRUE and COMPLETE replacement for C++, by simply using 
weak references to avoid strong reference cycles, just like in 
the provided standard library.


Best of both worlds, no need for a "nogc" standard library, as it 
IS nogc by default, while still providing exactly the same 
functionalities as in the "gc" standard library...




Re: C's Biggest Mistake on Hacker News

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 12:13:27 UTC, Atila Neves wrote:

On Tuesday, 24 July 2018 at 11:53:35 UTC, Ecstatic Coder wrote:

On Tuesday, 24 July 2018 at 10:40:33 UTC, Dukc wrote:

On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:

[...]


They already work, except for the concatenation operator 
because it obviously requires the GC. And converting a 
pointer from C code to D is easy, because you can slice 
pointers just like arrays - it's just that it won't be bounds 
checked.


Nice.

But if you want D to be REALLY appealing to a majority of C++ 
developers, you'd better provide them with the FULL D 
experience.


And unfortunately, using builtin arrays/strings/slices/maps in 
the usual way is probably a big part of it.


Don't forget that concatenating strings is perfectly ALLOWED 
in C++, WITHOUT using a GC...


Same in D, it's just that nobody's bothered writing a string 
class/struct.


Atila


Indeed...


Re: C's Biggest Mistake on Hacker News

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 10:40:33 UTC, Dukc wrote:

On Monday, 23 July 2018 at 15:06:16 UTC, Ecstatic Coder wrote:
And something that REALLY must be integrated into BetterC's 
low-level standard library in some way IMHO...


They already work, except for the concatenation operator 
because it obviously requires the GC. And converting a pointer 
from C code to D is easy, because you can slice pointers just 
like arrays - it's just that it won't be bounds checked.


Nice.

But if you want D to be REALLY appealing to a majority of C++ 
developers, you'd better provide them with the FULL D experience.


And unfortunately, using builtin arrays/strings/slices/maps in 
the usual way is probably a big part of it.


Don't forget that concatenating strings is perfectly ALLOWED 
in C++, WITHOUT using a GC...


#include <iostream>

using namespace std;

int main()
{
string str, str1, str2;
str1 = "Hello";
str2 = "World";
str = str1 + " " + str2;
cout << str << endl;

return 0;
}

Instead of removing D's GC and the features which rely on it, 
you'd better replace it with something which releases the 
unused memory blocks as soon as possible, like the reference 
counting approach used not only in C++, but also in Kotlin 
Native, Crack, etc...


THAT would make D stand above its competition, by making it more 
pleasing and enjoyable to use than C, C++ and Rust for instance 
for their typical use cases...




Re: C's Biggest Mistake on Hacker News

2018-07-24 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 24 July 2018 at 00:41:54 UTC, RhyS wrote:

On Monday, 23 July 2018 at 22:45:15 UTC, Walter Bright wrote:
I've predicted before that what will kill C is managers and 
customers requiring memory safety because unsafeness costs 
them millions. The "just hire better programmers" will never 
work.


I have yet to see a company, Walter, where higher ups will 
take correct actions to resolve issues.


Customers do not understand anything about programming. You're 
lucky if most clients can even get a proper specification 
formulated for what they want. If clients were that 
knowledgeable we would not need to constantly deal with issues 
where clients had things in their heads different than what 
they told / envisioned.


And most managers are not going to rock the boat and stick 
their necks out. Not when they can simply blame issues on 
programmer incompetence or "it has always been like that with 
programming languages". I have yet to see managers really 
taking responsibility beyond guiding the projects so they do 
not get fired and hoping to rake in bonuses. Issues can always 
be blamed on the tools or programmers.


Sorry but that response is so naive, Walter, that it surprises 
me. It's like wanting a unicorn.


And frankly, good luck convincing any company to convert 
millions of lines of C code into D code. Not when managers 
hear about some new language or framework or whatever that is 
the chizz. They rather keep running the old code and move to 
something new. D is not something new, it's not the chizz; 
it's the same issue that D has struggled with for years.


It's the same reason why that topic derailed so fast. You want 
to see something fun? Mention PHP on HackerNews/Reddit and you 
see the exact same trolling. People rather push their new 
favorite language, be it Go, Rust, ..., than pick D.


The response at my work when I made some stuff in D... "Why 
did you not use Go?". Because the managers knew Go from the 
hype. They know Google is behind it. And some of our 
colleagues in sister companies already used Go. And that is 
all it takes.


I am sorry to say, but to succeed as a language beyond being a 
small or hobby language it takes being established already or 
having a big name to hype behind your "product". Anything 
beyond that will have the topic derail and frankly, it's more 
negative than positive.


And D has too much old baggage. It's the same reason why PHP, 
despite being a good language (for what it is), still keeps 
getting the exact same crud on forums.


If I am honest, DasBetterC is for me an unreliable D product, 
because using specific D library functions can be GC. Or 
DasBetterC needs to be sold as C only, ever, forget about 
everything else that is D (library, packages, ...). Until 
everything is 100% GC free, you're going to run into this. And 
even when it's 100% GC free, people have long memories.


It's always a struggle swimming up a river.


+1

IMO, D in its current state, and with its current ecosystem, 
even after more than a decade of existence, is still NOT the 
best alternative to C/C++ where they HAVE to be used 
(microcontrollers, game engines, etc), even though D has 
always had this objective in mind. And even though C++ is an 
unsafe language which makes it easy to have memory leaks, 
dangling pointers, etc.


Because in those cases, most of the time, when you use C or 
C++, it's because you HAVE to, not only because they run fast, 
but also because they can run without a garbage collector. 
Just that simple.


In C++, the memory is immediately released to the allocation 
system by the collections and smart pointers as soon as it's no 
longer used.


This may not be perfect, but this process is continuous and 
predictable. In D, unused memory blocks progressively fill the 
available memory until the non-incremental GC is triggered, 
either automatically or manually. Completely the opposite way. 
Not really appropriate for a "forever" event loop, where the 
unused memory HAS to be released in a continuous way, not once 
in a while.


And when you CAN afford to use a garbage collector, unfortunately 
D is still not the best pick in many use cases.


While D's standard library makes D a great "plug-n-play" 
language for file processing and data analysis, for many other 
use cases, like web development for instance, some very recent 
languages already provide better alternatives "out of the box" 
(Go, Crystal, etc), as they have PLENTY of third party 
libraries (web frameworks, etc) built on top of the SAME 
building blocks provided by the default libraries of those 
languages.


So, at the moment, I don't see how you can EASILY convince people 
to use BetterC for C/C++ use cases, like programming games, 
microcontrollers, etc.


Same if you want to EASILY convince people to start to use D 
today for many Go/C#/Java/etc use cases, like developing online 
services, web sites, etc.


Even though I know that some pioneer companies may have already 
chosen D for those same use cases, and are perfectly

Re: Sutter's ISO C++ Trip Report - The best compliment is when someone else steals your ideas....

2018-07-23 Thread Ecstatic Coder via Digitalmars-d

On Tuesday, 3 July 2018 at 03:27:06 UTC, Ali wrote:
Well, D is not exactly known for contract oriented programming 
or DbC (Design by Contract)
we have to thank Bertrand Meyer and his language Eiffel, for 
that


Thanks for pointing this out !

His book "Object-Oriented Software Construction" is an absolute 
MUST-READ for any decent programmer.


Contracts, large-scale object-oriented architecture, how to 
assign responsibilities to the right class, etc.


Even something as seemingly insignificant as using uppercase 
type names is a complete life changer, as this way you can 
immediately see the role of a single-word identifier just by its 
case.


It was after reading his book, almost three decades ago, that I 
decided to use the following conventions for my personal code :


- PLAYER : type
- Player : member variable
- player : local variable

I still don't understand why people keep adding silly prefixes 
or suffixes ("m_", "_", "this->", etc) to differentiate member 
variables from local variables :


- Player : type
- player_, _player, m_player, this->player, etc : member variable
- player : local variable

... while simply using the identifier's case gets the job done in 
a simpler and more readable way.


IMO, reading this book should be mandatory for any second-year 
student learning professional software development...






Re: C's Biggest Mistake on Hacker News

2018-07-23 Thread Ecstatic Coder via Digitalmars-d

On Monday, 23 July 2018 at 11:51:54 UTC, Jim Balter wrote:

On Sunday, 22 July 2018 at 20:10:27 UTC, Walter Bright wrote:

On 7/21/2018 11:53 PM, Walter Bright wrote:
My article C's Biggest Mistake on front page of 
https://news.ycombinator.com !


Direct link:
https://news.ycombinator.com/item?id=17585357


The responses are not encouraging, but I suppose they're useful 
for sociologists studying fallacious thinking.


I agree.

As I've already said in the past here on this forum, D's way of 
managing strings/arrays/slices in the same manner is one of its 
biggest advances over C/C++, both in safety and expressivity.


Very simple stuff indeed, but still light-years ahead of C++, 
Java, C#, etc.


And something that REALLY must be integrated into BetterC's 
low-level standard library in some way IMHO...


Re: Symmetry Autumn of Code

2018-07-23 Thread Ecstatic Coder via Digitalmars-d-announce

On Monday, 23 July 2018 at 08:08:03 UTC, Mike Franklin wrote:

On Monday, 23 July 2018 at 06:24:04 UTC, Zheng (Vic) Luo wrote:

Moreover, The term "dependency-free" in the project 
description often confuses me, because as a hardware-agnostic 
library the project does have to depend on external 
implementations like "sin"/"memset" or even "thread_start", 
and I'm not sure which kind of dependency is proper for this 
project: Should we assume a multi-threading model? Should this 
library rely on "malloc"/"free"? Correct me if my 
understanding is wrong, since I have little experience with 
embedded programming.


There is more to this project than just getting a software 
rasterizer in D.  Part of the goal is to demonstrate D as a 
formidable alternative to C in microcontroller firmware 
programming.  D will never achieve that notoriety if it's 
always depending on C, the C runtime, the C standard library, 
or some library implemented in C.


So, IMO, if you need to link in a library or object file that 
was not compiled from D code, then you're cheating.  This is 
also one of the reasons why I suggested re-implementing 
software building blocks such as `memcpy`, `memset`, `malloc`, 
`free`, etc. in D as another potential project for the Autumn 
of Code.


So, to keep this software rasterizer project within scope, I 
suggest creating naive implementations of those functions in D 
for now, to stay true to the spirit of the project (no 
dependencies, everything in D), and "make the point".  You can 
put those software building blocks in their own module, and let 
the user of the software rasterizer library link in their own 
implementation if they wish to deviate from the spirit of the 
proposal.


Mike


I agree.

But this BetterC minimalistic standard library (allocations, 
arrays, strings, slices, maps) is something which can be reused 
by many similar hardware-level projects.


This is a project of its own, and as I said, I think it would be 
better to provide it to the candidate, so he can spend his 
development time on developing the rasterizer and, if there is 
enough time, a minimalistic Nuklear-like GUI system on top of it 
to demonstrate its performance and usefulness.




Re: Symmetry Autumn of Code

2018-07-23 Thread Ecstatic Coder via Digitalmars-d-announce

On Monday, 23 July 2018 at 09:09:40 UTC, Mike Franklin wrote:

On Sunday, 22 July 2018 at 17:12:31 UTC, Ecstatic Coder wrote:


2/ Nuklear (https://github.com/vurtun/nuklear)


Reading the documentation for Nuklear, I found this: 
https://rawgit.com/vurtun/nuklear/master/doc/nuklear.html#drawing


To draw all draw commands accumulated over a frame you need 
your own render backend able to draw a number of 2D 
primitives. This includes at least filled and stroked 
rectangles, circles, text, lines, triangles and scissors


That's basically what the Autumn of Code proposal would like to 
have built in D: A rasterizer with fundamental drawing 
primitives.  So, it seems Nuklear is a library intended to be 
built on top of the proposed rasterizer.


Mike


+1

Then I agree that Antigrain is probably the best reference code 
for the antialiased renderer, as its code is small, very complete 
(ttf/gsv/raster fonts, top quality antialiasing, etc) and 
reasonably fast.


IMO the BetterC standard library runtime should be provided to 
the developer in charge of developing that rasterizer.


Re: Symmetry Autumn of Code

2018-07-22 Thread Ecstatic Coder via Digitalmars-d-announce
I'm interested in the "Graphics library for resource 
constrained embedded systems" project and have some spare time 
this autumn, but I have some questions:
- Does this project aim at creating a hardware-agnostic 
rasterizer supporting a few primitives like https://skia.org/, 
or at implementing a full GUI library like emWin, rendering 
widgets and handling I/O events such as mouse input? The latter 
sounds a little bit challenging to finish in four months
- In the past year I primarily wrote C++ and don't have much 
experience with production-level D programming; can I get 
involved in this program?


Thanks


IMHO no need to reinvent the wheel for that.

You can probably do both in four months, if you just "port" 
(separately) and bind the code of the two following libraries :

1/ swGL (https://github.com/h0MER247/swGL)
2/ Nuklear (https://github.com/vurtun/nuklear)

They have a very open design, and are already quite well 
optimized for speed and memory consumption.


Moreover this would allow the D port of the Nuklear library to 
also use a hardware accelerated renderer on desktop platforms.


Nice, isn't it?

And I'd be glad to mentor you on this :)




Re: Symmetry Autumn of Code

2018-07-18 Thread Ecstatic Coder via Digitalmars-d-announce
I've said that if we get signatures, I'll build the damn thing 
myself.
Signatures give a very lightweight vtable implementation while 
also giving a conceptual representation of structs and classes.


Which for an event loop, is a very desirable thing to have. But 
alas, I'm waiting on my named parameter DIP and seeing where 
that goes, before continuing work on signatures.


Thanks for the clear explanations.

Glad to know that you're on this.

I hope the importance of your work for D's competitiveness will 
be truly recognized.


Re: Symmetry Autumn of Code

2018-07-18 Thread Ecstatic Coder via Digitalmars-d-announce
On Wednesday, 18 July 2018 at 03:19:53 UTC, rikki cattermole 
wrote:

On 18/07/2018 5:36 AM, Ecstatic Coder wrote:

On Saturday, 14 July 2018 at 06:02:37 UTC, Mike Parker wrote:
Thanks to the sponsorship of Symmetry Investments, the D 
Language Foundation is happy to announce the Symmetry Autumn 
of Code!


We're looking for three university students to hack on D this 
autumn, from September - January. We're also in search of 
potential mentors and ideas for student projects. Head to the 
Symmetry Autumn of Code page for the details.


Spread the word!

https://dlang.org/blog/symmetry-autumn-of-code/


I'd suggest adding the following to SAOC 2018 project 
proposals :


1/ adding a Go-like http module to the standard library
2/ adding Go-like async IO management to the standard library, 
i.e. fibers communicating through blocking channels


Until we get an event loop in druntime, both of these options 
are off the table.


Sad.

Then I'd suggest adding the event loop implementation to SAOC 
2018 too. The absence of a default http module in D's standard 
library may have very good justifications, but I'm still 
convinced that it doesn't help when trying to "sell" D to modern 
developers, considering that nowadays MANY of the applications 
they develop in a professional setting have to integrate HTTP 
code to access or update the company's data.


Re: Symmetry Autumn of Code

2018-07-17 Thread Ecstatic Coder via Digitalmars-d-announce

On Saturday, 14 July 2018 at 06:02:37 UTC, Mike Parker wrote:
Thanks to the sponsorship of Symmetry Investments, the D 
Language Foundation is happy to announce the Symmetry Autumn of 
Code!


We're looking for three university students to hack on D this 
autumn, from September - January. We're also in search of 
potential mentors and ideas for student projects. Head to the 
Symmetry Autumn of Code page for the details.


Spread the word!

https://dlang.org/blog/symmetry-autumn-of-code/


I'd suggest adding the following to SAOC 2018 project proposals :

1/ adding a Go-like http module to the standard library
2/ adding Go-like async IO management to the standard library, 
i.e. fibers communicating through blocking channels
3/ possibility to use automatic reference counting (with weak 
references) instead of garbage collection for automatic unused 
memory deallocation
4/ adding automatic cycle detection and collection to the 
automatic reference counting system


(https://wiki.dlang.org/SAOC_2018_ideas)

Thanks :)



Re: Funding code-d

2018-07-17 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 13 July 2018 at 15:05:05 UTC, Michael wrote:

On Friday, 13 July 2018 at 14:20:19 UTC, Mike Parker wrote:
As promised in my tweet of June 30 (and to the handful of 
people who emailed me), the cloud of mystery surrounding the 
use of the money raised for code-d and its supporting tools 
has now been (partially) lifted!


In this post, I lay out the details of how the first $1000 
will be paid out to project maintainer Jan Jurzitza, a.k.a 
Webfreak001, and explain what we hope to achieve with this 
ecosystem fundraising initiative going forward.


This time around, it all came together in the background of 
prepping for DConf with little forethought beyond activating 
an Open Collective goal and then working with Jan to determine 
the details. Lessons were learned. Later this year, you'll see 
the result when we announce the next of what we hope to be an 
ongoing series of funding targets.


In the meantime:

The blog
https://dlang.org/blog/2018/07/13/funding-code-d/

Reddit
https://www.reddit.com/r/d_language/comments/8yka7b/funding_coded_the_d_blog/


I think this is a worthy cause for the money. I'm glad to see 
the D foundation looking more towards investing in these kinds 
of community projects, as they make up the D ecosystem that 
many opponents of D describe as lacking.


Yeah, indeed all of my friends who tried D and were not convinced 
that it's a good alternative to Go/C++/etc were pushed away by 
the poor support in Visual Studio Code. Even those who didn't use 
Visual Studio Code (most of them, actually).


Like being able to automatically make a foreach loop parallel, 
for instance. Invaluable...


Keep up the good work and let's invest still more money in 
extending further this fantastic plugin.


PS: Joking... ;)


Re: Funding code-d

2018-07-17 Thread Ecstatic Coder via Digitalmars-d-announce

On Saturday, 14 July 2018 at 16:19:29 UTC, Joakim wrote:

On Friday, 13 July 2018 at 14:20:19 UTC, Mike Parker wrote:
As promised in my tweet of June 30 (and to the handful of 
people who emailed me), the cloud of mystery surrounding the 
use of the money raised for code-d and its supporting tools 
has now been (partially) lifted!


In this post, I lay out the details of how the first $1000 
will be paid out to project maintainer Jan Jurzitza, a.k.a 
Webfreak001, and explain what we hope to achieve with this 
ecosystem fundraising initiative going forward.


This time around, it all came together in the background of 
prepping for DConf with little forethought beyond activating 
an Open Collective goal and then working with Jan to determine 
the details. Lessons were learned. Later this year, you'll see 
the result when we announce the next of what we hope to be an 
ongoing series of funding targets.


In the meantime:

The blog
https://dlang.org/blog/2018/07/13/funding-code-d/

Reddit
https://www.reddit.com/r/d_language/comments/8yka7b/funding_coded_the_d_blog/


Nice explanation of the plan, really needed. Why GitHub never 
rolled out such a bounty program for OSS and other public 
projects has to be one of the head-scratching moves of all 
time; no wonder they were about to run out of money before they 
sold.


A good way to decide on future projects would be to let 
prospective donors stake money on various proposals, to see how 
much backing they might receive, sort of like how kickstarter 
and other crowdfunding sites work.


+1

May I suggest the two following improvements for the next 
proposals :


1/ integrating a Go-like web server code inside the default 
library (http module, fiber and channel async IO)
2/ possibility to use automatic reference counting (with weak 
references and optional cycle detection) instead of garbage 
collection for automatic unused memory deallocation


The first one to help D compete on the same grounds as Go and 
Crystal, and the second to make it usable in the same GC-unwanted 
use cases where people currently use C or C++.


Probably just a silly idea, please feel free to completely ignore 
it...


PS: Geany is also a VERY nice multi-platform IDE to develop in 
C++ and D on Linux, Windows and Mac, for those who still don't 
know it...


Re: I have a plan.. I really DO

2018-07-16 Thread Ecstatic Coder via Digitalmars-d-announce

On Monday, 16 July 2018 at 07:49:33 UTC, Kagamin wrote:

On Friday, 13 July 2018 at 19:30:07 UTC, RhyS wrote:
If there is a language out there that gaps that C/Java/dynamic 
fast and easy feel, and offers the ability to compile down 
with ease. I have not seen it.


There's no silver bullet, you can choose from what exists or 
create your own.


If D could be used with automatic reference counting (with native 
weak references and OPTIONAL automatic cycle collection), while 
remaining easy to interoperate with C++, it could be a good 
candidate...


I mean having something like this :

- T* : pointer to any type (scalar, struct, class)
- T : strong reference to a class
- T^ : weak reference to a class

And have --arc and --acc as compiler options.

PS: I know this won't happen unless I implement it myself, I got 
the message... ;)


Re: I have a plan.. I really DO

2018-07-13 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 13 July 2018 at 19:30:07 UTC, RhyS wrote:

On Friday, 13 July 2018 at 13:15:07 UTC, Ecstatic Coder wrote:
At the moment, developing in Rust can be quite painful because 
of too much focus on its borrow checker, as the reference 
counting system is just a side feature, which is not deeply 
integrated into the language.


And Go suffers from its own problems, mainly related to the 
excessive limitation of the language features (no genericity, 
"fake" class inheritance, etc).


Those are big items, but it's the small stuff that frustrates 
more. Just deal with some database result fetching: in dynamic 
languages that is maybe 5 lines of code; Go makes it 4 or 5 
times as big. It's just a bit too unwieldy.


De facto they are already making room for another language to 
ultimately fill those gaps...


This may be Crystal, D or another yet to come language...


Crystal maybe ... but the link to Ruby / RoR does create a bit of 
an artificial barrier. I do notice that Ruby ( not Rails ) is 
getting more recognition these days.


D ... I am being honest, but I do not see it. D really has a lot 
going for it, but frankly, the missing default HTTP server is 
just silly these days. And no, again, Vibe.D is not a good 
alternative when it breaks on just about every D release or 
does not handle multi-threading correctly ( look up the 
documentation: out of date and full of useless information ).


What I personally miss is a compiled language that simply gets 
the job done.


Take PHP, for instance: horrible issues ( a lot fewer now, as 
they cleaned up a lot over the years ), but its most redeeming 
feature is that it gets the job done. It does not force you into 
a specific pattern, it's fast to get visual results, its 
backward compatibility is impressive ( hint hint, D ), and it 
just works out of the box with ease.


JavaScript ( the newer ES versions + Node ) also matches this more.

D looks usable at first for people coming from dynamic 
languages, but they are quickly overwhelmed by the whole C/C++ 
focus.


Crystal is bridging that gap, but it's still more or less Ruby. 
So it needs to deal with some of the reputation issues.


Where is our Java / C-like alternative? Swift? Unfortunately 
Apple has no interest outside of its own platform, and Linux 
support is spotty.


Kotlin/Native? It's moving fast, and most people do not realize 
this. But it's a long way from finished.


Zig? Kind of a C alternative.


If there is a language out there that gaps that C/Java/dynamic 
fast and easy feel, and offers the ability to compile down with 
ease. I have not seen it.


Indeed, Kotlin/Native is becoming VERY impressive these days, as 
it will be usable both for server-side and client-side 
development, including developing mobile applications for iOS 
and Android.


https://github.com/jetbrains/kotlinconf-spinner

One other very promising language, which almost nobody knows, is 
Crack, as it's quite performant and could be used to implement 
practically anything (web servers, games, etc), as it uses 
automatic reference counting instead of garbage collection.


Sadly, it has absolutely no community (45 GitHub stars, including 
mine), and thus will probably stagnate in its current unfinished 
state (no weak references, fibers, channels, etc).




Re: I have a plan.. I really DO

2018-07-13 Thread Ecstatic Coder via Digitalmars-d-announce

On Thursday, 12 July 2018 at 12:07:55 UTC, wjoe wrote:

On Tuesday, 10 July 2018 at 17:25:11 UTC, Yuxuan Shui wrote:
Whether or not rust, go, etc. are just as or more popular than 
C++ or Java in 30 years remains to be seen.


Rust and Go have their strengths, but also suffer from serious 
usability flaws, so I'm not sure they can become as predominant 
as C++ in the years to come.


At the moment, developing in Rust can be quite painful because of 
too much focus on its borrow checker, as the reference counting 
system is just a side feature, which is not deeply integrated 
into the language.


And Go suffers from its own problems, mainly related to the 
excessive limitation of the language features (no genericity, 
"fake" class inheritance, etc).


De facto they are already making room for another language to 
ultimately fill those gaps...


This may be Crystal, D or another yet to come language...


Re: I have a plan.. I really DO

2018-07-11 Thread Ecstatic Coder via Digitalmars-d-announce
This is one of the things about open source / volunteer 
projects that may or may not be a good thing (it can be argued 
both ways).  Since people aren't getting paid to do grunt work, 
if nobody steps up to the plate to fix an issue, it will either 
just sit there forever, or it will fall upon Walter and Andrei 
to get to it, which, given how much is already on their plate, 
will take a very, very long time.  And people will just work on 
whatever interests them.  Happy D users who don't find any 
problems (for THEIR use case) won't have much motivation to 
contribute to something that doesn't directly benefit them (or 
they don't even use it).  Unhappy D users who *do* find a 
problem will either step up and fix it and contribute it so 
that the rest of the community benefits, or they will choose 
not to participate, in which case nothing happens.


I'm not trying to justify this situation, but having observed 
how things work around here for the past many years, that's 
just the way things work.  Either somebody gets ticked off 
enough to actually do something about an issue, resulting in 
all-round benefits, or they just do nothing, and nothing 
happens. (Complaining in the forums doesn't count, since it has 
been proven time and time again that this almost never leads to 
any actual change.)  This is unlike how most commercially 
driven projects work, for obvious reasons, and for better or 
for worse, that's what we have to deal with. (Personally I 
think this is actually a good thing, but I'm guessing many 
people will disagree.)


So saying "wouldn't it be much more effective that the D 
experts of this forum simply fix the open source code" 
ultimately won't lead to much change, for better or for worse.  
*Somebody* has to step up to do it. Expecting somebody else to 
spend their unpaid volunteer time to work on something that may 
not really interest them is, to say the least, unrealistic.  
The solution, as Walter says, is to "be the change you want to 
see".


I agree. And I must admit that from that point of view I'm indeed 
part of the problem...


Re: I have a plan.. I really DO

2018-07-11 Thread Ecstatic Coder via Digitalmars-d-announce

On Tuesday, 10 July 2018 at 18:20:27 UTC, H. S. Teoh wrote:
On Tue, Jul 10, 2018 at 05:25:11PM +, Yuxuan Shui via 
Digitalmars-d-announce wrote:

On Friday, 6 July 2018 at 21:15:46 UTC, H. S. Teoh wrote:

[...]
> Of course, for someone looking for an excuse not to use D, 
> they will always find another reason why this is not 
> sufficient. But that only strengthens the point that the GC 
> is just a convenient excuse not to use D.


Not a good excuse to not fix GC, though.


Of course.  The current GC, while decent, does leave lots of 
room for improvement.  Unfortunately, while much talked about, 
not many people are willing to actually put in the work to 
improve it.  So I'm not really interested in generating more 
talk, as opposed to action.



> Solve that problem, and they will just move on to the next 
> excuse, because the GC is not the real reason; the real 
> reason is probably non-technical. Like good ole inertia: 
> people are lazy and set in their ways, and resist changing 
> what they've grown comfortable with. But actually admitting 
> this would make them look bad, so it is easier to find a 
> convenient excuse like the GC (or whatever else is different 
> from the status quo).


If that's the case, then we are doomed. We might just as well 
forget about getting popular, and instead spend time making 
the language better.


I have always been skeptical of popularity.  It is neither a 
necessary nor sufficient condition for improved language 
quality.  That's not to say we should not invest effort in 
marketing D... but popularity does not imply technical 
superiority, and the only reason I'm here is because of D's 
technical superiority.




Like fixing the GC.


Nobody argues *against* fixing the GC.  But, who will actually 
do it? As opposed to the crowds who are very willing to only 
talk about it.



(Although I don't quite agree with you. Some people DO resist 
change, that's why some decades old languages are still 
popular. But look at the popularity of new languages like Go, 
and Rust, and the ever-change landscape of front-end 
development. There're tons of people who adapt certain 
technology just because it is new, why can't that happen to D?)

[...]

Those who adopt technology merely because it's new are what I 
call the bandwagon jumpers. They will flock to the next brand 
new thing, and then just as readily leave in droves once the 
novelty has worn off. They are unreliable customers, and I 
wouldn't build a business based on their continuing support.  
Again, popularity is orthogonal to technical excellence.



T


Except for Crystal, I think that D is superior to many languages 
in *ease of use* and *expressivity*, and I really like it a lot 
for that.


But for technical aspects like performance, very honestly, I'm 
still not sure of its technical superiority over similar 
languages.


For instance, I'm personally convinced that a Go web server can 
often beat its vibe.d equivalent in technical aspects like raw 
performance, memory consumption, multi-core usage, etc.


And even if benchmarks are always to be interpreted cautiously, 
when several of them lead to exactly the same conclusion as my 
own tests, and with such big margins, it's very hard to 
completely ignore them.


Just have a look at this one, which is quite famous :

https://www.techempower.com/benchmarks/

I know that many people here will simply tell me that all those 
personal and external benchmarks are all wrong, etc.


Maybe you are right.

But in terms of communication, wouldn't it be much more effective 
for the D experts of this forum to simply fix the open-source 
code of those benchmarks, to make D's technical superiority much 
more obvious? That way, the decision makers of software 
development companies, who stupidly use the information from such 
benchmarks when investigating alternative technologies, could 
more easily suggest to their leadership to switch to D.




Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

 On Friday, 6 July 2018 at 21:15:46 UTC, H. S. Teoh wrote:
On Fri, Jul 06, 2018 at 08:16:36PM +, Ecstatic Coder via 
Digitalmars-d-announce wrote: [...]
I've never said that this is something smart to do. I'm just 
saying that this code can perfectly be executed once in a C++ 
game frame without having to worry for a game freeze, because 
the string buffer deallocation is done once per frame too.


While with many GC languages, you actually DON'T KNOW when all 
those unused string buffers will be claimed.

[...]

As I've already repeated twice, this is not true in D. You 
*can* predict precisely when the GC runs a collection cycle by 
calling GC.disable and then calling GC.collect according to 
*your* own schedule.  This is not just a theoretical thing.  I 
have actually done this in my own projects, and it does work.


Of course, for someone looking for an excuse not to use D, they 
will always find another reason why this is not sufficient. But 
that only strengthens the point that the GC is just a 
convenient excuse not to use D. Solve that problem, and they 
will just move on to the next excuse, because the GC is not the 
real reason; the real reason is probably non-technical. Like 
good ole inertia: people are lazy and set in their ways, and 
resist changing what they've grown comfortable with. But 
actually admitting this would make them look bad, so it is 
easier to find a convenient excuse like the GC (or whatever 
else is different from the status quo).



T


+1


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 19:22:13 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 17:59:27 UTC, Ecstatic Coder wrote:


While ANY C++ game can make ANY number of 
allocations/deallocations inside a game loop and still run 
without risking any freeze.

You are doing something very wrong if you are doing this.

-Alexander


Just try it.
For what rhyme or reason!? You shouldn't be allocating and 
deallocating inside a critical loop in the first place!
Regardless, people have shown you solutions regarding string 
concatenation. Are you going to address that, or are you just 
going to ignore them?


-Alexander


Pfff, it was just an EXAMPLE of how some insignificant string 
concatenation code may eventually be a problem in any GC language 
even if it's done only once per frame.


I've never said that this is something smart to do. I'm just 
saying that this code can perfectly be executed once in a C++ 
game frame without having to worry for a game freeze, because the 
string buffer deallocation is done once per frame too.


While with many GC languages, you actually DON'T KNOW when all 
those unused string buffers will be claimed.


This ignorance is, in my opinion, the root of this "phobia".

If you disagree with me, fine. No problem. Maybe I'm wrong.

But this is my opinion. Please feel free to ignore it.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 19:27:51 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 19:22:13 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 17:59:27 UTC, Ecstatic Coder wrote:


While ANY C++ game can make ANY number of 
allocations/deallocations inside a game loop and still run 
without risking any freeze.

You are doing something very wrong if you are doing this.

-Alexander


Just try it.
For what rhyme or reason!? You shouldn't be allocating and 
deallocating inside a critical loop in the first place!
Regardless, people have shown you solutions regarding string 
concatenation. Are you going to address that, or are you just 
going to ignore them?


-Alexander


Also when I used the word phobia I was pretty sure that I was 
referring to irrational fear of things. Big emphasis on the 
word "irrational".


-Alexander


Irrational would mean that it would be impossible to have a GC 
freeze because of just one string concatenation during the game 
loop of a garbage-collected language.




Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 19:56:23 UTC, JN wrote:

On Friday, 6 July 2018 at 18:19:08 UTC, Ecstatic Coder wrote:
Because in C++, smart pointers and collections will make sure 
to free unused memory blocks as soon as they need to, and no 
later.


I bet if D was reference counted from the start, C++ 
programmers would complain about "smart pointer overhead" and 
how ref counting is too slow for games/real time and you should 
be able to manage your memory yourself.


Probably ;)


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 17:58:46 UTC, bachmeier wrote:

On Friday, 6 July 2018 at 15:53:56 UTC, Ecstatic Coder wrote:

With D, ANY forgotten allocation during the game loop (and I 
really mean even JUST ONE hidden allocation somewhere in the 
whole game or engine), may cause the game to regularly freeze 
at the wrong time, because of an unwanted GC. Hence the phobia.


This program

import std.conv, std.stdio;

@nogc void main() {
    int point_count = 3;
    string score = point_count.to!string() ~ " POINTS";
    writeln(score);
}

provides this compiler output

nogc.d(5): Error: @nogc function 'D main' cannot call non-@nogc 
function 'std.conv.to!string.to!int.to'
nogc.d(5): Error: cannot use operator ~ in @nogc function 'D 
main'
nogc.d(6): Error: @nogc function 'D main' cannot call non-@nogc 
function 'std.stdio.writeln!string.writeln'


Are you saying there are bugs in the @nogc implementation? 
Otherwise I don't see how you will end up with a forgotten 
allocation.


I agree.

But that feature is not something present in all garbage 
collected languages.
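For reference, the @nogc-rejected snippet above can be made to compile by formatting into a fixed buffer with C's snprintf instead of `to!string` and `~` (a sketch; the buffer size is an arbitrary choice):

```d
import core.stdc.stdio : printf, snprintf;

@nogc nothrow void main()
{
    int point_count = 3;
    char[32] score;  // fixed stack buffer, no GC allocation
    int length = snprintf(score.ptr, score.length, "%d POINTS", point_count);
    printf("%.*s\n", length, score.ptr);  // prints "3 POINTS"
}
```

The core.stdc functions are declared @nogc nothrow, so the whole program passes the check.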


The point of my "naive" text score code is that you can trust 
the C++ version to deallocate the destroyed string buffers "on 
the fly".


Because in C++, smart pointers and collections will make sure to 
free unused memory blocks as soon as they need to, and no later.


For the garbage collected language version, it's up to the 
garbage collector to decide when and how this memory will be 
reclaimed. So sometimes this may happen at the wrong time too...


So I'm not saying that D can't work with the GC disabled, etc.

I'm saying that you will find it hard to convince many C++ game 
developers that they can make a few allocations within a game 
loop in a garbage collected language like Java, C#, etc, and not 
have to worry about that.


And by saying "just disable the garbage collector", you are 
convincing them even more of that, instead of the contrary.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 17:43:29 UTC, JN wrote:

On Friday, 6 July 2018 at 17:26:26 UTC, wjoe wrote:

On Friday, 6 July 2018 at 15:53:56 UTC, Ecstatic Coder wrote:
With D, ANY forgotten allocation during the game loop (and I 
really mean even JUST ONE hidden allocation somewhere in the 
whole game or engine), may cause the game to regularly freeze 
at the wrong time, because of an unwanted GC. Hence the 
phobia.


You make it sound like a C++ game codes, debugs, profiles and 
optimizes itself.

And like there are no gotchas with C++.

Anyway, I know I'm on a D forum here, so "those who don't 
want to understand won't, and those who want will", to 
paraphrase a former poster here.


Well, it appears that you don't.

And since you point out the D forum folks, I know game 
developers are a very special lot, too, with their mantra-like 
repetition of "GC is the devil", acting like it's 1985 and they 
need to count clock cycles, and a top-of-the-food-chain 
I-never-make-mistakes arrogance, as if nobody else knows how to 
write fast code. Yet most games of those clever guys are 
bug-ridden pieces of poor quality even years after release, 
including top AAA titles *cough* TES. Despite - or more likely 
because of - being made in C++.


Maybe performance aware game developers would do well to also 
adopt the idea of code quality and D offers a lot in that 
regard. Plus C++ish performance on top of that.


Yeah. There are plenty of games done in GC languages. C++ folks 
want to use C++. The ones that wanted to switch, switched 
already. Even if nogc gets more mature, they will find another 
excuse. Probably something like "yeah but now I don't know 
which parts of the language and library I can use and it's 
awkward to put nogc everywhere".


I do some free time game development work in various languages, 
even GC ones and the existence of GC was never a big issue for 
me. Sure, I am not a super mighty C++ programmer, so I don't 
know much, but for me it's more important as a gamedev to have 
features such as operator overloading, value types/be able to 
cast Vector3f[] to float[] without copying (something C/C++/D 
can do, for example Java can't do, C# can partially do that 
with LayoutKind.Sequential), accessibility of C bindings for 
popular libraries like SDL, SFML, ODE.


nogc, betterC, interfacing to C++ - at most they get a "hmm, 
that's interesting", but I haven't really seen them bring 
people to D. And I'll take a fun and convenient language over a 
performant one any day.


As I said, I wanted to explain the roots of the GC phobia for 
some C++ developers.


If you don't agree when I said that one allocation during a C++ 
game loop is no problem, which one allocation during a GC 
language game loop MAY eventually become a problem, that's fine 
by me.


But as many people here told me to "disable the GC" to completely 
avoid this potential risk of game freeze, I guess that all those 
D experts are also wrong in giving me this advice.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 17:22:15 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 17:16:54 UTC, Ecstatic Coder wrote:
Are you seriously going to ignore video games that are 
entirely implemented in GC-focused languages such as C#/Java?! 
The GC is NOT AN ISSUE IF YOU KNOW WHAT YOU ARE DOING!


-Alexander


+1


You are starting to remind me of another person who pulls these 
kinds of stunts. Which is not a good thing btw, as that guy is a 
notorious troll.




So if I agree with you, then I'm a troll ?

While ANY C++ game can make ANY number of 
allocations/deallocations inside a game loop and still run 
without risking any freeze.

You are doing something very wrong if you are doing this.

-Alexander


Just try it.

Inside the game loop, add a for loop which evaluates the score 
text 100 times as explained above.


Or even 1000 times.

This means roughly 2000 allocations and 1999 deallocations.

Your frame rate will suffer from this, which is bad (and as 
such should absolutely be avoided), but there is zero risk of a 
garbage collection freeze.


Then add ONE instance of the "naive" text score concatenation 
in the game loop of a garbage collected language, and you expose 
yourself to a RISK of a garbage collection freeze. Just because 
of ONE string concatenation, executed only once per frame, which 
is something that could be tolerated in a C++ game.






Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce
Are you seriously going to ignore video games that are entirely 
implemented in GC-focused languages such as C#/Java?! The GC is 
NOT AN ISSUE IF YOU KNOW WHAT YOU ARE DOING!


-Alexander


+1

Indeed, ABSOLUTELY NO garbage collection will happen during the 
game loop if 100% of your GC-language code doesn't make any 
string concatenation, object allocation, etc.


While ANY C++ game can make ANY number of allocations/deallocations 
inside a game loop and still run without risking any freeze. It 
will probably run slower than it should, so you'd better not make 
too many of them ;)


But the game won't freeze.

C++ is allocation/deallocation tolerant to a reasonable extent.

GC languages aren't. You eventually have to reach the ZERO 
allocation limit, or you expose yourself to unwanted game freezes.





Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 16:48:17 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 16:45:41 UTC, Ecstatic Coder wrote:

On Friday, 6 July 2018 at 16:33:19 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 15:19:33 UTC, Ecstatic Coder wrote:

For C++, the answer is : never.


...Yeah, I had already figured out what you're aiming at. For 
C++ the correct answer is "I do not know, as I don't know how 
it is implemented". You act like there aren't any GC libraries 
for C++.


-Alex


LOL

Unless you implement your game in managed-C++, I don't think 
there is much to worry about that though...


Your comparison is logically fallacious to begin with.

-Alex


I was just trying to explain why C++ developers have GC phobia 
through a very simple example.


Even the simplest string concatenation in any garbage collected 
language (like Java, etc.) can be the cause of a serious game 
freeze, which most players (including me) won't tolerate for long.


Even one tiny allocation which is hidden deep somewhere in an 
external library of some sort...


But it was obviously pointless to try to explain it on this D 
forum. I understand it now.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce
Also if your concatenate string in a loop in c# then you use 
the https://www.dotnetperls.com/string-join function as it 
simpler and faster.
There is no reason why we can't have the function equivalent in 
D.


-Alexander


Yeah I know, this code was DELIBERATELY naive.

That was the whole point of it.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 16:33:19 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 15:19:33 UTC, Ecstatic Coder wrote:

For C++, the answer is : never.


...Yeah, I had already figured out what you're aiming at. For C++ 
the correct answer is "I do not know, as I don't know how it is 
implemented". You act like there aren't any GC libraries for C++.


-Alex


LOL

Unless you implement your game in managed-C++, I don't think 
there is much to worry about that though...





Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

But then of course, you need to avoid a lot of D niceties.


Unfortunately, in my case this is the exact moment where D loses 
a LOT of its shininess compared to C++.


The balance is no longer as much in favor of D as it was before, 
because it's "standard" D code that is so much more convenient 
than C++ in many situations, especially when implementing file 
processing scripts.


This is why I think that even C++ developers who use D as a file 
processing language (like me) will still stick to C++ for their 
game engine, even if they would probably be more than happy to be 
able to use *STANDARD* D code instead...




Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 14:14:27 UTC, rikki cattermole wrote:

On 07/07/2018 2:11 AM, Ecstatic Coder wrote:

On Friday, 6 July 2018 at 13:50:37 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 13:15:43 UTC, Ecstatic Coder wrote:

LOL

Ok, if I'm wrong, then this means D is already a perfect 
replacement to C++, especially for game development.


Just by curiosity, can you tell me how many successful 
commercial games based on a D game engine are released each 
year ?


Or just this year maybe...


No triple AAA engine is going to switch to D for the 
following reasons:

1.)Cost vs benefit from converting C++ to D.
2.)Gamers do not care how things are implemented, they want 
results.
3.)There is a high abundance of C++ programmers available to 
hire. I can't say the same thing for D.

4.)GC phobia.(The notorious culprit)


-Alex


+1

Just one silly question.

Can the following "naive" D code trigger a garbage collection 
stall ?


score.Text = point_count.to!string() ~ " POINTS";


If the GC has been disabled (which any sane performance caring 
application should do) no.
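For concreteness, "disabling the GC" in D is a simple runtime call, and collections can then be deferred to safe points (a sketch; the loading-screen placement is only an illustration):

```d
import core.memory : GC;

void main()
{
    GC.disable();   // no automatic collections from here on
    // ... run the game loop: allocations still succeed,
    //     the heap simply grows until we allow a collection ...
    GC.enable();
    GC.collect();   // reclaim garbage at a safe point, e.g. a loading screen
}
```

Allocation itself still works while disabled; only the collection pauses are postponed.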


Yeah, I know, I'm not silly.

I meant, "if you use standard D code in a game (i.e. with the GC 
enabled), the game may stall, but if you use standard C++ code in 
a game, the game may be a bit less performant".


In C++ you don't have to disable anything, and you can still use 
the standard C++ library to make your game if you want to.


With D, I CAN'T use the language and its standard library as 
usual, just because of the GC "phobia".


Which would be the #1 problem for me, because "standard" D is 
perfect to me, as much as "standard" C++ is nice to me.


That's my point.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce
Of course, the answer in C++ is that it won't compile, this is 
D code! ;)


Seriously ?

I wrote : "And what about the same code in C++ ?"

I thought people on this forum were smart enough to understand 
"the C++ port of this D code".


I'm sorry to have been wrong on this.

Anyway, what nobody here *wants* to understand is that such 
"NAIVE" C++ string code may not be performant, but in C++, even 
if you make allocations/deallocations during the game loop, it 
is only bad for the game's performance, and that's all.


With D, ANY forgotten allocation during the game loop (and I 
really mean even JUST ONE hidden allocation somewhere in the 
whole game or engine), may cause the game to regularly freeze at 
the wrong time, because of an unwanted GC. Hence the phobia.


Anyway, I know I'm on a D forum here, so "those who don't want to 
understand won't, and those who want will", to paraphrase a 
former poster here.


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 15:07:41 UTC, wjoe wrote:

On Friday, 6 July 2018 at 13:15:43 UTC, Ecstatic Coder wrote:
Just by curiosity, can you tell me how many successful 
commercial games based on a D game engine are released each 
year ?


Just out of curiosity, how many games have been released based 
on a C++ game engine in 1998 ?



The original Unreal engine was almost completely written in 
asm, back in the late 90s.


The first C++ game engine I found was the Object-Oriented 
Graphics Rendering Engine (OGRE), started some time around 2001.


Carmack resisted C++ for a longer time, and I believe I read 
something about the engine being ported to C++ when they 
developed id Tech 4 around 2004.


Actually, as I said, even today many game engines are still 
written in a C-inspired manner, i.e. C + classes, templates and 
polymorphism, mainly for performance reasons (cache friendly data 
oriented designs, etc).


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 14:52:46 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 14:11:05 UTC, Ecstatic Coder wrote:

On Friday, 6 July 2018 at 13:50:37 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 13:15:43 UTC, Ecstatic Coder wrote:

[...]


No triple AAA engine is going to switch to D for the 
following reasons:

1.)Cost vs benefit from converting C++ to D.
2.)Gamers do not care how things are implemented, they want 
results.
3.)There is a high abundance of C++ programmers available to 
hire. I can't say the same thing for D.

4.)GC phobia.(The notorious culprit)


-Alex


+1

Just one silly question.

Can the following "naive" D code trigger a garbage collection 
stall ?


score.Text = point_count.to!string() ~ " POINTS";

The correct answer is: I don't know, as I don't know what 
"point_count" is in the first place, as it has never been defined.


-Alex


Actually, your answer was right even if the point count was not 
stored as an integer ;)


For C++, the answer is : never.

Two small memory blocks will have to be allocated from the memory 
pool, which is not smart, obviously, but apart from that, nothing 
to worry about.


Because there is no garbage collector in C++, memory has to be 
allocated and deallocated in a continuous manner...


Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 6 July 2018 at 13:50:37 UTC, 12345swordy wrote:

On Friday, 6 July 2018 at 13:15:43 UTC, Ecstatic Coder wrote:

LOL

Ok, if I'm wrong, then this means D is already a perfect 
replacement to C++, especially for game development.


Just by curiosity, can you tell me how many successful 
commercial games based on a D game engine are released each 
year ?


Or just this year maybe...


No triple AAA engine is going to switch to D for the following 
reasons:

1.)Cost vs benefit from converting C++ to D.
2.)Gamers do not care how things are implemented, they want 
results.
3.)There is a high abundance of C++ programmers available to 
hire. I can't say the same thing for D.

4.)GC phobia.(The notorious culprit)


-Alex


+1

Just one silly question.

Can the following "naive" D code trigger a garbage collection 
stall ?


score.Text = point_count.to!string() ~ " POINTS";

And what about the same code in C++ ?

Now guess why there is such a "phobia"...



Re: I have a plan.. I really DO

2018-07-06 Thread Ecstatic Coder via Digitalmars-d-announce

LOL

Ok, if I'm wrong, then this means D is already a perfect 
replacement to C++, especially for game development.


Just by curiosity, can you tell me how many successful commercial 
games based on a D game engine are released each year ?


Or just this year maybe...


Re: I have a plan.. I really DO

2018-07-04 Thread Ecstatic Coder via Digitalmars-d-announce
Exactly.  As Walter has said before, (and I paraphrase,) it's 
far more profitable to cater to *existing* customers who are 
already using your product, to make their experience better, 
than to bend over backwards to satisfy the critical crowd who 
points at issue X and claim that they would not use D because 
of X.  But X is not the *real* reason they don't want to use D; 
it's just an excuse.  Once you solve problem X, they will find 
issue Y and say *that* is the reason they're still not using D. 
And if you solve Y, they will find issue Z.  It never ends, and 
you're wasting your efforts on non-customers who will *never* 
become customers. Why bother?  Far better to improve things for 
existing customers (who may then get you new customers by 
word-of-mouth of their success stories -- *eager* new customers 
who aren't just looking for the next excuse not to use D).


+1

For instance, to be a perfect C++ alternative, D would probably 
need to be 100% :

1. usable (strings, slices, etc) without GC
2. interoperable with any existing C++ library

And for game development :
3. compilable on all game development platforms 
(Win/Mac/Linux/Android/iOS/Switch/PS4/etc)


I don't know if this can be achieved, or if this is really worth 
the effort.


Re: I have a plan.. I really DO

2018-07-04 Thread Ecstatic Coder via Digitalmars-d-announce

On Wednesday, 4 July 2018 at 18:05:15 UTC, wjoe wrote:

On Wednesday, 4 July 2018 at 08:50:57 UTC, Ecstatic Coder wrote:
But indeed, being able to use D in a GC-free environment (like 
C++ and Rust do) would be something many people may NEED, for 
instance to be able to EASILY use D for soft-realtime 
applications like games.


This has to be the no. 1 excuse.


Why is C++ the language of choice currently? My bet is 
productivity and economic concerns. Amongst other things the 
productivity gain from resource management via constructor and 
destructor. Which solves like 75% of the headaches of manual 
resource management and goto nightmares.


Back in the day when C was used to make games, the excuse not 
to use C++ was vtable, exception and RTTI overhead. Now it's 
called the bare metal best performance language which 
everything and their grandma is measured against. This C++ 
overhead didn't make C any slower or C++ any faster than C but 
it made C++ superior in productivity.


This was around 2002/03, when C++ was some 23+ years old.


Games have been made with GC'd languages, 3D games, even. And 
successfully, too.
Minecraft, a very successful one, comes to mind, which is or at 
least was made in Java.

Plenty of games are made in C#, too.

My bet, again, would be productivity and economic concerns. The 
countless hours wasted on debugging memory leaks and cyclic 
dependencies are better spent making the actual game/software.
And smart pointers introduce overhead of their own which makes 
them inferior to C's bare metal raw pointer performance - or 
GC'd pointers for that matter. The culprit being the collection 
cycle.


The best thing about this whole argument, however, is the claim 
that a GC is a no-go, and then with the next breath they pull 
Lua into their games. A scripting language that brings a VM, a 
GC and extraordinarily inflated loading times when the scripts 
are compiled to byte code on the end user's PC, which makes C64 
loading times shine.
The reasoning probably being productivity again, and C++'s 
lunch break compile times.


Using the D compiler as a library, instead of LUA, D code could 
be used for 'scripting', as well, and compiled to native 
machine code. In a snap.


I have no metrics between any AAA game engine and their port to 
D but I do know that I wrote a sound/music player library in 
Java, which folks like you claim impossible because GC, never 
bothered with GC and had no performance issues whatsoever - and 
I don't expect any porting it to D.


And there is EASTL. An STL made by Electronic Arts. Because the 
standard implementation shipped with the compiler is too slow? 
Even though written by C++ wizards?



Slow code is slow and allocating memory in a tight loop is a 
huge performance killer - regardless of language.


Also, why do you feel like a GC is unacceptable for games but 
doesn't matter for your file handling program? Handling dozens, 
maybe thousands, of files sounds like an awful lot of memory 
management involved, and whether e.g. a grep takes 15 seconds to 
do its job or under 1 matters not?


Nothing forces anyone to use the GC, memory can be managed 
manually via malloc/free and you get to do it with scope 
statements/nested functions which makes it nicer than in C. You 
could also implement shared/weak ptr stuff in D - warts and all.
If you need a GC free standard library, I believe there is an 
ongoing effort -or several- at code.dlang.org and probably 
other places.
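The scope-statement style mentioned above looks roughly like this (a minimal sketch of manual malloc/free management):

```d
import core.stdc.stdlib : malloc, free;
import core.stdc.stdio : printf;

@nogc nothrow void main()
{
    auto values = cast(int*) malloc(4 * int.sizeof);
    if (values is null)
        return;
    scope (exit) free(values);  // runs on every exit path, RAII-style

    foreach (i; 0 .. 4)
        values[i] = i * i;
    printf("%d\n", values[3]);  // prints 9
}
```

The scope(exit) statement is what makes this nicer than the equivalent C: the free() is declared next to the malloc() instead of at every return site.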



You said do this and that - GC, etc. - to motivate C++ folks to 
come to D. I say it's an excuse not to use D, and no amount of 
advertising, GC-free Phobos, etc. on the part of the D-Lang 
Foundation and contributors would make these folks switch. They 
would simply find a different excuse.


And where's the usefulness of toy examples like 2 line web 
servers which essentially do nothing?
And how is that helping with getting attention from the game 
devs ?
Putting on the front page a 12 line maze game which can be 
imported from the standard library? Not using the GC?


First, to be clear, I mainly use D as a scripting language for 
file processing, and for this use case, having a GC is a blessing.


You say that garbage collection is not a real problem for game 
development.


Maybe, but that's not my experience. For instance, have you read 
Unity's own official recommandations on how to overcome this 
problem ?


And obviously, Timur, a highly skilled D game engine developer, 
is not a big fan of D's non-incremental garbage collector, 
judging from the number of @nogc he has put in his Dlib 
container code.


Maybe you disagree with us because you are a professional game 
developer who has already released a successful commercial game 
in D without caring about the garbage collection. If that's the 
case, then nice, I'd be happy to be wrong on this :)


And about developing video games in C++, actually most studios 
use orthodox C++. This means no exceptions, no RTTI, few virtual 

Re: I have a plan.. I really DO

2018-07-04 Thread Ecstatic Coder via Digitalmars-d-announce
Throw everything we can this dude's way so we can make D the 
most powerful we can


We need pattern matching, we need typeclasses, we need HKT's, 
we need linear types, we need @nogc Phobos, we need concurrency 
so fearless I can change any variable and not give two shits


Personally I don't really NEED pattern matching, typeclasses, etc

That would be nice, but personally that wouldn't prevent me from 
getting the job done.


But indeed, being able to use D in a GC-free environment (like 
C++ and Rust do) would be something many people may NEED, for 
instance to be able to EASILY use D for soft-realtime 
applications like games.


So being able to add a "-nogc" flag to the DMD compiler and use 
a minimal Phobos-like library (strings, arrays, lists, maps and 
other collections, file system functions, etc) which uses 
EXCLUSIVELY reference counted memory blocks accessed through 
strong/weak references and pointers (T@, T&, T*) would be nice.


Not an implementation like the one in the standard library of 
C++, which is maybe safe but not especially efficient or user 
friendly, but preferably something closer to this :


https://github.com/senselogic/BASE/tree/master/CODE/COUNTED
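Short of new syntax, Phobos already offers a deterministic building block in std.typecons.RefCounted (a sketch of the general idea, not the COUNTED design linked above):

```d
import std.typecons : RefCounted;
import std.stdio : writeln;

struct Score { int points; }

void main()
{
    auto score = RefCounted!Score(3);  // payload heap-allocated, reference counted
    auto copy = score;                 // refcount goes to 2, no deep copy
    writeln(score.points);             // prints 3 (payload reached via alias this)
}   // last reference destroyed here -> payload freed deterministically, no GC pause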

And being able to DIRECTLY use D with REAL Go-like ease of use 
and performance (http components, fibers and channels using both 
concurrency and parallelism) to implement web frameworks and 
online services is also something many people may NEED.




Re: I have a plan.. I really DO

2018-07-03 Thread Ecstatic Coder via Digitalmars-d-announce
D has a very diverse use case so the generalization is moot. 
For example I prefer having the gc manage memory for me...For 
most of the things I do with D...contrary to other opinions.


+1

For most D use cases (including mine, which is file processing), 
D's GC is a blessing, and one of its main advantages over C++, 
IMHO.


And if you want to use D for C++-like use cases where you don't 
want the GC, this generally leads to having to reinvent the 
wheel in order to avoid unwanted garbage collections. For 
instance :


https://github.com/gecko0307/dlib/blob/master/dlib/container

That's why I'm personally in favor of D supporting 
reference-counting based memory management directly in its syntax 
(T@, etc), and also providing the GC-free standard components 
(strings, slices, maps, etc).


PS : BTW kudos to Timur Gafarov, it's a pity so many D developers 
prefer to start developing their own engines instead of helping 
Timur finish Dagon to make it a production-ready game engine 
(adding terrain, UI, networking, etc). Very promising work IMHO !


https://dlang.org/blog/2016/09/16/project-highlight-timur-gafarov/

And having the language help him (native strong/weak references) 
would be nice too :D


Re: I have a plan.. I really DO

2018-07-02 Thread Ecstatic Coder via Digitalmars-d-announce
Let me echo this: transparency has historically been a big 
problem for D.  AFAIK, nobody in the broader community was ever 
told that the D foundation money would be used to fund a bunch 
of Romanian interns, it just happened. In the end, it appears 
to have worked out great, but why would anybody donate without 
being given transparency on where the money was going in the 
first place, when it could have ended badly?


I understand Andrei had connections with that Romanian 
university, but some donor might have had connections with a 
Brazilian or Chinese university that might have worked out even 
better. We'll never explore such connections and alternatives 
without transparency.


The current move to fund some IDE work with Opencollective is 
better in that regard, but with no concrete details on what it 
entails, not significantly better:


https://forum.dlang.org/post/pxwxhhbuburvddnha...@forum.dlang.org

Anyway, I don't use such IDEs, so not a reason for me to donate 
anyway.


Honestly, Dmitry's posts starting this thread are incoherent, 
I'm not sure what he was trying to say. If he feels D users 
should be donating much more, he and others need to make clear 
how that money will be spent.


+1

And maybe it would be a good idea to use a Kickstarter-like 
philosophy to fund D's development with more transparency.


I mean, you should offer a short panel of D enhancement projects, 
with their precise goal, minimum budget and investment time limit 
(for instance one year to reach the required budget), plus an 
ordered list of additional developments if the gathered money 
exceeds the initial budget.


People can then invest in the project(s) that interest them the 
most.


If the minimum budget is not reached after the time limit, the 
investors can get their money back, or decide to invest it in 
another project which hasn't expired.


As you can guess, in my opinion, two possible fundable projects 
would be :


1. a project to add http-related components to D's standard 
library
2. a project to allow D to be REALLY usable without GC, i.e. add 
weak/strong reference to the language and provide a standard 
library which uses them




Re: I have a plan.. I really DO

2018-07-02 Thread Ecstatic Coder via Digitalmars-d-announce

On Monday, 2 July 2018 at 05:20:51 UTC, Joakim wrote:

On Sunday, 1 July 2018 at 15:40:20 UTC, Ecstatic Coder wrote:

On Sunday, 1 July 2018 at 14:01:11 UTC, Jonathan M Davis wrote:
On Sunday, July 01, 2018 13:37:32 Ecstatic Coder via 
Digitalmars-d-announce wrote:

On Sunday, 1 July 2018 at 12:43:53 UTC, Johannes Loher wrote:
> Am 01.07.2018 um 14:12 schrieb Ecstatic Coder:
>> Add a 10-liner "Hello World" web server example on the 
>> main page and that's it.

>
> There already is one in the examples:
>
> #!/usr/bin/env dub
> /+ dub.sdl:
> name "hello_vibed"
> dependency "vibe-d" version="~>0.8.0"
> +/
> void main()
> {
>
> import vibe.d;
> listenHTTP(":8080", (req, res) {
>
> res.writeBody("Hello, World: " ~ req.path);
>
> });
> runApplication();
>
> }

Yeah I know, guess who asked for it...

But the last step, which is including such functionality 
into the standard library , will never happen, because 
nobody here seems to see the point of doing this.


I guess those who made that for Go and Crystal probably did 
it wrong.


What a mistake they made, and they don't even know they made 
a mistake, silly them... ;)


What should and shouldn't go in the standard library for a 
language is something that's up for a lot of debate and is 
likely to often be a point of contention. There is no clear 
right or wrong here. Languages that have had very sparse 
standard libraries have done quite well, and languages that 
have had kitchen sink libraries have done quite well. There 
are pros and cons to both approaches.


- Jonathan M Davis


I agree.

But here I'm just talking of the "public image" of the 
language.


Languages which integrate HTTP-related components in their 
standard library, and advertise that (like Crystal for 
instance), obviously apply a different "marketing" strategy 
than languages which have chosen not to do so.


That's all I say...


Two points:

- Andrei pushed to include vibe.d but it didn't happen.

"There's no web services framework (by this time many folks 
know of D, but of those a shockingly small fraction has even 
heard of vibe.d). I have strongly argued with Sönke to bundle 
vibe.d with dmd over one year ago, and also in this forum. 
There wasn't enough interest."

https://forum.dlang.org/post/nipb14$ldb$1...@digitalmars.com

- As you acknowledge, integration has drawbacks too. I thought 
this was an interesting recent article about how it has now 
hobbled one of the biggest tech companies in the world:


https://stratechery.com/2018/intel-and-the-danger-of-integration/

I don't think the web matters enough these days that it is 
worth bundling, which is why a webassembly port is also not 
worth it for most:


https://www.mobiloud.com/blog/mobile-apps-vs-the-mobile-web/


Instead of trying to integrate vibe.d, which I don't think would 
be a good idea, personally I'd rather suggest taking the 
opportunity to design the interface of those standard 
HTTP-related components from scratch (listener server, fibers, 
channels, etc), independently from vibe.d and with a minimalistic 
mindset, by taking inspiration mainly from Go, along with Crystal 
and vibe.d, even if the implementation will obviously end up 
being very similar to the one of vibe.d.




Re: I have a plan.. I really DO

2018-07-01 Thread Ecstatic Coder via Digitalmars-d-announce

On Sunday, 1 July 2018 at 14:01:11 UTC, Jonathan M Davis wrote:
On Sunday, July 01, 2018 13:37:32 Ecstatic Coder via 
Digitalmars-d-announce wrote:

On Sunday, 1 July 2018 at 12:43:53 UTC, Johannes Loher wrote:
> Am 01.07.2018 um 14:12 schrieb Ecstatic Coder:
>> Add a 10-liner "Hello World" web server example on the main 
>> page and that's it.

>
> There already is one in the examples:
>
> #!/usr/bin/env dub
> /+ dub.sdl:
> name "hello_vibed"
> dependency "vibe-d" version="~>0.8.0"
> +/
> void main()
> {
>
> import vibe.d;
> listenHTTP(":8080", (req, res) {
>
> res.writeBody("Hello, World: " ~ req.path);
>
> });
> runApplication();
>
> }

Yeah I know, guess who asked for it...

But the last step, which is including such functionality in the 
standard library, will never happen, because nobody here seems 
to see the point of doing so.


I guess those who made that choice for Go and Crystal probably 
did it wrong.

What a mistake they made, and they don't even know it, silly 
them... ;)


What should and shouldn't go in the standard library for a 
language is something that's up for a lot of debate and is 
likely to often be a point of contention. There is no clear 
right or wrong here. Languages that have had very sparse 
standard libraries have done quite well, and languages that 
have had kitchen sink libraries have done quite well. There are 
pros and cons to both approaches.


- Jonathan M Davis


I agree.

But here I'm just talking of the "public image" of the language.

Languages which integrate HTTP-related components in their 
standard library, and advertise that fact (like Crystal for 
instance), obviously apply a different "marketing" strategy than 
languages which have chosen not to do so.


That's all I say...

I personally appreciate that my Go and Crystal code is mostly 
based on standard components which are updated along with the 
language, but I agree that vibe.d can perfectly get the job done 
if you prefer to trust third-party libraries for that.





Re: I have a plan.. I really DO

2018-07-01 Thread Ecstatic Coder via Digitalmars-d-announce

On Sunday, 1 July 2018 at 12:43:53 UTC, Johannes Loher wrote:

Am 01.07.2018 um 14:12 schrieb Ecstatic Coder:


Add a 10-liner "Hello World" web server example on the main 
page and that's it.


There already is one in the examples:

#!/usr/bin/env dub
/+ dub.sdl:
name "hello_vibed"
dependency "vibe-d" version="~>0.8.0"
+/
void main()
{
import vibe.d;
listenHTTP(":8080", (req, res) {
res.writeBody("Hello, World: " ~ req.path);
});
runApplication();
}


Yeah I know, guess who asked for it...

But the last step, which is including such functionality in the 
standard library, will never happen, because nobody here seems 
to see the point of doing so.


I guess those who made that choice for Go and Crystal probably 
did it wrong.

What a mistake they made, and they don't even know it, silly 
them... ;)





Re: I have a plan.. I really DO

2018-07-01 Thread Ecstatic Coder via Digitalmars-d-announce

On Sunday, 1 July 2018 at 02:57:26 UTC, RhyS wrote:

On Saturday, 30 June 2018 at 07:11:18 UTC, Joakim wrote:
I'd hope a manager would look at actually meaningful stats 
like downloads, rather than just fluffy stats such as "likes":


http://www.somsubhra.com/github-release-stats/?username=crystal-lang=crystal
http://www.somsubhra.com/github-release-stats/?username=ldc-developers=ldc

I see around 9k total downloads of the various Crystal 0.24 
and 0.25 versions over the last 8 months, compared to 14k 
downloads of the ldc 1.9 compiler alone from two months ago.


It's hard to compare those figures because D and Crystal also 
use package installers on their respective platforms. Going to 
the Crystal download page makes that very clear. That makes 
downloads much harder to track.


D can reach more GitHub downloads thanks to Windows users who do 
not rely on Linux system packages.


D's bugginess between releases also does not help. I have 
probably downloaded LDC and DMD a dozen times in the last 9 
months, being forced to go back to older versions, then trying 
the new versions, then going back. Again and again on Windows.


Downloads do not mean a lot when you cannot keep the people. I 
can swear that I alone am probably responsible for 25+ downloads 
on Windows and dozens on Linux. And every time, D loses me after 
I run into issues.


Crystal's 0.24 release is still working perfectly here. I have 
literally downloaded 2 versions in the last year, 0.23 and 
0.24... That is it. No switching between versions because of 
bugs, package issues or dependency issues. Kind of ironic, but 
maybe because the HTTP server and other packages are built into 
the core, I have no need for external third-party solutions 
like D's vibe.d.


Of course, all these stats can be gamed, but I think it'd be 
hard to argue Crystal is more popular.


code.d
Total 1336 packages found.

crystalshards.xyz
3359 total shards

Track both sites using archive.org and notice that Crystal is 
growing faster in regards to shards than D's packages.

D also has duplicates, something like 6 PostgreSQL driver 
packages, where Crystal has 2 drivers. So D is actually fluffing 
its numbers with a lot of unmaintained duplicates. MySQL too... 
It's not hard to tell that Crystal's shards community is more 
active.


Crystal only recently got the funding for full-time employees 
to work on the code base, so one can expect development to 
accelerate from a mostly community-driven platform. They 
outgross Nim by almost double on average (salt.bountysource.com), 
and that does not include the $2000/month that "84 codes" 
directly donates.


I do not know how much D takes in per month. This has always 
been more obscure, as is the question of who is really paid 
full time to work on D. Walter?


Crystal needs a lot of work, but so does D. You'd expect D to 
have its act together more for a language this old. No default 
HTTP server in this day and age is just really weak sauce. And 
vibe.d breaks plenty of times between its releases and DMD 
releases.


Both have issues, but one has been under development for 4 
years and the other for 16 years. You'd expect D to simply 
outclass Crystal and other languages. Even Rust is 
out-developing D in many areas, thanks in large part to a big 
community.


+1

At the moment, D's default standard library obviously requires a 
garbage collector, and this won't change for a while.


Trying to convince developers to use D instead of C++ is often 
pointless, because most of the time, if you develop something in 
C++ instead of Java/C#/Go/etc, there is a reason for that.


And that reason why they don't use those nice garbage collected 
languages is generally the same reason why they won't use D 
either.


But those who currently use those same garbage collected 
languages (Go/Java/C#/etc) can be convinced to switch to D, 
because D's garbage collector probably won't be a problem for 
them either, as they are already using one in production.


So what remains a mystery for me is that the D leadership 
OBVIOUSLY CAN'T BE CONVINCED that marketing D as a Go/Java/C# 
alternative could be much more efficient than marketing D as a 
C/C++ alternative.


Why are they trying to sell D on its weaknesses, instead of 
selling it on its strengths?


The only thing that D needs to compete on the same ground as Go 
and Crystal is to have similar default HTTP-related libraries, 
which don't rely on third-party packages, for the reasons you 
just explained...


Add a 10-liner "Hello World" web server example on the main page 
and that's it.


And if they REALLY want to ALSO compete with C++, then I strongly 
suggest adding weak and strong references to the syntax (for 
instance T& and T@), and providing an alternative standard 
library which doesn't require garbage collection at all, like 
those of C++ and Rust.


But I think it's quite obvious that the first option (Go-like) 
clearly requires less time and effort than the second (C++-like).





Re: I have a plan.. I really DO

2018-06-30 Thread Ecstatic Coder via Digitalmars-d-announce

On Saturday, 30 June 2018 at 12:59:02 UTC, punkUser wrote:
I don't normally contribute a lot here but as I've been using a 
fair mix of C/C++, D and Rust lately for a variety of projects 
from game dev to web to services, I have a few thoughts.


Without exhaustively comparing the different pros/cons of the 
languages, the most important thing that makes me pick D for a 
project these days is actually vibe.d. It's the perfect balance 
between letting me expose my low level stuff as a network/web 
service easily while not trying to take over too much of my 
application or conversely get me to manually write async 
network state machines. I'd happily argue that its cooperative 
fiber model is actually superior to C#'s, and while it's not 
quite to the level of Go (mostly just because it's not as 
ubiquitously supported in the standard library), I'll still 
happily take the trade-off to use a language closer to C/C++.


Rust's web framework and cooperative fiber story is still just 
forming, and I have some concern they'll go down the C# route 
which while better than nothing, isn't quite as nice as vibe.d 
where any function can effectively be part of a cooperative 
fiber without the need for infectious markup everywhere. Rust's 
syntax is also a fair bit different than C/C++ which makes it 
harder to collaborate with people for the moment, while D's is 
close enough that anyone with a decent amount of C/C++ 
experience can jump in pretty quickly.


In terms of what makes me *not* want to use D, while GC is 
certainly a factor in some uses, in more cases it's actually 
that I want more language and compiler stability. While things 
have calmed down somewhat in the past year the number of times 
a D update has broken code (mine or code in a dependency) and 
left me trying to debug someone else's code deep in a library 
somewhere when I'm trying to just do a small update has been 
far too high. Rust's "stable" branch and their new epochs model 
(where the language can change every few years but critically 
dependencies using different epochs work together) is something 
I would love to be adopted in D.


In any case I just wanted to give the feedback that from my 
point of view the main thing that keeps me coming back to it 
for new projects is vibe.d. Thus I'm in favor of making vibe.d 
a big part of the selling point and design considerations for D 
going forward.


Already tried. Good luck with that... ;)


Re: I have a plan.. I really DO

2018-06-30 Thread Ecstatic Coder via Digitalmars-d-announce

On Saturday, 30 June 2018 at 07:11:18 UTC, Joakim wrote:

On Saturday, 30 June 2018 at 06:52:01 UTC, Ecstatic Coder wrote:

On Friday, 29 June 2018 at 22:59:25 UTC, bachmeier wrote:

On Friday, 29 June 2018 at 20:13:07 UTC, Ecstatic Coder wrote:

Have a look at Crystal's GitHub project, and you will see that 
Crystal, still in development and quite far from its 1.0 
milestone version (despite no parallelism, no Windows support, 
etc.), ALREADY has 11206 stars, 881 forks and 292 contributors 
:


https://github.com/crystal-lang/crystal

Not bad for a language in its 0.25 version and first 
released in June 2014 (4 years), especially compared to D in 
its 2.0 version and first released in December 2001 (16 
years), whose official compiler has 1806 stars, 452 forks 
and 168 contributors :


https://github.com/dlang/dmd

If those numbers mean anything, I think it's that Crystal is 
probably gaining popularity much more quickly than D, and 
honestly, after having tried it, I think that's really 
deserved, even if I agree that there are still many things 
that remain to be implemented before it's really ready for 
an official "production-ready" 1.0 release.


Do you by chance work as a manager? Managers like comparisons 
that involve one number, with a higher number being better. I 
don't know what can be learned about D from that comparison 
and I don't think anyone else does either.


That's your opinion.

First, most managers don't become managers by chance, but 
because of their skills.

Like being able to make the right decisions, based on facts, 
not on personal preferences.

For instance, if a good manager sees that the GitHub project 
of a 4-year-old compiler has been liked by 11206 people, and 
the GitHub project of a 16-year-old compiler has been liked 
by 1806 people, I think he would probably conclude that MANY 
more people are interested in the development of the first 
project than in the second.


I'd hope a manager would look at actually meaningful stats like 
downloads, rather than just fluffy stats such as "likes":


http://www.somsubhra.com/github-release-stats/?username=crystal-lang=crystal
http://www.somsubhra.com/github-release-stats/?username=ldc-developers=ldc

I see around 9k total downloads of the various Crystal 0.24 and 
0.25 versions over the last 8 months, compared to 14k downloads 
of the ldc 1.9 compiler alone from two months ago. Of course, 
all these stats can be gamed, but I think it'd be hard to argue 
Crystal is more popular.


Anyway, if you think that Crystal is not worth our attention, 
that's your right.


But my PERSONAL opinion is that Crystal will soon become a great 
alternative to D, Go and Rust for web server development, while I 
still think that D is BY FAR a much better language than Go or 
Rust.


So now we can try to analyze what makes Crystal a useful and 
popular language in this domain and learn lessons from it, or 
simply ignore it.


Very honestly I don't care, because I exclusively use D as a file 
processing scripting language, and I'm very happy with D in its 
current state.


And to be perfectly clear on that point, its current syntax is 
perfect, very simple and concise, and I DON'T want any change 
made to its current syntax which would make it less simple and 
concise when using it in GC mode.






Re: I have a plan.. I really DO

2018-06-30 Thread Ecstatic Coder via Digitalmars-d-announce

On Saturday, 30 June 2018 at 07:11:18 UTC, Joakim wrote:

On Saturday, 30 June 2018 at 06:52:01 UTC, Ecstatic Coder wrote:

On Friday, 29 June 2018 at 22:59:25 UTC, bachmeier wrote:

On Friday, 29 June 2018 at 20:13:07 UTC, Ecstatic Coder wrote:

Have a look at Crystal's GitHub project, and you will see that 
Crystal, still in development and quite far from its 1.0 
milestone version (despite no parallelism, no Windows support, 
etc.), ALREADY has 11206 stars, 881 forks and 292 contributors 
:


https://github.com/crystal-lang/crystal

Not bad for a language in its 0.25 version and first 
released in June 2014 (4 years), especially compared to D in 
its 2.0 version and first released in December 2001 (16 
years), whose official compiler has 1806 stars, 452 forks 
and 168 contributors :


https://github.com/dlang/dmd

If those numbers mean anything, I think it's that Crystal is 
probably gaining popularity much more quickly than D, and 
honestly, after having tried it, I think that's really 
deserved, even if I agree that there are still many things 
that remain to be implemented before it's really ready for 
an official "production-ready" 1.0 release.


Do you by chance work as a manager? Managers like comparisons 
that involve one number, with a higher number being better. I 
don't know what can be learned about D from that comparison 
and I don't think anyone else does either.


That's your opinion.

First, most managers don't become managers by chance, but 
because of their skills.

Like being able to make the right decisions, based on facts, 
not on personal preferences.

For instance, if a good manager sees that the GitHub project 
of a 4-year-old compiler has been liked by 11206 people, and 
the GitHub project of a 16-year-old compiler has been liked 
by 1806 people, I think he would probably conclude that MANY 
more people are interested in the development of the first 
project than in the second.


I'd hope a manager would look at actually meaningful stats like 
downloads, rather than just fluffy stats such as "likes":


http://www.somsubhra.com/github-release-stats/?username=crystal-lang=crystal
http://www.somsubhra.com/github-release-stats/?username=ldc-developers=ldc

I see around 9k total downloads of the various Crystal 0.24 and 
0.25 versions over the last 8 months, compared to 14k downloads 
of the ldc 1.9 compiler alone from two months ago. Of course, 
all these stats can be gamed, but I think it'd be hard to argue 
Crystal is more popular.


Obviously you haven't read my post.

No problem, I'll repeat it.

I said that Crystal is probably gaining popularity FASTER than D.

I've never said that Crystal is more used than D.

FYI, D is in the top 50 at the TIOBE index, while Crystal is only 
in the top 100.


Of course, you will tell me that these rankings are numbers, and 
that a higher number means nothing. Right?


Re: I have a plan.. I really DO

2018-06-30 Thread Ecstatic Coder via Digitalmars-d-announce

DasBetterC resolves that, though the library issue remains.


Indeed.

Unfortunately, it's often the standard library which makes the 
difference between a nice language and a nice useful language.


D is a great language not only because of the many great 
decisions you made when designing the language (UFCS, slice-based 
strings and arrays, etc), but also because of its great standard 
library, which is well designed and very complete.


To be really useful as a C++ alternative, D still needs another 
standard library based on reference counting instead of garbage 
collection, even if this implies that some class interfaces will 
have to diverge from their GC-based counterparts.


Without that, D will be a bit like a gun without ammunition for 
many developers.
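For what it's worth, Phobos already contains one small building 
block in this direction, std.typecons.RefCounted, although it is 
nowhere near a full reference-counted standard library. A minimal 
sketch of how it behaves:

```d
import std.stdio;
import std.typecons : RefCounted;

void main()
{
    // RefCounted manages its payload with a reference count instead of
    // the GC: the payload is destroyed deterministically when the last
    // copy goes out of scope.
    auto a = RefCounted!int(41);
    auto b = a;                   // count goes up; both share the payload
    b.refCountedPayload += 1;     // mutating through one copy...
    writeln(a.refCountedPayload); // ...is visible through the other
}
```

The limitation the post is pointing at is exactly that this wrapper 
exists in isolation: strings, arrays and maps in Phobos are not 
built on top of it.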



And Rust, despite it has perfect C/C++-like performance


D has perfect C/C++ like performance, if you code it the same 
way.


+1 :)



Re: I have a plan.. I really DO

2018-06-30 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 22:59:25 UTC, bachmeier wrote:

On Friday, 29 June 2018 at 20:13:07 UTC, Ecstatic Coder wrote:

Have a look at Crystal's Github project, you will see that 
Crystal, still in development and quite far from its 1.0 mile 
version (= despite no parallism and windows support, etc) 
ALREADY has 11206 stars, 881 forks and 292 contributors :


https://github.com/crystal-lang/crystal

Not bad for a language in its 0.25 version and first released 
in June 2014 (4 years), especially compared to D in its 2.0 
version and first released in December 2001 (16 years), whose 
official compiler has 1806 stars, 452 forks and 168 
contributors :


https://github.com/dlang/dmd

If those numbers means anything, I think its that Crystal is 
probably getting popularity much quicker than D, and honestly, 
after having tried it, I think it's really deserved, even if I 
agree that there are still many things that remain to be 
implemented before it's really ready for an official 
"production-ready" 1.0 release.


Do you by chance work as a manager? Managers like comparisons 
that involve one number, with a higher number being better. I 
don't know what can be learned about D from that comparison and 
I don't think anyone else does either.


That's your opinion.

First, most managers don't become managers by chance, but 
because of their skills.

Like being able to make the right decisions, based on facts, 
not on personal preferences.

For instance, if a good manager sees that the GitHub project of 
a 4-year-old compiler has been liked by 11206 people, and the 
GitHub project of a 16-year-old compiler has been liked by 1806 
people, I think he would probably conclude that MANY more people 
are interested in the development of the first project than in 
the second.


But if you want to think the opposite, that's perfectly your 
right; I've got no problem with that.


Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 20:51:56 UTC, bauss wrote:

On Friday, 29 June 2018 at 20:13:07 UTC, Ecstatic Coder wrote:

On Friday, 29 June 2018 at 19:46:06 UTC, bauss wrote:

On Friday, 29 June 2018 at 19:42:56 UTC, Ecstatic Coder wrote:

On Friday, 29 June 2018 at 17:09:44 UTC, JN wrote:
On Friday, 29 June 2018 at 08:43:34 UTC, Ecstatic Coder 
wrote:
Once Crystal integrates parallelism (at 1.0), it should 
become de facto one of the best alternatives to Go, Java, 
C#, etc, because it's actually "Go-made-right". For 
instance, its generics system works well, and its type 
inference system natively supports union types.




Except it has no Windows support and doesn't look like it 
will happen anytime soon. Some people might be living in a 
UNIX bubble, but Windows is a big market, and a language 
won't make it big without Windows support.


Right :)

But remember that Crystal is still in its infancy, as it 
hasn't reached its 1.0 version yet.


Parallelism is on its way, and Windows support too...

Don't forget that nowadays many (can I say most?) servers 
are based on Unix variants, so their platform support order 
looks perfectly fine and logical to me.


Actually a large share of servers run Windows Server, and many 
Azure servers run Windows too.


It's not logical to not support both.

D already has that advantage supporting pretty much every 
platform you can think of.


I agree, but you must compare what is comparable.

Have a look at Crystal's GitHub project, and you will see that 
Crystal, still in development and quite far from its 1.0 
milestone version (despite no parallelism, no Windows support, 
etc.), ALREADY has 11206 stars, 881 forks and 292 contributors :


https://github.com/crystal-lang/crystal

Not bad for a language in its 0.25 version and first released 
in June 2014 (4 years), especially compared to D in its 2.0 
version and first released in December 2001 (16 years), whose 
official compiler has 1806 stars, 452 forks and 168 
contributors :


https://github.com/dlang/dmd

If those numbers mean anything, I think it's that Crystal is 
probably gaining popularity much more quickly than D, and 
honestly, after having tried it, I think that's really 
deserved, even if I agree that there are still many things 
that remain to be implemented before it's really ready for 
an official "production-ready" 1.0 release.


Yes. Crystal is a fantastic language already.

As someone who uses many languages, I tend to just use what 
does the task at hand best.


I'm sure I'll be able to find some usage for Crystal when it's 
production ready, but it doesn't mean I'll abandon D. That'll 
probably never happen, especially considering I have a lot of 
projects written in D with thousands of lines of code.


Same for me :)


Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 19:46:06 UTC, bauss wrote:

On Friday, 29 June 2018 at 19:42:56 UTC, Ecstatic Coder wrote:

On Friday, 29 June 2018 at 17:09:44 UTC, JN wrote:

On Friday, 29 June 2018 at 08:43:34 UTC, Ecstatic Coder wrote:
Once Crystal integrates parallelism (at 1.0), it should 
become de facto one of the best alternatives to Go, Java, C#, 
etc, because it's actually "Go-made-right". For instance, 
its generics system works well, and its type inference 
system natively supports union types.




Except it has no Windows support and doesn't look like it 
will happen anytime soon. Some people might be living in a 
UNIX bubble, but Windows is a big market, and a language 
won't make it big without Windows support.


Right :)

But remember that Crystal is still in its infancy, as it 
hasn't reached its 1.0 version yet.


Parallelism is on its way, and Windows support too...

Don't forget that nowadays many (can I say most?) servers are 
based on Unix variants, so their platform support order looks 
perfectly fine and logical to me.


Actually a large share of servers run Windows Server, and many 
Azure servers run Windows too.


It's not logical to not support both.

D already has that advantage supporting pretty much every 
platform you can think of.


I agree, but you must compare what is comparable.

Have a look at Crystal's GitHub project, and you will see that 
Crystal, still in development and quite far from its 1.0 
milestone version (despite no parallelism, no Windows support, 
etc.), ALREADY has 11206 stars, 881 forks and 292 contributors :


https://github.com/crystal-lang/crystal

Not bad for a language in its 0.25 version and first released in 
June 2014 (4 years), especially compared to D in its 2.0 version 
and first released in December 2001 (16 years), whose official 
compiler has 1806 stars, 452 forks and 168 contributors :


https://github.com/dlang/dmd

If those numbers mean anything, I think it's that Crystal is 
probably gaining popularity much more quickly than D, and 
honestly, after having tried it, I think that's really 
deserved, even if I agree that there are still many things 
that remain to be implemented before it's really ready for 
an official "production-ready" 1.0 release.




Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 18:48:19 UTC, bauss wrote:

On Friday, 29 June 2018 at 17:08:12 UTC, Ecstatic Coder wrote:
If you're a web developer with no dependencies, then you're 
either reinventing the wheel (which could cause trouble in 
the long run if your implementations aren't correct), or your 
application just isn't more than a hobby project.


Most enterprise projects will have dependencies outside the 
standard library, and that is true for e.g. Go too.


I agree with you, but what I mean is that all those nice Go 
and Crystal web frameworks are actually implemented using 
exactly the same standard building blocks, so their authors 
didn't have to reinvent the wheel to implement them.


That's why there are so many available frameworks, and you can 
easily pick one which closely matches your needs and 
preferences...


Well you don't really need to re-invent the wheel at all with D 
either tbh.


You would need to with vibe.d, because it's really just the 
skeleton of a web application, but with Diamond? Not so much. 
It supports things that other frameworks don't even support, 
which you will end up implementing yourself anyway in 99% of 
all other frameworks. To give an example, consent, privacy and 
GDPR. There is no framework, at least what I have seen, that 
has compliance for such things implemented, but Diamond has it 
usable straight out of the box. Another example would be 
validation for emails, URLs, various credit cards, files (not 
just the extension, but also whether the data is correct), 
etc. Most such validations are very limited in other 
frameworks, or non-existent at all.


My point is that, even if those languages have HTTP somewhat 
standardized, they do not implement actual features that are 
useful to your business logic, application design etc., only to 
the skeleton.


However with frameworks in D you do get the best of both worlds.

http://diamondmvc.org/


Indeed this framework looks really complete, and should get much 
more promotion from D's official website.


But I still think that D's vision of what should be included in 
the standard library really diverges from that of Go and 
Crystal, even though this strategy has worked pretty well for 
them, and Diamond clearly proves that D has all the basic 
language features to compete well with them (native performance, 
fiber-based concurrency, great string and array support, etc).





Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 17:09:44 UTC, JN wrote:

On Friday, 29 June 2018 at 08:43:34 UTC, Ecstatic Coder wrote:
Once Crystal integrates parallelism (at 1.0), it should become 
de facto one of the best alternatives to Go, Java, C#, etc, 
because it's actually "Go-made-right". For instance, its 
generics system works well, and its type inference system 
natively supports union types.




Except it has no Windows support and doesn't look like it will 
happen anytime soon. Some people might be living in a UNIX 
bubble, but Windows is a big market, and a language won't make 
it big without Windows support.


Right :)

But remember that Crystal is still in its infancy, as it hasn't 
reached its 1.0 version yet.


Parallelism is on its way, and Windows support too...

Don't forget that nowadays many (can I say most?) servers are 
based on Unix variants, so their platform support order looks 
perfectly fine and logical to me.


Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce
If you're a web developer with no dependencies, then you're 
either reinventing the wheel (which could cause trouble in the 
long run if your implementations aren't correct), or your 
application just isn't more than a hobby project.


Most enterprise projects will have dependencies outside the 
standard library, and that is true for e.g. Go too.


I agree with you, but what I mean is that all those nice Go and 
Crystal web frameworks are actually implemented using exactly 
the same standard building blocks, so their authors didn't have 
to reinvent the wheel to implement them.


That's why there are so many available frameworks, and you can 
easily pick one which closely matches your needs and 
preferences...


Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 10:06:12 UTC, bauss wrote:

On Friday, 29 June 2018 at 08:43:34 UTC, Ecstatic Coder wrote:
As you know, I'm convinced that D could be marketed as the 
perfect language to develop native web servers and mobile 
applications, and have its core libraries somewhat extended in 
that direction, like Go and Crystal, which allow "plug'n'play" 
web server development for instance


D allows for plug n' play web server development too.


Then this should be more advertised...

For instance :

https://crystal-lang.org/

The FIRST paragraph of text of Crystal's web page is :

"Syntax

Crystal’s syntax is heavily inspired by Ruby’s, so it feels 
natural to read and easy to write, and has the added benefit of a 
lower learning curve for experienced Ruby devs.


# A very basic HTTP server
require "http/server"

server = HTTP::Server.new do |context|
  context.response.content_type = "text/plain"
  context.response.print "Hello world, got #{context.request.path}!"
end

puts "Listening on http://127.0.0.1:8080"
server.listen(8080)
"

So the FIRST thing you learn about Crystal is that the standard 
library already gives you all you need to program a simple "hello 
world" web server.


The Go standard library is also known to provide the same 
building blocks :


package main

import (
    "fmt"
    "net/http"
)

func main() {
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        fmt.Fprintf(w, "Hello, you've requested: %s\n", r.URL.Path)
    })

    http.ListenAndServe(":80", nil)
}

Both are batteries-included for web development. That's why many 
developers don't feel the need to use third-party frameworks to 
implement their microservices...


So if it's also the case for D, then sorry for my mistake...


Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

Anyway, I try to avoid GC as much as possible.
The main issue for me in game development with D is the 
cross-compilation (e.g. iOS, Windows Universal Platform..).


+1

That's why I don't think C++ will soon be replaced by Rust, D, etc.

Maybe in a few years, but obviously not right now...


Re: 'static foreach' chapter and more

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Tuesday, 26 June 2018 at 01:52:42 UTC, Ali Çehreli wrote:
I've made some online improvements to "Programming in D" since 
September 2017.


  http://ddili.org/ders/d.en/index.html

NOTE: The copies of the book at hard copy printers are not 
updated yet. If you order from Amazon etc. it will still be the 
OLD version. I need some more time to work on that... Also, 
only the PDF electronic format is up-to-date; other ebook 
formats are NOT.


* The code samples are now up-to-date with 2.080.1

* Digit separator (%,) format specifier: 
http://ddili.org/ders/d.en/formatted_output.html#ix_formatted_output.separator


* Stopwatch is moved to module std.datetime.stopwatch

* Replace 'body' with 'do'

* Text file imports (string imports): 
http://ddili.org/ders/d.en/mixin.html#ix_mixin.file%20import


* First assignment to a member is construction (search for that 
text on the page): 
http://ddili.org/ders/d.en/special_functions.html#ix_special_functions.this,%20constructor


* static foreach: 
http://ddili.org/ders/d.en/static_foreach.html#ix_static_foreach.static%20foreach


Ali


Thanks for also providing this book as a free download.

It's THE perfect book, both for people who are learning to 
program for the first time and for experienced developers who are 
just learning the D language!


Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 07:03:52 UTC, Dmitry Olshansky wrote:
I never ever (I think) did something provocative, something to 
finally see:


- who in the community WANTS D language to succeed?

- who are just these funny “people” let’s call th this, that 
are I don’t know “just hang around”


Because shame is a weapon much like fear (of death esp), pride 
can be used as weapon but ehm better shame the bastard...


And so on.

So - until we all understand that these donations are not 
because we are begging fir money.


I will send ~ 10$ each day _specifically_ to see who WANTS D TO 
SUCCED and WILL NOT BE SHAMED LIKE THAT FOR ONCE!


It is because it’s (soon) your last chance to invest into the 
Future.


P.S. I mean what you think the future of native code is??? 
Rust? Crystal?? Nim???


And btw, if D could have its standard libraries and language 
features (strings, arrays, maps, slices, etc) also NATIVELY work 
without GC (i.e. NATIVE weak/strong reference counting), IMHO D 
could perfectly be the future of native code, as it could become 
a better alternative to C++, Rust, Go, etc




Re: I have a plan.. I really DO

2018-06-29 Thread Ecstatic Coder via Digitalmars-d-announce

On Friday, 29 June 2018 at 07:03:52 UTC, Dmitry Olshansky wrote:
I never ever (I think) did something provocative, something to 
finally see:


- who in the community WANTS D language to succeed?

- who are just these funny “people” let’s call th this, that 
are I don’t know “just hang around”


Because shame is a weapon much like fear (of death esp), pride 
can be used as weapon but ehm better shame the bastard...


And so on.

So - until we all understand that these donations are not 
because we are begging fir money.


I will send ~ 10$ each day _specifically_ to see who WANTS D TO 
SUCCED and WILL NOT BE SHAMED LIKE THAT FOR ONCE!


It is because it’s (soon) your last chance to invest into the 
Future.


P.S. I mean what you think the future of native code is??? 
Rust? Crystal?? Nim???


I know most people here don't agree with me, but I think you're 
fighting an already lost battle ;)


As you know, I'm convinced that D could be marketed as the 
perfect language to develop native web servers and mobile 
applications, and have its core libraries somewhat extended in 
that direction, like Go and Crystal, which allow "plug'n'play" 
web server development for instance; but obviously the D 
"leadership" remains convinced that D must be sold as the best 
alternative to C++.


Personally I'm a complete D fan because it is SOOO MUCH better 
than JavaScript/Python/Perl/etc for file processing...


For engine and game development I'm still using C++, even though 
I prefer D, and believe me this won't change for a while.


Game development is a very special use case, but personally I 
don't think that many of those who use C++ for close-to-the-metal 
development would be that interested in switching to D, because 
most of its standard libraries depend on the presence of a GC...


And to answer your question, IMHO the future of native code 
probably remains C++ (not Rust) for system programming, and 
(unfortunately) Go for web development (great ecosystem, db 
drivers, often faster than Java, C#, Dart, etc), even though it 
lacks several core features many developers need (generics, etc).


Once Crystal integrates parallelism (at 1.0), it should de facto 
become one of the best alternatives to Go, Java, C#, etc, because 
it's actually "Go-made-right". For instance, its generics system 
works well, and its type inference system natively supports union 
types.


Nim disqualifies itself because, contrary to D and C# for 
instance, it doesn't manage mutual dependencies automatically for 
you, which is a pity.


As for Rust, although it has perfect C/C++-like performance and 
doesn't need a GC, its borrow checker makes it a hell to use at 
first, since unfortunately Rust hasn't integrated strong/weak 
references as a core feature of the language (Rc/Weak are 
templates, RefCell is needed for mutability, etc), even though 
that's actually what many C++ developers use today for resource 
management, and it would be more than enough for them to get 
their job done once they switch to Rust...




Re: How an Engineering Company Chose to Migrate to D

2018-06-28 Thread Ecstatic Coder via Digitalmars-d-announce
IMHO, implementing an EP-to-D source code converter was 
probably more risky than simply extending an existing Pascal 
compiler in that case.


Risc is in the eye of the beholder ;-)



Indeed :)

But that doesn't mean I'm completely wrong.

I also enjoy A LOT implementing minimalistic transpilers using 
the most simplistic code possible, because implementing manual 
tokenization and parsing using only "baby code" is really all 
that's needed for my small DSLs.


My Github account is literally full of them ;)

So yes, implementing transpilers is incredibly fun and easy.

But implementing full blown compilers too actually.

And the advantage of compilers which generate assembly code is 
that you don't have to fight with the unavoidable limitations of 
the high-level target language.


For instance, I implemented my first "true" compiler in BASIC 
when I was 13 years old, in order to build my first 3D renderer 
and game for my C64 (a simple 3D wireframe tank game using a 
custom 2x4 pixel charset for rendering). I quickly found out that 
it was actually much faster and easier to implement the game in a 
minimalistic BASIC-like language with integrated fixed-point and 
pointer arithmetic, converted into 6502 machine language, than to 
implement it directly in 6502 assembler.


So if at some point you hit a wall with the transpiling approach, 
you should consider trusting me when I say that implementing an 
EP compiler which emits IL code could actually be just a matter 
of months.


Look at the code of this tutorial, which shows how to implement a 
very limited closure-based language (i.e. with local functions 
and variables) in C using just Flex and Bison:


https://github.com/senselogic/COMPILER_TUTORIAL

It was implemented in just a few days, and if you check by 
yourself, you will see that it's 100% baby code...
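To give an idea of what this "baby code" tokenization looks like, here is a minimal sketch in Go (purely illustrative, not taken from the tutorial above): scan the characters one by one, group digits and identifiers, and keep everything else as single-character tokens.

```go
package main

import (
	"fmt"
	"unicode"
)

// tokenize splits a source string into tokens using the simplest
// possible "baby code" approach: one linear scan, no regexes,
// no lexer generator.
func tokenize(text string) []string {
	var tokens []string
	runes := []rune(text)
	for i := 0; i < len(runes); {
		c := runes[i]
		switch {
		case unicode.IsSpace(c):
			// Skip whitespace.
			i++
		case unicode.IsDigit(c):
			// Group consecutive digits into a number token.
			j := i
			for j < len(runes) && unicode.IsDigit(runes[j]) {
				j++
			}
			tokens = append(tokens, string(runes[i:j]))
			i = j
		case unicode.IsLetter(c):
			// Group letters and digits into an identifier token.
			j := i
			for j < len(runes) && (unicode.IsLetter(runes[j]) || unicode.IsDigit(runes[j])) {
				j++
			}
			tokens = append(tokens, string(runes[i:j]))
			i = j
		default:
			// Any other character is a one-character token.
			tokens = append(tokens, string(c))
			i++
		}
	}
	return tokens
}

func main() {
	fmt.Println(tokenize("fib(n - 1) + fib(n - 2)"))
}
```

A real DSL tokenizer would of course also handle strings, comments and multi-character operators, but the structure stays this simple.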


So if you change your mind and decide to implement your own 
extended EP compiler (i.e. with additional modern features), you 
could be astonished by the number of passionate developers who 
could also be interested in this "modern object Pascal" project...


That's the approach they've taken for Crystal, and so far it's 
worked quite well for them...






Re: Any comments about the new Ruby JIT Compiler

2018-06-23 Thread Ecstatic Coder via Digitalmars-d
On Wednesday, 13 June 2018 at 08:21:45 UTC, Martin Tschierschke 
wrote:
The compilation is done by using the C compiler in the 
background.


https://www.ruby-lang.org/en/news/2018/05/31/ruby-2-6-0-preview2-released/

Could D be an better choice for that purpose?


Any comment?


Wrong strategy...

https://blog.codeship.com/an-introduction-to-crystal-fast-as-c-slick-as-ruby/

"This is a naive Fibonacci implementation for Crystal (it’s also 
valid Ruby):


# fib.cr
def fib(n)
  if n <= 1
1
  else
fib(n - 1) + fib(n - 2)
  end
end

puts fib(42)

Let’s run it and see how long it takes!

time crystal fib.cr
433494437
crystal fib.cr  2.45s user 0.33s system 98% cpu 2.833 total

Since this is also valid Ruby, let’s run it with Ruby this time

time ruby fib.cr
433494437
ruby fib.cr  38.49s user 0.12s system 99% cpu 38.718 total

Crystal took 2.833 seconds to complete. Ruby took 38.718 seconds 
to complete. Pretty cool. We get 20x performance for free. What 
if we compile our program with optimizations turned on?


crystal build --release fib.cr

time ./fib
433494437
./fib  1.11s user 0.00s system 99% cpu 1.113 total


1.113 seconds. Now we’re nearly 35 times faster than Ruby."

Obviously this benchmark will have to be updated, but I don't 
know how the Ruby developers will manage to beat Crystal with 
their JIT compilation...
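For one more ahead-of-time-compiled data point, the same naive Fibonacci can be written almost verbatim in Go (a sketch for comparison only; I haven't benchmarked it against the Crystal numbers quoted above):

```go
package main

import "fmt"

// fib is the same naive recursive Fibonacci as the Crystal/Ruby
// snippet above, returning 1 for n <= 1.
func fib(n int) int {
	if n <= 1 {
		return 1
	}
	return fib(n-1) + fib(n-2)
}

func main() {
	fmt.Println(fib(42)) // prints 433494437, like the Crystal version
}
```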





Re: How an Engineering Company Chose to Migrate to D

2018-06-23 Thread Ecstatic Coder via Digitalmars-d-announce
Man, proggit can be savage with the criticism. Every Nim/Rust 
and the one Ada programmer have come out of the woodwork to 
make sure you know their language supports nested functions. 
You've seemingly got to be an expert in every current language 
to write a comparison article that suggests D may have some 
advantages.


I've read the criticisms about the choice of the alternative 
language on the Reddit page, and I think that most of them are 
ultimately quite unfair.


In my programming career, I've already used many strongly-typed 
languages (C, C++, C#, Java, D, Go, Rust, Nim, Crystal, Julia, 
Pascal, etc) for at least one professional or personal project, 
and I'm also convinced that D is a good alternative to EP, 
especially compared to C++, Go and Rust for instance.


Where I disagree with Bastiaan is on the rejection of the Pascal 
language itself, as there are other open-source Pascal compilers 
(GNU Pascal in EP mode) which could have been used and enhanced 
to match the company requirements, while preserving the company 
future for the decades to come.


IMHO, implementing an EP-to-D source code converter was probably 
more risky than simply extending an existing Pascal compiler in 
that case.


Like everybody here, I hope that Bastiaan's efforts will pay off 
in the long term, but I'm not as optimistic as many here that 
this will end up as a success story, as I'm not sure that his 
teammates will really enjoy working on the automatically 
generated D code as much as on the original source code...





Textual database designer (Basil 2.0)

2018-06-16 Thread Ecstatic Coder via Digitalmars-d-announce
For those interested, Basil, my textual database designer, can 
now export database schemas in SQL, CQL, Go and Crystal format, 
and their fake data in SQL and CQL format.


I've slightly changed the syntax so that the table columns can 
use any combination of scalar types, foreign keys, tuples, maps, 
lists and sets.


You can download it here:

https://github.com/senselogic/BASIL

Here are two sample scripts illustrating the new syntax:

BLOG | count 5

SECTION

Id : UINT64 | key, unique, incremented
Number : UINT64
Name : STRING | capacity 45
Text : STRING
Image : STRING | capacity 45

ImageIndex : UINT64 | !stored

USER

Id : UINT64 | key, unique, incremented
FirstName : STRING | capacity 45
LastName : STRING | capacity 45
Email : STRING | capacity 45
Pseudonym : STRING | capacity 45
Password : STRING | capacity 45
Phone : STRING | capacity 45
Street : STRING
City : STRING | capacity 45
Code : STRING | capacity 45
Region : STRING | capacity 45
Country : STRING | capacity 45
Company : STRING | capacity 45
ItIsAdministrator : BOOL

ARTICLE | count 15

Id : UINT64 | key, unique, incremented
SectionId : SECTION.Id | partitioned
UserId : USER.Id | clustered
Title : STRING
Text : STRING
Image : STRING | capacity 45
Date : DATE

Section : POINTER[ SECTION ] | !stored
User : POINTER[ USER ] | !stored
ImageIndex : UINT64 | !stored

COMMENT | count 30

Id : UINT64 | key, unique, incremented
ArticleId : ARTICLE.Id | indexed
UserId : USER.Id | indexed
Text : STRING | english 2 4 5 7
DateTime : DATETIME

Article : POINTER[ ARTICLE ] | !stored
User : POINTER[ USER ] | !stored

SUBSCRIBER

Id : UINT64 | key, unique, incremented
Name : STRING | capacity 45
Email : STRING | capacity 45

TEST | count 10

SIMPLE

Uuid : UUID | key, unique
Bool : BOOL | partitioned
Int8 : INT8 | clustered
Uint8 : UINT8 | indexed
Int16 : INT16
Uint16 : UINT16
Int32 : INT32
Uint32 : UINT32
Int64 : INT64
Uint64 : UINT64
Float32 : FLOAT32
Float64 : FLOAT64
String : STRING
Date : DATE | unique
DateTime : DATETIME
Blob : BLOB

COMPOUND

Id : INT32 | key, unique, incremented
Location : Country : STRING | uppercase
Name : TUPLE[ FirstName : STRING, LastName : STRING ] | unique
NameSet : SET[ TUPLE[ FirstName : STRING, LastName : STRING ] ] | count 2
CompanyMap : MAP[ Phone : STRING, Company : STRING ] | count 2

EmailSet : SET[ Email : STRING ] | count 2
PhoneList : LIST[ Phone : STRING ] | count 2
SimpleDate : SIMPLE.Date
SimpleDateMap : MAP[ COMPOUND.Name, SIMPLE.Date ] | count 2

SimpleDateSet : SET[ SIMPLE.Date ] | count 2
SimpleDateList : LIST[ SIMPLE.Date ] | count 1 3
NameSetMap : MAP[ SIMPLE.Date, COMPOUND.NameSet ] | count 2

SimplePointerArray : ARRAY[ POINTER[ SIMPLE ] ] | !stored
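To give an idea of the Go export mentioned above, a table like SECTION could plausibly map to a Go struct along these lines (a hypothetical sketch based only on the schema; the actual BASIL output format may differ):

```go
package main

import "fmt"

// Section is a hypothetical Go type for the SECTION table above;
// field names and types follow the schema, but BASIL's real
// generated code may look different.
type Section struct {
	Id     uint64
	Number uint64
	Name   string // capacity 45 in the schema
	Text   string
	Image  string // capacity 45 in the schema

	ImageIndex uint64 // marked !stored: runtime-only, not persisted
}

func main() {
	section := Section{Id: 1, Name: "News"}
	fmt.Println(section.Id, section.Name)
}
```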


