Re: D Language Foundation Monthly Meeting Summary (September 24, 2021)

2022-01-13 Thread Mike Parker via Digitalmars-d-announce

On Friday, 14 January 2022 at 06:12:51 UTC, Konstantin wrote:



Hello, Max!
Is there any news or are there any estimates about the roadmap?


I posted a note about it in a meeting summary or a blog post 
(can't remember where) a few weeks ago. But the short of it: in 
the process of revising it, I realized it needs a complete 
rewrite. The document we ended up with isn't what we said we 
wanted.


The rewrite will be a high priority for me once I wrap up the 
DConf Online Q & A videos and get the DIP queue moving again. 
I'll need more input from other people, and the time that takes 
is never predictable, but I intend to get it published in the 
first quarter of this year.


And for the record, it's *not* a roadmap (assuming roadmap means 
a step-by-step plan for language evolution). It's broader than 
that. The intent is to outline Walter's and Atila's current 
focus, their longer-term goals, and areas where contributors can 
direct their efforts. It's a living document that will evolve as 
priorities change.


Re: Why I Like D

2022-01-13 Thread Araq via Digitalmars-d-announce

On Friday, 14 January 2022 at 02:13:48 UTC, H. S. Teoh wrote:
It takes 10x the effort to write a shell-script substitute in 
C++ because at every turn the language works against me -- I 
can't avoid dealing with memory management issues at every turn 
-- should I use malloc/free and fix leaks / dangling pointers 
myself? Should I use std::auto_ptr? Should I use 
std::shared_ptr? Write my own refcounted pointer for the 15th 
time?  Half my APIs would be cluttered with memory management 
paraphernalia, and half my mental energy would be spent 
fiddling with pointers instead of MAKING PROGRESS IN MY PROBLEM 
DOMAIN.


With D, I can work at the high level and solve my problem long 
before I even finish writing the same code in C++.


Well, C++ ships with unique_ptr and shared_ptr, so you don't have 
to roll your own. And you can use them and be assured that the 
performance profile of your program won't suddenly collapse when 
the data/heap grows too big, since the cost of these tools is 
independent of heap size. (What does D's GC assure you? That 
it won't run if you don't use it? That's such a low bar...)


Plus, with D you cannot really work at the "high level" at all; 
it is full of friction. Is this data const? Or immutable? Is this 
@safe? @system? Should I use @nogc? Are exceptions still a good 
idea? Should I use interfaces or inheritance? Should I use class 
or struct? Pointers or inout? There are many languages where it's 
much easier to focus on the PROBLEM DOMAIN. Especially if the 
domain is "shell-script substitute".


Re: D Language Foundation Monthly Meeting Summary (September 24, 2021)

2022-01-13 Thread Konstantin via Digitalmars-d-announce

On Friday, 1 October 2021 at 23:53:46 UTC, max haughton wrote:

On Friday, 1 October 2021 at 21:48:23 UTC, Konstantin wrote:

On Friday, 1 October 2021 at 12:32:20 UTC, Mike Parker wrote:

Attendees:

Andrei Alexandrescu
Walter Bright
Iain Buclaw
Ali Çehreli
Max Haughton
Martin Kinkelin
Mathias Lang
Razvan Nitu
Mike Parker

[...]


Offtopic:
Are there any plans to publish the roadmap for the language 
and stdlib development on wiki or elsewhere?


Mike is editing it at the moment. It will probably go into the 
foundation Ddoc sources. (Not a huge fan of the wiki, since it's 
not tracked in git.)


Hello, Max!
Is there any news or are there any estimates about the roadmap?


Re: Why I Like D

2022-01-13 Thread forkit via Digitalmars-d-announce

On Friday, 14 January 2022 at 02:13:48 UTC, H. S. Teoh wrote:


How is using D "losing autonomy"?  Unlike Java, D does not 
force you to use anything. You can write all-out GC code, you 
can write @nogc code (slap it on main() and your entire program 
will be guaranteed to be GC-free -- statically verified by the 
compiler). You can write functional-style code, and, thanks to 
metaprogramming, you can even use more obscure paradigms like 
declarative programming.




I'm talking about the 'perception of autonomy' - which will 
differ between people. Actual autonomy does not, and cannot, 
exist.


I agree that if a C++ programmer wants the autonomy of choosing 
between GC or not in their code, then they really don't have 
that autonomy in C++ (well, of course they do actually - but some 
hoops need to be jumped through).


My point is, that a C#/Java programmer is more likely to be 
attracted to D, because D creates a perception of there being 
more autonomy (than that provided by C#/Java).


I'm not saying it's the only thing people consider. Obviously 
their choice also reflects the needs of their problem domain, 
their existing skill set, the skill set of those they work with, 
the tools they need, the extent to which their identity is 
attached to a language or community, and so on.


And I'm just talking about probability - that is, people are more 
likely to be attracted to something new, something that could 
benefit them, if it also enhances their perception of autonomy, 
or at least does not seek to diminish their existing autonomy 
(e.g. a C programmer might well be attracted to betterC, for 
example).


D should really focus more on marketing one of its biggest 
strengths - increased autonomy (well, the perception of it).


Getting back to the subject of this thread, that's why 'I' like D.





Re: Why I Like D

2022-01-13 Thread H. S. Teoh via Digitalmars-d-announce
On Fri, Jan 14, 2022 at 01:19:01AM +, forkit via Digitalmars-d-announce 
wrote:
[...]
> C provides even greater autonomy over both C++ and D. And I'd argue,
> that's why C remains so useful, and so popular (for those problems
> where such a level of autonomy is needed).
> 
> By, 'autonomy', I mean a language provided means, for choosing what
> code can do, and how it does it.
[...]
> An aversion to losing that autonomy, I believe, is a very real reason
> as to why larger numbers of C++ programmers do not even consider
> switching to D.

How is using D "losing autonomy"?  Unlike Java, D does not force you to
use anything. You can write all-out GC code, you can write @nogc code
(slap it on main() and your entire program will be guaranteed to be
GC-free -- statically verified by the compiler). You can write
functional-style code, and, thanks to metaprogramming, you can even use
more obscure paradigms like declarative programming.

If anything, D makes it *easier* to have "autonomy", because its
metaprogramming capabilities let you do so without contorting syntax or
writing unmaintainable write-only code.  I can theoretically do
everything in C++ that I do in D, for example, but C++ requires that I
spend 5x the amount of effort to navigate its minefield of language
gotchas (and then 50x the effort to debug the resulting mess), and
afterwards I have to visit the optometrist due to staring at unreadable
syntax for extended periods of time.

In D, I get to choose how low-level I want to go -- if all I need is a
one-off shell script substitute, I can just allocate away and the GC
will worry about cleaning after me.  If I need to squeeze out more
performance, I run the profiler and identify GC hotspots and fix them
(or discover that the GC doesn't even affect performance, and redirect
my efforts elsewhere, where it actually matters more).  If that's not
enough, GC.disable and GC.collect lets me control how the GC behaves.
If that's still not enough, I slap @nogc on my inner loops and pull out
malloc/free.
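
That escalation path can be sketched in a few lines of D. This is 
a minimal illustration, not code from the post; the function and 
buffer names are made up:

```d
import core.memory : GC;
import std.stdio : writeln;

// @nogc: the compiler statically rejects any GC allocation in here,
// so this is safe for a performance-critical inner loop.
@nogc void hotLoop(int[] buf)
{
    foreach (ref x; buf)
        x *= 2;
}

void main()
{
    auto data = new int[](1024); // ordinary GC allocation, no ceremony

    GC.disable();                // no collections while this is in effect
    hotLoop(data);
    GC.enable();

    GC.collect();                // explicit collection at a convenient point
    writeln(data[0]);
}
```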

In C++, I'm guaranteed that there is no GC -- even when having a GC
might actually help me achieve what I want.  In order to reap the
benefits of a GC in C++, I have to jump through *tons* of hoops --
install a 3rd party GC, carefully read the docs to avoid doing things
that might break it ('cos it's not language-supported), be excluded from
using 3rd party libraries that are not compatible with the GC, etc..
Definitely NOT worth the effort for one-off shell script replacements.
It takes 10x the effort to write a shell-script substitute in C++
because at every turn the language works against me -- I can't avoid
dealing with memory management issues at every turn -- should I use
malloc/free and fix leaks / dangling pointers myself? Should I use
std::auto_ptr? Should I use std::shared_ptr? Write my own refcounted
pointer for the 15th time?  Half my APIs would be cluttered with memory
management paraphernalia, and half my mental energy would be spent
fiddling with pointers instead of MAKING PROGRESS IN MY PROBLEM DOMAIN.

With D, I can work at the high level and solve my problem long before I
even finish writing the same code in C++.  And when I need to dig under
the hood, D doesn't stop me -- it's perfectly fine with malloc/free and
other such alternatives.  Even if I can't use parts of Phobos because of
GC dependence, D gives me the tools to roll my own easily. (It's not as
if I don't already have to do it myself in C++ anyway -- and D is a
nicer language for it; I can generally get it done faster in D.)

Rather than take away "autonomy", D empowers me to choose whether I want
to do things manually or use the premade high-level niceties the
language affords me. (*And* D lets me mix high-level and low-level code
in the same language. I can even drop down to asm{} blocks if that's
what it takes. Now *that's* empowerment.) With C++, I HAVE to do
everything manually. It's actually less choice than D affords me.


T

-- 
People tell me I'm stubborn, but I refuse to accept it!


Re: Why I Like D

2022-01-13 Thread forkit via Digitalmars-d-announce

On Thursday, 13 January 2022 at 21:32:15 UTC, Paul Backus wrote:


Actually, I think *self*-government has very little to do with 
it.




I'm not so sure.

Presumably, C++ provides a programmer with much greater autonomy 
over their code than D?


C provides even greater autonomy over both C++ and D. And I'd 
argue, that's why C remains so useful, and so popular (for those 
problems where such a level of autonomy is needed).


By 'autonomy', I mean a language-provided means for choosing 
what code can do, and how it does it.


A language that makes you jump through hoops to get that 
autonomy will serve a niche purpose (like Java, for example).


An aversion to losing that autonomy, I believe, is a very real 
reason as to why larger numbers of C++ programmers do not even 
consider switching to D.


Of course, even if they did consider D, there are other 
considerations at play as well.


I think this is also why D (in contrast to C++) tends to attract 
programmers from the C# and Java world. That is, D provides 
greater autonomy (which should translate to greater freedom to 
innovate and be creative with code).


Of course autonomy is not something that is real.

Only the 'perception of autonomy' can be real ;-)



Re: Why I Like D

2022-01-13 Thread H. S. Teoh via Digitalmars-d-announce
On Thu, Jan 13, 2022 at 09:32:15PM +, Paul Backus via 
Digitalmars-d-announce wrote:
> On Wednesday, 12 January 2022 at 20:48:39 UTC, forkit wrote:
[...]
> > Programmers want the right of self-government, over their code.
> 
> Actually, I think *self*-government has very little to do with it.
[...]
> So, why do so many programmers reject D? Because there's something
> else they care about more than their own autonomy: other programmers'
> *lack* of autonomy. Or, as it's usually put, "the ecosystem."
[...]
> Suppose you've already decided that you don't want to use a GC, and
> you also don't want to write every part of your project from
> scratch--that is, you would like to depend on existing libraries.
> Where would you rather search for those libraries: code.dlang.org, or
> crates.io? Who would you want the authors of those libraries to be:
> self-governing, autonomous programmers, who are free to use GC as much
> or as little as they like; or programmers who have chosen to give up
> that autonomy and limit themselves to *never* using GC?

This reminds me of the Lisp Curse: the language is so powerful that
everyone can easily write their own [GUI toolkit] (insert favorite
example library here).  As a result, everyone invents their own
solution, all solving more-or-less the same problem, but just
differently enough to be incompatible with each other. And since they're
all DIY solutions, they each suffer from a different set of
shortcomings.  As a result, there's a proliferation of [GUI toolkits],
but none of them have a full feature set, most are in various states of
(in)completion, and all are incompatible with each other.

For the newcomer, there's a bewildering abundance of choices, but none
of them really solves his particular use-case (because none of the
preceding authors faced his specific problem).  As a result, his only
choices are to arbitrarily choose one solution and live with its
problems, or reinvent his own solution. (Or give up and go back to Java.
:-D)

Sounds familiar? :-P


T

-- 
Democracy: The triumph of popularity over principle. -- C.Bond


Re: Why I Like D

2022-01-13 Thread Ola Fosheim Grøstad via Digitalmars-d-announce

On Thursday, 13 January 2022 at 21:32:15 UTC, Paul Backus wrote:
As you correctly observe, D is a great language for programmers 
who want autonomy--far better than something like Java, Go, or 
Rust, which impose relatively strict top-down visions of how 
code ought to be written.


I keep seeing people in forum threads claiming that Rust is not a 
system level language, but a high level language (that poses as 
system level).


With the exception of exceptions (pun?), C++ is pretty much an 
add-on language: you enable the stuff you need. The default is 
rather limited. I personally always enable g++ extensions. And 
having to deal with exceptions when using the system library is a 
point of contention. They should have been an add-on for C++ to 
fulfil the system-level vision.


C is very much bare-bones, but you have different compilers that 
"add on" things you might need for particular niches. Which of 
course is also why the bit widths are platform-dependent. By 
being bare-bones, C is to a large extent extended by add-ons in 
terms of macros and assembly routines for specific platforms.


This modular add-on aspect is essential for system level 
programming as the contexts are very different (hardware, OS, 
usage, correctness requirements etc).


In hardcore system-level programming the ecosystem actually 
isn't all that critical. Platform support is important. Cross-
platform is important. One singular domain-specific framework 
might be important. But you will to a large extent end up writing 
your own libraries.





Re: Why I Like D

2022-01-13 Thread Paul Backus via Digitalmars-d-announce

On Wednesday, 12 January 2022 at 20:48:39 UTC, forkit wrote:
Fear of GC is just a catch-all-phrase that serves no real 
purpose, and provides no real insight into what programmers are 
thinking.


It's all about autonomy and self-government (on the decision of 
whether to use GC or not, or when to use it and when not to 
use it).


Programmers want the right of self-government, over their code.


Actually, I think *self*-government has very little to do with it.

As you correctly observe, D is a great language for programmers 
who want autonomy--far better than something like Java, Go, or 
Rust, which impose relatively strict top-down visions of how code 
ought to be written. In D, you can write C-style procedural code, 
Java-style object-oriented code, or (with a bit of effort) even 
ML-style functional code. You can use a GC, or you can avoid it. 
You can take advantage of built-in memory-safety checking, or you 
can ignore it. If what programmers care about is autonomy, it 
seems like D should be the ideal choice.


So, why do so many programmers reject D? Because there's 
something else they care about more than their own autonomy: 
other programmers' *lack* of autonomy. Or, as it's usually put, 
"the ecosystem."


If you go to crates.io and download a Rust library, you can be 
almost 100% sure that library will not use GC, because Rust 
doesn't have a GC. If you go to pkg.go.dev and download a Go 
library, you can be almost 100% sure that library *will* use GC, 
because Go *does* have a GC.


On the other hand, if you go to code.dlang.org and download a D 
library...well, who knows? Maybe it'll use the GC, and maybe it 
won't. The only way to tell is to look at that specific library's 
documentation (or its source code).


Suppose you've already decided that you don't want to use a GC, 
and you also don't want to write every part of your project from 
scratch--that is, you would like to depend on existing libraries. 
Where would you rather search for those libraries: 
code.dlang.org, or crates.io? Who would you want the authors of 
those libraries to be: self-governing, autonomous programmers, 
who are free to use GC as much or as little as they like; or 
programmers who have chosen to give up that autonomy and limit 
themselves to *never* using GC?


If you're working on a project as a solo developer, autonomy is 
great. But if you're working as part of a team, you don't want 
every team member to be fully autonomous--you want some kind of 
guidance and leadership to make sure everyone is moving in the 
same direction.


In a business setting, that leadership comes from your boss. But 
in an open-source community, there is no boss. In open source, 
the only source of leadership and guidance is *the language 
itself*. If you want to make sure other programmers in your 
community--your "team"--all agree to not use a GC, the only way 
you can do that is by choosing a language where GC isn't even an 
option.


Re: Why I Like D

2022-01-13 Thread forkit via Digitalmars-d-announce

On Thursday, 13 January 2022 at 11:30:40 UTC, zjh wrote:

On Thursday, 13 January 2022 at 03:10:14 UTC, zjh wrote:

I'm a `GC phobia`.


"A phobia is an irrational fear of something that's unlikely to 
cause harm."


"A phobia is a type of anxiety disorder defined by a persistent 
and excessive fear of an object or situation."


"A phobia is an excessive and irrational fear reaction."

" phobias .. are maladaptive fear response"


plz... go get some help ;-)


Re: LDC 1.28.1

2022-01-13 Thread H. S. Teoh via Digitalmars-d-announce
On Thu, Jan 13, 2022 at 03:51:07PM +, kinke via Digitalmars-d-announce 
wrote:
> A new patch version was just released:
> 
> * Based on D 2.098.1+ (stable from 2 days ago).

Big thanks to the LDC team for continuing to deliver one of the best D
compilers around!


T

-- 
The government pretends to pay us a salary, and we pretend to work.


Re: Why I Like D

2022-01-13 Thread H. S. Teoh via Digitalmars-d-announce
On Thu, Jan 13, 2022 at 10:21:12AM +, Stanislav Blinov via 
Digitalmars-d-announce wrote:
[...]
> Oh there is a psychological barrier for sure. On both sides of the,
> uh, "argument". I've said this before but I can repeat it again: time
> it. 4 milliseconds. That's how long a single GC.collect() takes on my
> machine.  That's a quarter of a frame. And that's a dry run. Doesn't
> matter if you can GC.disable or not, eventually you'll have to
> collect, so you're paying that cost (more, actually, since that's not
> going to be a dry run). If you can afford that - you can befriend the
> GC. If not - GC goes out the window.

?? That was exactly my point. If you can't afford it, you use @nogc.
That's what it's there for!

And no, if you don't GC-allocate, you won't eventually have to collect
'cos there'd be nothing to collect. Nobody says you HAVE to use the GC.
You use it when it fits your case; when it doesn't, you GC.disable or
write @nogc, and manage your own allocations, e.g., with an arena
allocator, etc..

Outside of your game loop you can still use GC allocations freely. You
just collect before entering the main loop, then GC.disable or just
enter @nogc code. You can even use GC memory to pre-allocate your arena
allocator buffers, then run your own allocator on top of that. E.g.,
allocate a 500MB buffer (or however big you need it to be) before the
main loop, then inside the main loop a per-frame arena allocator hands
out pointers into this buffer. At the end of the frame, reset the
pointer. That's a single-instruction collection.  After you exit your
main loop, call GC.collect to collect the buffer itself.
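
The per-frame arena described above might look something like the 
following sketch. The FrameArena type and the buffer size are 
hypothetical, made up for illustration:

```d
import core.memory : GC;

// A hypothetical per-frame bump allocator over one preallocated buffer.
struct FrameArena
{
    ubyte[] buffer;
    size_t used;

    void[] alloc(size_t size)
    {
        assert(used + size <= buffer.length, "arena exhausted");
        auto p = buffer[used .. used + size];
        used += size;
        return p;
    }

    // "Collection" is a single pointer reset.
    void reset() { used = 0; }
}

void main()
{
    // GC-allocate the backing buffer once, before the main loop.
    auto arena = FrameArena(new ubyte[](500 * 1024 * 1024));
    GC.disable();                         // no collections inside the loop

    foreach (frame; 0 .. 3)               // stand-in for the game loop
    {
        auto scratch = arena.alloc(4096); // per-frame allocations
        // ... use scratch ...
        arena.reset();                    // end of frame: free it all at once
    }

    GC.enable();
    GC.collect();                         // reclaim the buffer itself afterwards
}
```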

This isn't Java where every allocation must come from the GC. D lets you
work with raw pointers for a reason.


> In other words, it's only acceptable if you have natural pauses
> (loading screens, transitions, etc.) with limited resource consumption
> between them OR if you can afford to e.g. halve your FPS for a while.
> The alternative is to collect every frame, which means sacrificing a
> quarter of runtime. No, thanks.

Nobody says you HAVE to use the GC in your main loop.


> Thing is, "limited resource consumption" means you're preallocating
> anyway, at which point one has to question why use the GC in the first
> place.

You don't have to use the GC. You can malloc your preallocated buffers.
Or GC-allocate them but call GC.disable before entering your main loop.


> The majority of garbage created per frame can be trivially
> allocated from an arena and "deallocated" in one `mov` instruction (or
> a few of them). And things that can't be allocated in an arena, i.e.
> things with destructors - you *can't* reliably delegate to the GC
> anyway - which means your persistent state is more likely to be
> manually managed.
[...]

Of course. So don't use the GC for those things. That's all. The GC is
still useful for things outside the main loop, e.g., setup code, loading
resources in between levels, etc..  The good thing about D is that you
*can* make this choice.  It's not like Java where you're forced to use
the GC whether you like it or not.  There's no reason to clamor to
*remove* the GC from D, like some appear to be arguing for.


T

-- 
The only difference between male factor and malefactor is just a little 
emptiness inside.


Re: Why I Like D

2022-01-13 Thread Ola Fosheim Grøstad via Digitalmars-d-announce

On Thursday, 13 January 2022 at 16:33:59 UTC, Paulo Pinto wrote:
ARC, tracing GC, whatever - but make up your mind; otherwise, 
other languages that know what they want to be will get the 
spotlight with such vendors.


Go has a concurrent collector, so I would assume it is reasonably 
well-behaved in regards to other system components (e.g. does 
not sporadically saturate the data bus for a long time). Go's 
runtime also appears to be fairly limited, so it does not 
surprise me that people want to use it on microcontrollers.


We had some people in these forums who were interested in using D 
for embedded, but they seemed to give up as modifying the runtime 
was more work than it was worth for them. That is at least my 
interpretation of what they stated when they left.


So well, D has not made a point of capturing embedded programmers 
in the past, and there are no plans for a strategic change in 
that regard AFAIK.





Re: Why I Like D

2022-01-13 Thread Paulo Pinto via Digitalmars-d-announce
On Thursday, 13 January 2022 at 15:44:33 UTC, Ola Fosheim Grøstad 
wrote:
On Thursday, 13 January 2022 at 10:21:12 UTC, Stanislav Blinov 
wrote:
TLDR: it's pointless to lament on irrelevant trivia. Time it! 
Any counter-arguments from either side are pointless without 
that.


"Time it" isn't really useful for someone starting on a 
project, as it is too late when you have something worth 
measuring. The reason for this is that it gets worse and worse 
as your application grows. Then you end up either giving up on 
the project or going through a very expensive and bug prone 
rewrite. There is no trivial upgrade path for code relying on 
the D GC.


And quite frankly, 4 ms is not a realistic worst-case scenario 
for the D GC. You have to wait for all threads to stop on the 
worst possible OS/old-budget-hardware/program state 
configuration.


It is better to start with a solution that is known to scale 
well if you are writing highly interactive applications. For D 
that could be ARC.


Just leaving this here from a little well known company.

https://developer.arm.com/solutions/internet-of-things/languages-and-libraries/go

ARC, tracing GC, whatever - but make up your mind; otherwise, 
other languages that know what they want to be will get the 
spotlight with such vendors.


Re: Why I Like D

2022-01-13 Thread Ola Fosheim Grøstad via Digitalmars-d-announce

On Thursday, 13 January 2022 at 11:57:41 UTC, Araq wrote:
But the time it takes depends on the number of threads it has 
to stop and the amount of live memory on your heap. If it took 
4ms regardless of these factors it wouldn't be bad, but that's 
not how D's GC works...


Sadly fast scanning is still bad, unless you are on an 
architecture where you can scan without touching the caches. If 
you burst through gigabytes of memory then you have a negative 
effect on real time threads that expect lookup tables to be in 
the caches. That means you need more headroom in real time 
threads, so you sacrifice the quality of work done by real time 
threads by saturating the memory data bus.


It would be better to have a concurrent collector that slowly 
crawls, or just take the predictable overhead of ARC, which is 
distributed fairly evenly in time (unless you do something silly).




LDC 1.28.1

2022-01-13 Thread kinke via Digitalmars-d-announce

A new patch version was just released:

* Based on D 2.098.1+ (stable from 2 days ago).
* Linux x86[_64]: Important fix with statically linked druntime 
and lld/bfd linkers. [lld 13 came with a deadly breaking change 
and doesn't work, older versions do - stay tuned for LDC v1.29 
for further improved linker support.]

* New UDAs: `@hidden` and `@noSanitize`.
* Ability to access magic linker symbols on Mac via special 
mangle prefix.
* WebAssembly: Tweaked default lld linker flags for less trouble 
wrt. stack overflows. Thanks @ryuukk!

* Support `rdtscp` in DMD-style inline assembly. Thanks Max!
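
Assuming the new UDAs live alongside LDC's existing attributes in 
`ldc.attributes`, usage might look roughly like this. This is a 
sketch, not taken from the release notes; the exact import path 
and signatures are assumptions:

```d
version (LDC)
{
    import ldc.attributes : hidden, noSanitize;

    // Assumed: @hidden gives the symbol hidden visibility,
    // so it is not exported from the shared library.
    @hidden void internalHelper() {}

    // Assumed: @noSanitize skips the named sanitizer's
    // instrumentation for this function.
    @noSanitize("address") void trustedHotPath() {}
}
```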

Full release log and downloads: 
https://github.com/ldc-developers/ldc/releases/tag/v1.28.1


Thanks to all contributors & sponsors!


PS: The original plan was to release this just before Christmas 
vacation, but a high-impact Nullable regression in Phobos 
v2.098.1 prevented that.




Re: Why I Like D

2022-01-13 Thread Ola Fosheim Grøstad via Digitalmars-d-announce
On Thursday, 13 January 2022 at 10:21:12 UTC, Stanislav Blinov 
wrote:
TLDR: it's pointless to lament on irrelevant trivia. Time it! 
Any counter-arguments from either side are pointless without 
that.


"Time it" isn't really useful for someone starting on a project, 
as it is too late when you have something worth measuring. The 
reason for this is that it gets worse and worse as your 
application grows. Then you end up either giving up on the 
project or going through a very expensive and bug prone rewrite. 
There is no trivial upgrade path for code relying on the D GC.


And quite frankly, 4 ms is not a realistic worst-case scenario 
for the D GC. You have to wait for all threads to stop on the 
worst possible OS/old-budget-hardware/program state configuration.


It is better to start with a solution that is known to scale well 
if you are writing highly interactive applications. For D that 
could be ARC.







Re: Why I Like D

2022-01-13 Thread Paulo Pinto via Digitalmars-d-announce
On Thursday, 13 January 2022 at 10:21:12 UTC, Stanislav Blinov 
wrote:

On Wednesday, 12 January 2022 at 16:17:02 UTC, H. S. Teoh wrote:

[...]


Oh there is a psychological barrier for sure. On both sides of 
the, uh, "argument". I've said this before but I can repeat it 
again: time it. 4 milliseconds. That's how long a single 
GC.collect() takes on my machine. That's a quarter of a frame. 
And that's a dry run. Doesn't matter if you can GC.disable or 
not, eventually you'll have to collect, so you're paying that 
cost (more, actually, since that's not going to be a dry run). 
If you can afford that - you can befriend the GC. If not - GC 
goes out the window.


In other words, it's only acceptable if you have natural pauses 
(loading screens, transitions, etc.) with limited resource 
consumption between them OR if you can afford to e.g. halve 
your FPS for a while. The alternative is to collect every 
frame, which means sacrificing a quarter of runtime. No, thanks.


Thing is, "limited resource consumption" means you're 
preallocating anyway, at which point one has to question why 
use the GC in the first place. The majority of garbage created 
per frame can be trivially allocated from an arena and 
"deallocated" in one `mov` instruction (or a few of them). And 
things that can't be allocated in an arena, i.e. things with 
destructors - you *can't* reliably delegate to the GC anyway - 
which means your persistent state is more likely to be manually 
managed.


TLDR: it's pointless to lament on irrelevant trivia. Time it! 
Any counter-arguments from either side are pointless without 
that.


You collect it when it matters less, like when loading a level. 
Some loads take so long that people have even written mini-games 
to play during loading screens; nobody will notice a couple of ms 
more.


Hardly any different from having an arena throw away the whole 
set of frame data during loading.


Unless we start talking about DirectStorage and similar.



Re: Why I Like D

2022-01-13 Thread Paulo Pinto via Digitalmars-d-announce
On Wednesday, 12 January 2022 at 02:37:47 UTC, Walter Bright 
wrote:
"Why I like D" is on the front page of HackerNews at the moment 
at number 11.


https://news.ycombinator.com/news


I enjoyed reading the article.


Re: Why I Like D

2022-01-13 Thread Araq via Digitalmars-d-announce
On Thursday, 13 January 2022 at 10:21:12 UTC, Stanislav Blinov 
wrote:
Oh there is a psychological barrier for sure. On both sides of 
the, uh, "argument". I've said this before but I can repeat it 
again: time it. 4 milliseconds. That's how long a single 
GC.collect() takes on my machine. That's a quarter of a frame. 
And that's a dry run. Doesn't matter if you can GC.disable or 
not, eventually you'll have to collect, so you're paying that 
cost (more, actually, since that's not going to be a dry run). 
If you can afford that - you can befriend the GC. If not - GC 
goes out the window.




But the time it takes depends on the number of threads it has to 
stop and the amount of live memory on your heap. If it took 4ms 
regardless of these factors it wouldn't be bad, but that's not 
how D's GC works... And D's language design isn't all that 
friendly to better GC implementations. That is the real problem 
here; that is why it keeps coming up.


Re: Why I Like D

2022-01-13 Thread zjh via Digitalmars-d-announce

On Thursday, 13 January 2022 at 03:10:14 UTC, zjh wrote:

I'm a `GC phobia`.


Re: Why I Like D

2022-01-13 Thread Stanislav Blinov via Digitalmars-d-announce

On Wednesday, 12 January 2022 at 16:17:02 UTC, H. S. Teoh wrote:
On Wed, Jan 12, 2022 at 03:41:03PM +, Adam D Ruppe via 
Digitalmars-d-announce wrote:
On Wednesday, 12 January 2022 at 15:25:37 UTC, H. S. Teoh 
wrote:

>However it turns out that unless you are writing a computer
>game, a high frequency trading system, a web server

Most computer games and web servers use GC too.

[...]

Depends on what kind of games, I guess. If you're writing a 
60fps real-time raytraced 3D FPS running at 2048x1152 
resolution, then *perhaps* you might not want a GC killing your 
framerate every so often.


(But even then, there's always GC.disable and @nogc... so it's 
not as if you *can't* do it in D. It's more a psychological 
barrier triggered by the word "GC" than anything else, IMNSHO.)



T


Oh there is a psychological barrier for sure. On both sides of 
the, uh, "argument". I've said this before but I can repeat it 
again: time it. 4 milliseconds. That's how long a single 
GC.collect() takes on my machine. That's a quarter of a frame. 
And that's a dry run. Doesn't matter if you can GC.disable or 
not, eventually you'll have to collect, so you're paying that 
cost (more, actually, since that's not going to be a dry run). If 
you can afford that - you can befriend the GC. If not - GC goes 
out the window.


In other words, it's only acceptable if you have natural pauses 
(loading screens, transitions, etc.) with limited resource 
consumption between them OR if you can afford to e.g. halve your 
FPS for a while. The alternative is to collect every frame, which 
means sacrificing a quarter of runtime. No, thanks.


Thing is, "limited resource consumption" means you're 
preallocating anyway, at which point one has to question why use 
the GC in the first place. The majority of garbage created per 
frame can be trivially allocated from an arena and "deallocated" 
in one `mov` instruction (or a few of them). And things that 
can't be allocated in an arena, i.e. things with destructors - 
you *can't* reliably delegate to the GC anyway - which means your 
persistent state is more likely to be manually managed.


TLDR: it's pointless to lament on irrelevant trivia. Time it! Any 
counter-arguments from either side are pointless without that.


Re: Added copy constructors to "Programming in D"

2022-01-13 Thread Andrea Fontana via Digitalmars-d-announce

On Saturday, 8 January 2022 at 13:23:52 UTC, Imperatorn wrote:

On Saturday, 8 January 2022 at 02:07:10 UTC, Ali Çehreli wrote:

1) After about three years, I finally added copy constructors:


http://ddili.org/ders/d.en/special_functions.html#ix_special_functions.copy%20constructor

[...]


Splendid!

Will the physical book also be updated?

Thanks!


I don't think it is possible. You have to buy a new copy when 
published. /s