Re: A brief survey of build tools, focused on D

2018-12-12 Thread Neia Neutuladh via Digitalmars-d-announce

On Wednesday, 12 December 2018 at 22:41:50 UTC, H. S. Teoh wrote:
And here is the crux of my rant about build systems (earlier in 
this thread).  There is no *technical reason* why build systems 
should be constricted in this way. Today's landscape of 
specific projects being inextricably tied to a specific build 
system is completely the wrong approach.


You could reduce all this language-specific stuff to a way to 
generate a description of what needs to be built and what 
programs are suggested for doing it. This is quite a layer of 
indirection, and that means more work. "I can do less work" is a 
technical reason.


Ensuring that your output is widely usable is also extra work.

There is also a psychological reason: when you're trying to solve 
a set of problems and you are good at code, it's easy to tunnel 
vision into writing all the code yourself. It can even, 
sometimes, be easier to write that new code than to figure out 
how to use something that already exists (if you think you can 
gloss over a lot of edge cases or support a lot fewer pieces, for 
instance).


This is probably why Dub has its own repository instead of using 
Maven.


Seriously, building a lousy software project is essentially 
traversing a DAG of inputs and actions in topological order.  
The algorithms have been known since decades ago, if not 
longer, and there is absolutely no valid reason why we cannot 
import arbitrary sub-DAGs and glue them to the main DAG, and have 
everything work with no additional effort, regardless of where 
said sub-DAGs came from.  It's just a bunch of nodes and 
labelled edges, guys!  All the rest of the complications and 
build system dependencies and walled gardens are extraneous and 
completely unnecessary baggage imposed upon a straightforward 
DAG topological walk that any CS grad could write in less than 
a day.  It's ridiculous.


If any CS grad student could write it in a day, you could argue 
that a generic DAG library isn't particularly useful or 
interesting in its own right. That makes it seem pretty much 
pointless to pull it out into a separate software project, and 
that's a psychological barrier.


Re: Blog post: What D got wrong

2018-12-12 Thread Jonathan M Davis via Digitalmars-d-announce
On Wednesday, December 12, 2018 3:49:51 PM MST H. S. Teoh via Digitalmars-d-
announce wrote:
> If the delegate property thing is the only real use case for @property,
> it seems quite out-of-proportion that an entire @-identifier in the
> language is dedicated just for this purpose. One would've thought D
> ought to be better designed than this...

Originally, the idea was to restrict property syntax to functions marked
with @property, which would mean no more lax parens. If it's a property
function, then it must be called like one, and if it's not, then it must be
called as a function (i.e. with parens), whereas right now, we have this
mess where folks can use parens or not however they feel like. That doesn't
work well with being able to swap between property functions and public
variables, and it makes generic code harder, because in general, you can't
rely on whether something is called with parens or not, meaning that the
symbol in question has to be an actual function (where parens are optional)
instead of being allowed to be a different kind of callable (which requires
parens) or be a variable (which can't have parens). @property would have
fixed all of that by forcing functions to either be called with or without
parens based on what they're used for, allowing generic code to rely on more
than convention to ensure that symbols are called consistently with or
without parens (and thus allowing symbols other than functions to be reliably
used in place of functions where appropriate). So, as originally envisioned,
@property was anything but useless.
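
For illustration, here's a rough sketch of what that enforcement would have
looked like (a hypothetical example, not current compiler behaviour -- today
the last line compiles anyway):

    // Hypothetical sketch of the originally planned @property enforcement.
    struct Temperature
    {
        private double _celsius = 0;

        @property double celsius() const { return _celsius; }  // getter
        @property void celsius(double v) { _celsius = v; }      // setter

        double asKelvin() const { return _celsius + 273.15; }   // ordinary function
    }

    void main()
    {
        Temperature t;
        t.celsius = 20;         // property: used like a variable, no parens
        auto c = t.celsius;
        auto k = t.asKelvin();  // non-property: parens required under the plan
        auto k2 = t.asKelvin;   // would have been rejected under the plan,
                                // but today it compiles just fine
    }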

However, all of that became extremely unpopular once UFCS became a thing,
because most folks didn't like having an empty set of parens when calling a
templated function that had a template argument that used parens, and as
such, they wanted to be able to continue to drop the parens, which goes
against the core idea behind @property. So, the end result is that the
original plans for @property got dropped, and plenty of folks would be very
unhappy if we went in that direction now - but it's still the case that
@property was supposed to solve a very real problem, and that problem
remains unsolved.

As things stand, you have to be _very_ careful when using anything other
than a function in a generic context that normally uses a function, because
there's no guarantee that using something other than a function will work
due to the lack of guarantee of whether parens will be used or not. It tends
to work better with variables than with callables, because dropping parens
is so popular, and callables aren't, but it's still a problem. Anyone who
wants to use a callable instead of a function in generic code is almost
certainly in for a world of hurt unless they're in control of all of the
code involved - and that's without even getting into the issue of property
functions that return callables (those simply don't work at all).
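
A small sketch of why that case breaks down (hypothetical names; the
behaviour shown is today's, without any enforced @property semantics):

    struct Button
    {
        // a "property" that returns a callable
        void delegate() onClick() { return delegate() {}; }
    }

    void main()
    {
        Button b;
        b.onClick();    // calls the getter and discards the delegate;
                        // it does NOT run the click handler
        b.onClick()();  // only a second set of parens invokes the delegate
    }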

Template constraints combat this to an extent in that they end up requiring
that the construct in question either be callable with parens or usable
without them, but that places no restrictions on the code that actually uses
the symbols, making it easy for code to use parens when it shouldn't or not
use parens when it should and then run into problems when it's given a type
that conforms to the template constraint, but the code didn't use the symbol
in the same way as the constraint. The only things that really prevent
this from being a much bigger problem than it is are that many folks do follow
the conventions set forth by the template constraint (e.g. always calling
front without parens) and the fact that in most cases, it's the lack of
parens which is required, and using variables instead of functions is far
more popular than using callables instead of functions. So, convention is
really all that prevents this from being a bigger problem, and the end
result is that callables in generic code are borderline useless.

One example of trying to work around this problem is that not all that long
ago, isInputRange was actually changed to use popFront without parens just
so that folks could rely on being able to call it without parens, since
previously it was possible to use a delegate or other callable for popFront
instead of a function, which would then not have worked with any code where
folks didn't bother to put parens on popFront when calling it.
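
A reduced sketch of the kind of check involved (this is not the actual
Phobos definition, just an illustration of testing popFront without parens
inside a constraint):

    enum isInputRangeSketch(R) = is(typeof((R r) {
        if (r.empty) {}    // can test for emptiness
        r.popFront;        // no parens: only symbols usable this way pass the check
        auto h = r.front;  // can read the current element
    }));

    struct Numbers
    {
        int n;
        @property bool empty() const { return n >= 3; }
        @property int front() const { return n; }
        void popFront() { ++n; }
    }

    static assert(isInputRangeSketch!Numbers);

    void main() {}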

All in all though, I think that the fact that we aren't strict about parens
usage mostly kills the use of callables in generic code except in cases
where you're in control of all of the code involved. It could be argued that
callables are desirable infrequently enough that being able to drop parens
when calling functions for whatever syntactic beauty it supposedly brings
outweighs the loss, but that doesn't mean that the problem isn't there, just
that many folks don't care and think that the tradeoff is worth it.

- Jonathan M Davis





Re: OFFTOPIC Re: I've just released Vasaro

2018-12-12 Thread Adam D. Ruppe via Digitalmars-d-announce
On Tuesday, 11 December 2018 at 10:19:38 UTC, Jacob Carlborg 
wrote:
I would recommend waiting until more of the Objective-C support 
is implemented. Creating a subclass is a pain in the ass 
currently.


So I got out my code that (with your help about a year ago) was 
doing a hello world window and menu, but now it doesn't compile, 
complaining about a hidden Class clashing with my Class.


What is the current state and roadmap for this support? The stuff 
described here seems wrong: 
https://dlang.org/spec/objc_interface.html and this apparently 
hasn't been edited for years: https://wiki.dlang.org/DIP43 but 
SEEMS to be the closest match.


Re: Blog post: What D got wrong

2018-12-12 Thread H. S. Teoh via Digitalmars-d-announce
On Wed, Dec 12, 2018 at 02:10:31PM -0700, Jonathan M Davis via 
Digitalmars-d-announce wrote:
> On Wednesday, December 12, 2018 6:03:39 AM MST Kagamin via Digitalmars-d-
> announce wrote:
[...]
> > Imagine you have void delegate() prop() and use the property
> > without parentheses everywhere then suddenly m.prop() doesn't
> > call the delegate. So it's mostly for getters and should be used
> > only in edge cases, most code should be fine with optional parens.
> 
> Except that @property does not currently have any effect on this. The
> delegate case (or really, the case of callables in general) is one
> argument for keeping @property for using in that particular corner
> case, since without it, having property functions that return
> callables simply doesn't work, but @property has never been made to
> actually handle that case, so having property functions that return
> callables has never worked in D. It's certainly been discussed before,
> but the implementation has never been changed to make it work.

Yep. Basically, @property as currently implemented is useless, and I've
stopped bothering with it except where Phobos requires it.


> If/when we finally rework @property, that use case would be the number
> one reason to not simply get rid of @property, but until then, it
> doesn't actually fix that use case. As things stand, @property
> basically just serves as documentation of intent for the API and as a
> way to screw up type introspection by having the compiler lie about
> the type of the property.
[...]

Haha yeah, currently @property confers no real benefits and only comes
with bad (and probably unexpected) side-effects.  More confirmation that
it's a waste of time and not worth my attention.

If the delegate property thing is the only real use case for @property,
it seems quite out-of-proportion that an entire @-identifier in the
language is dedicated just for this purpose. One would've thought D
ought to be better designed than this...


T

-- 
Gone Chopin. Bach in a minuet.


Re: A brief survey of build tools, focused on D

2018-12-12 Thread H. S. Teoh via Digitalmars-d-announce
On Wed, Dec 12, 2018 at 02:52:09PM -0700, Jonathan M Davis via 
Digitalmars-d-announce wrote:
[...]
> I would think that to be fully flexible, dub would need to abstract
> things a bit more, maybe effectively using a plugin system for builds
> so that it's possible to have a dub project that uses dub for pulling
> in dependencies but which can use whatever build system works best for
> your project (with the current dub build system being the default).
> But of course, even if that is made to work well, it then introduces
> the problem of random dub projects then needing 3rd party build
> systems that you may or may not have (which is one of the things that
> dub's current build system mostly avoids).

And here is the crux of my rant about build systems (earlier in this
thread).  There is no *technical reason* why build systems should be
constricted in this way. Today's landscape of specific projects being
inextricably tied to a specific build system is completely the wrong
approach.

Projects should not be tied to a specific build system.  Instead,
whatever build tool the author uses to build the project should export a
universal description of how to build it, in a standard format that can
be imported by any other build system. This description should be a
fully general DAG, that specifies all inputs, all outputs (including
intermediate ones), and the actions required to get from input to
output.

Armed with this build description, any build system should be able to
import as a dependency any project built with any other build system,
and be able to successfully build said dependency without even knowing
what build system was originally used to build it or what build system
it is "intended" to be built with.  I should be able to import a Gradle
project, a dub project, and an SCons project as dependencies, and be
able to use make to build everything. And my downstream users ought to
be able to build my project with tup, or any other build tool they
choose, without needing to care that I used make to build my project.

Seriously, building a lousy software project is essentially traversing a
DAG of inputs and actions in topological order.  The algorithms have
been known since decades ago, if not longer, and there is absolutely no
valid reason why we cannot import arbitrary sub-DAGs and glue them to the
main DAG, and have everything work with no additional effort, regardless
of where said sub-DAGs came from.  It's just a bunch of nodes and
labelled edges, guys!  All the rest of the complications and build
system dependencies and walled gardens are extraneous and completely
unnecessary baggage imposed upon a straightforward DAG topological walk
that any CS grad could write in less than a day.  It's ridiculous.
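
For what it's worth, a minimal sketch of such a walk (hypothetical node and
field names, no cycle detection or up-to-date checks):

    import std.stdio : writeln;

    struct Node
    {
        string name;
        string[] deps;           // names of the nodes this one depends on
        void delegate() action;  // how to produce this node (null for source files)
    }

    void build(Node[string] graph, string target, ref bool[string] done)
    {
        if (target in done) return;     // already visited
        done[target] = true;
        foreach (dep; graph[target].deps)
            build(graph, dep, done);    // build all inputs first
        if (graph[target].action !is null)
            graph[target].action();     // then run this node's action
    }

    void main()
    {
        Node[string] g;
        g["main.d"] = Node("main.d");
        g["main.o"] = Node("main.o", ["main.d"], delegate { writeln("compile main.d"); });
        g["app"]    = Node("app",    ["main.o"], delegate { writeln("link app"); });

        bool[string] done;
        build(g, "app", done);          // prints "compile main.d", then "link app"
    }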


> On some level, dub is able to do as well as it does precisely because
> it's able to assume a bunch of stuff about D projects which is true
> the vast majority of the time, and the more it allows projects that
> don't work that way, the worse dub is going to work as a general tool,
> because it increasingly opens up problems with regards to whether you
> have the right tools or environment to build a particular project when
> using it as a dependency. However, if we don't figure out how to make
> it more flexible, then certain classes of projects really aren't going
> to work well with dub.  That's less of a problem if the project is not
> for a library (and thus does not need to be a dub package so that
> other packages can pull it in as a dependency) and if dub provides a
> good way to just make libraries available as dependencies rather than
> requiring the ultimate target be built with dub, but even then, it
> doesn't solve the problem when the target _is_ a library (e.g. what if
> it were for wrapping a C or C++ library and needed to do a bunch of
> extra steps for code generation and needed multiple build steps).

Well exactly, again, the monolithic approach to building software is the
wrong approach, and leads to arbitrary and needless limitations of this
sort.  DAG generation should be decoupled from build execution.  You can
use whatever tool or fancy algorithm you want to generate the lousy DAG,
but once generated, all you have to do is to export it in a standard
format, then any arbitrary number of build executors can read the
description and run it.

Again I say: projects should not be bound to this or that build system.
Instead, they should export a universal build description in a standard
format.  Whoever wants to depend on said projects can simply import the
build description and it will Just Work(tm). The build executor will
know exactly how to build the dependency independently of whatever fancy
tooling the upstream author may have used to generate the DAG.
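
To make the idea concrete, an exported build description in such a
hypothetical standard format might look like the JSON below (the schema and
field names are invented for illustration; no such standard exists today),
and any executor could consume it:

    import std.json : parseJSON;
    import std.stdio : writeln;

    // Invented schema: a flat list of nodes with inputs and a command.
    enum description = `{
      "nodes": [
        { "output": "main.o", "inputs": ["main.d"], "command": "dmd -c main.d" },
        { "output": "app",    "inputs": ["main.o"], "command": "dmd -ofapp main.o" }
      ]
    }`;

    void main()
    {
        auto dag = parseJSON(description);
        foreach (node; dag["nodes"].array)   // assume nodes are listed in dependency order
            writeln(node["output"].str, " <- ", node["command"].str);
    }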


> So, I don't know. Ultimately, what this seems to come down to is that
> all of the stuff that dub does to make things simple for the common
> case makes it terrible for complex cases, but making it work well for
> 

Re: A brief survey of build tools, focused on D

2018-12-12 Thread Jonathan M Davis via Digitalmars-d-announce
On Wednesday, December 12, 2018 1:33:39 PM MST H. S. Teoh via Digitalmars-d-
announce wrote:
> On Wed, Dec 12, 2018 at 10:38:55AM +0100, Sönke Ludwig via Digitalmars-d-
announce wrote:
> > Am 11.12.2018 um 20:46 schrieb H. S. Teoh:
> > > Does dub support the following scenario?
>
> [...]
>
> > This will currently realistically require invoking an external tool
> > such as make through a pre/post-build command (although it may
> > actually be possible to hack this together using sub packages, build
> > commands, and string import paths for the file dependencies). Most
> > notably, there is a directive missing to specify arbitrary files as
> > build dependencies.
>
> I see.  I think this is a basic limitation of dub's design -- it assumes
> a certain (common) compilation model of sources to (single) executable,
> and everything else is only expressible in terms of larger abstractions
> like subpackages.  It doesn't really match the way I work, which I guess
> explains my continuing frustration with using it.  I think of my build
> processes as a general graph of arbitrary input files being converted by
> arbitrary operations (not just compilation) into arbitrary output files.
> When I'm unable to express this in a simple way in my build spec, or
> when I'm forced to use tedious workarounds to express what in my mind
> ought to be something very simple, it distracts me from focusing on
> my problem domain, and results in a lot of lost time/energy and
> frustration.

What you're describing sounds like it would require a lot of extra machinery
in comparison to how dub is designed to work. dub solves the typical use
case of building a single executable or library (which is what the vast
majority of projects do), and it removes the need to specify much of
anything to make that work, making it fantastic for the typical use case but
causing problems for any use cases that have more complicated needs. I
really don't see how doing much of anything other than building a single
executable or library from a dub project is going to result in anything
other than frustration from the tool even if you can make it work. By the
very nature of what you'd be trying to do, you'd be constantly trying to
work around how dub is designed to work. dub can do more thanks to
subprojects and some of the extra facilities it has for running stuff before
or after the build, but all of that sort of stuff has to work around dub's
core design, making it generally awkward to use, whereas to do something
more complex, at some point, what you really want is basically a build
script (albeit maybe with some extra facilities to properly detect whether
certain phases of the build can be skipped).

I would think that to be fully flexible, dub would need to abstract things a
bit more, maybe effectively using a plugin system for builds so that it's
possible to have a dub project that uses dub for pulling in dependencies but
which can use whatever build system works best for your project (with the
current dub build system being the default). But of course, even if that is
made to work well, it then introduces the problem of random dub projects
then needing 3rd party build systems that you may or may not have (which is
one of the things that dub's current build system mostly avoids).

On some level, dub is able to do as well as it does precisely because it's
able to assume a bunch of stuff about D projects which is true the vast
majority of the time, and the more it allows projects that don't work that
way, the worse dub is going to work as a general tool, because it
increasingly opens up problems with regards to whether you have the right
tools or environment to build a particular project when using it as a
dependency. However, if we don't figure out how to make it more flexible,
then certain classes of projects really aren't going to work well with dub.
That's less of a problem if the project is not for a library (and thus does
not need to be a dub package so that other packages can pull it in as a
dependency) and if dub provides a good way to just make libraries available
as dependencies rather than requiring the ultimate target be built with
dub, but even then, it doesn't solve the problem when the target _is_ a
library (e.g. what if it were for wrapping a C or C++ library and needed to
do a bunch of extra steps for code generation and needed multiple build
steps).

So, I don't know. Ultimately, what this seems to come down to is that all of
the stuff that dub does to make things simple for the common case makes it
terrible for complex cases, but making it work well for complex cases would
almost certainly make it _far_ worse for the common case. So, I don't know
that we really want to be drastically changing how dub works, but I do think
that we need to make it so that more is possible with it (even if it's more
painful, because it's doing something that goes against the typical use
case).

The most obvious thing that I can think of is 

Re: Blog post: What D got wrong

2018-12-12 Thread Jonathan M Davis via Digitalmars-d-announce
On Wednesday, December 12, 2018 6:03:39 AM MST Kagamin via Digitalmars-d-
announce wrote:
> On Tuesday, 11 December 2018 at 12:57:03 UTC, Atila Neves wrote:
> > @property is useful for setters. Now, IMHO setters are a code
> > stink anyway but sometimes they're the way to go. I have no
> > idea what it's supposed to do for getters (nor am I interested
> > in learning or retaining that information) and never slap the
> > attribute on.
>
> Imagine you have void delegate() prop() and use the property
> without parentheses everywhere then suddenly m.prop() doesn't
> call the delegate. So it's mostly for getters and should be used
> only in edge cases, most code should be fine with optional parens.

Except that @property does not currently have any effect on this. The
delegate case (or really, the case of callables in general) is one argument
for keeping @property for using in that particular corner case, since
without it, having property functions that return callables simply doesn't
work, but @property has never been made to actually handle that case, so
having property functions that return callables has never worked in D. It's
certainly been discussed before, but the implementation has never been
changed to make it work. If/when we finally rework @property, that use case
would be the number one reason to not simply get rid of @property, but until
then, it doesn't actually fix that use case. As things stand, @property
basically just serves as documentation of intent for the API and as a way to
screw up type introspection by having the compiler lie about the type of the
property.

> >I think there’s a general consensus that @safe, pure and
> >immutable should be default.
>
> I can agree there are at least 5 people holding that firm belief,
> but that's hardly a consensus.

There are definitely people who want one or more of those attributes as the
default, but I very much doubt that it's a consensus. It wouldn't surprise
me if @safe or pure by default went over fairly well, but I'm sure that
immutable or const by default would be far more controversial, because
that's a big shift from what C-derived languages normally do. Personally, I
would be very unhappy if it were the default, though I know that there are
some folks who would very much like to see const or immutable be the
default.

> >I’ve lost count now of how many times I’ve had to write @safe
> >@nogc pure nothrow const scope return. Really.
>
> If immutable was the default, wouldn't you still need to write the const
> attribute everywhere, and @nogc, and nothrow? Strings are about
> the only relevant immutable data structure (and they are already
> immutable), everything else is inherently mutable except for use
> cases with genuine need for immutability like a shared cache of
> objects.

If immutable were the default, then I expect that writing types that work
with immutable would become more common, because it would then be encouraged
by the language. But I think that your average type is written to work as
mutable (and maybe const), and it's a pretty big shift to write types to be
immutable unless you're talking about simple POD types. So if immutable
became the default, I expect that mutable (or whatever the modifier to make a
type mutable would be) would start getting plastered everywhere. And without
the range API being changed, ranges wouldn't work unless you marked them as
mutable, making const or immutable by default a bit of a mess for what would
now be idiomatic D code (though if the default were changed to const or
immutable, we'd probably see the range API changed to use the classic,
functional head/tail list mechanism rather than front and popFront, which
could very well be an improvement anyway).
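
Roughly, the head/tail style alluded to here would look something like the
following (a purely hypothetical illustration, not an actual or proposed
API): advancing returns a new range value instead of mutating, which sits
much more comfortably with immutable.

    struct IotaList
    {
        int lo, hi;
        @property bool empty() const { return lo >= hi; }
        @property int head() const { return lo; }
        @property IotaList tail() const { return IotaList(lo + 1, hi); }
    }

    int sum(in IotaList r)
    {
        // no mutation anywhere: each step just recurses on r.tail
        return r.empty ? 0 : r.head + sum(r.tail);
    }

    void main()
    {
        immutable r = IotaList(0, 5);
        assert(sum(r) == 10);   // 0 + 1 + 2 + 3 + 4
    }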

- Jonathan M Davis






Re: Blog post: What D got wrong

2018-12-12 Thread JN via Digitalmars-d-announce
On Wednesday, 12 December 2018 at 20:12:54 UTC, Guillaume Piolat 
wrote:
On Wednesday, 12 December 2018 at 14:48:23 UTC, Atila Neves 
wrote:

On Tuesday, 11 December 2018 at 14:00:10 UTC, dayllenger wrote:
On Tuesday, 11 December 2018 at 13:42:03 UTC, Guillaume 
Piolat wrote:
One could say getters and particularly setters don't really 
deserve a nicer way to write them. It's a code stink, it 
deserves a long ugly name.  (10 years ago I would have been in the 
other camp)


Can you please explain it in more detail? I've never read such 
claims about getters and setters.


Tell, don't ask: 
https://martinfowler.com/bliki/TellDontAsk.html


Sometimes formulated slightly differently as "Law of Demeter" 
https://en.wikipedia.org/wiki/Law_of_Demeter


if you like more pompous names.


Law of Demeter is different. Law of Demeter basically translates 
to "don't have more than one dot", like x.y() is fine, x.y.z() 
isn't because it makes too many assumptions about internals of x 
and y.


Properties are useful when setting or getting the variable isn't a 
trivial assignment. For example, sometimes the units need to be 
converted along the way. In many cases, especially in GUI 
programming, you might want to perform additional actions when 
setting/getting a variable, like calling listeners to notify them 
of the value change so that they can update the value in the GUI 
widget automatically.
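
For example, something along these lines (an illustrative sketch; all names 
are made up):

    import std.stdio : writeln;

    struct Slider
    {
        private double _meters = 0;
        private void delegate(double)[] listeners;

        @property double millimeters() const { return _meters * 1000; }

        @property void millimeters(double mm)
        {
            _meters = mm / 1000;    // unit conversion on assignment
            foreach (cb; listeners)
                cb(mm);             // notify observers, e.g. a GUI widget
        }

        void onChange(void delegate(double) cb) { listeners ~= cb; }
    }

    void main()
    {
        Slider s;
        s.onChange(delegate(double mm) { writeln("slider moved to ", mm, " mm"); });
        s.millimeters = 250;   // looks like plain assignment, but converts and notifies
    }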


Re: A brief survey of build tools, focused on D

2018-12-12 Thread H. S. Teoh via Digitalmars-d-announce
On Wed, Dec 12, 2018 at 10:38:55AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:
> Am 11.12.2018 um 20:46 schrieb H. S. Teoh:
> > [...]
> > Wait, what does --parallel do if it doesn't compile multiple files
> > at once?
> 
> It currently only works when building with `--build-mode=singleFile`,
> so compiling individual files in parallel instead of compiling chunks
> of files in parallel, which would be the ideal.

Ah, I see.  But that should be relatively easy to fix, right?


[...]
> There are the three directives sourcePaths, sourceFiles and
> excludedSourceFiles (the latter two supporting wildcard expressions)
> to control the list of files. Once an explicit sourcePaths directive
> is given, the folder that is possibly detected by default
> ("source"/"src") is also skipped. They are documented in the package
> format specs ([1], [2]).

Thanks for the info.


> > Also, you refer to "the output binary". Does that mean I cannot
> > generate multiple executables? 'cos that's a showstopper for me.
> 
> Compiling multiple executables currently either requires multiple
> invocations (e.g. with different configurations or sub packages
> specified), or a targetType "none" package that has one dependency per
> executable - the same configuration/architecture applies to all of
> them in that case. If they are actually build dependencies, a possible
> approach is to invoke dub recursively inside of a preBuildCommand.

Unfortunately, that is not a practical solution for me.  Many of my
projects have source files that are generated by utilities that are
themselves D code that needs to be compiled (and run) as part of the
build.  I suppose in theory I could separate them into subpackages, and
factor out the common code shared between these utilities and the main
executable(s), but that is far too much work for something that IMO
ought to be very simple -- since most of the utilities are single-file
drivers with a small number of imports of some shared modules. Creating
entire subpackages for each of them just seems excessive, esp. during
development where the set of utilities / generated files may change a
lot.  Creating/deleting a subpackage every time is just too much work
for little benefit.

Also, does dub correctly support the case where some .d files are
generated by said utilities (which would be dub subpackages, if we
hypothetically went with that setup), but the output may change
depending on the contents of some input data/config files? I.e., if I
change a data file and run dub again, it ought to re-run the codegen
tool and then recompile the main executable that contains the changed
code.  This is a pretty important use-case for me, since it's kinda the
whole point of having a codegen tool.

Compiling the same set of sources for multiple archs (with each arch
possibly entailing a separate list of source files) is kinda a special
case for my current Android project; generally I don't really need
support for this. But solid support for codegen that properly percolates
changes from input data down to recompiling executables is a must-have for
me.  Not being able to do this in the most efficient way possible would
greatly hamper my productivity.


> But what I meant is that there is for example currently no way to
> customize the output binary base name ("targetName") and directory
> ("targetPath") depending on the build type.

But this shouldn't be difficult to support, right?  Though I don't
particularly need this feature -- for the time being.


[...]
> > Does dub support the following scenario?
[...]
> This will currently realistically require invoking an external tool
> such as make through a pre/post-build command (although it may
> actually be possible to hack this together using sub packages, build
> commands, and string import paths for the file dependencies). Most
> notably, there is a directive missing to specify arbitrary files as
> build dependencies.

I see.  I think this is a basic limitation of dub's design -- it assumes
a certain (common) compilation model of sources to (single) executable,
and everything else is only expressible in terms of larger abstractions
like subpackages.  It doesn't really match the way I work, which I guess
explains my continuing frustration with using it.  I think of my build
processes as a general graph of arbitrary input files being converted by
arbitrary operations (not just compilation) into arbitrary output files.
When I'm unable to express this in a simple way in my build spec, or
when I'm forced to use tedious workarounds to express what in my mind
ought to be something very simple, it distracts me from focusing on
my problem domain, and results in a lot of lost time/energy and
frustration.


[...]
> BTW, my plan for the Android part of this was to add support for
> plugins (fetchable from the registry, see [3] for a draft) that handle
> the details in a centralized manner instead of having to put that
> knowledge into the build recipe of each 

Re: Blog post: What D got wrong

2018-12-12 Thread Guillaume Piolat via Digitalmars-d-announce

On Wednesday, 12 December 2018 at 14:48:23 UTC, Atila Neves wrote:

On Tuesday, 11 December 2018 at 14:00:10 UTC, dayllenger wrote:
On Tuesday, 11 December 2018 at 13:42:03 UTC, Guillaume Piolat 
wrote:
One could say getters and particularly setters don't really 
deserve a nicer way to write them. It's a code stink, it 
deserves a long ugly name.  (10 years ago I would have been in the 
other camp)


Can you please explain it in more detail? I've never read such 
claims about getters and setters.


Tell, don't ask: https://martinfowler.com/bliki/TellDontAsk.html


Sometimes formulated slightly differently as "Law of Demeter" 
https://en.wikipedia.org/wiki/Law_of_Demeter


if you like more pompous names.


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Andre Pany via Digitalmars-d-announce
On Wednesday, 12 December 2018 at 09:38:55 UTC, Sönke Ludwig 
wrote:
Most notably, there is a directive missing to specify arbitrary 
files as build dependencies.


I am working on a pull request:
https://github.com/andre2007/dub/commit/97161fb352dc1237411e2e7010447f8a9e817d48

Productive implementation is finished.
Only tests are missing.

Kind regards
André


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Sönke Ludwig via Digitalmars-d-announce

Am 12.12.2018 um 15:53 schrieb Atila Neves:

On Wednesday, 12 December 2018 at 09:38:55 UTC, Sönke Ludwig wrote:

Am 11.12.2018 um 20:46 schrieb H. S. Teoh:
On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:

[...]


The main open point right now AFAICS is to make --parallel work with
the multiple-files-at-once build modes for machines that have enough
RAM. This is rather simple, but someone has to do it. But apart from
that, I think that the current state is relatively fine from a
performance point of view.


Wait, what does --parallel do if it doesn't compile multiple files at
once?


It currently only works when building with `--build-mode=singleFile`, 
so compiling individual files in parallel instead of compiling chunks 
of files in parallel, which would be the ideal.


If by "the ideal" you mean "compile the fastest", then you don't want to 
compile single files in parallel. I measured across multiple projects, 
and compiling per package (in the D sense, not the dub one) was fastest. 
Which is why it's the default with reggae.




The sentence was ambiguous, but that's what I meant!


Re: OFFTOPIC Re: I've just released Vasaro

2018-12-12 Thread Adam D. Ruppe via Digitalmars-d-announce
On Tuesday, 11 December 2018 at 10:19:38 UTC, Jacob Carlborg 
wrote:
Which year is the machine from? It should say that after the 
model.


Oh, I had to click "more info".

MacBook Air
11-inch, Mid 2011

So I guess it is quite old. I have tried to do the OS update 
several times before and it consistently just freezes (usually 
the progress bar stops, the system keeps working, but one time it 
did outright restart itself); this probably explains why.


I would recommend waiting until more of the Objective-C support 
is implemented. Creating a subclass is a pain in the ass 
currently.


Yeah, I know. I have made some mixins to help smooth it over a 
little though. That is one of the reasons why I am waiting a bit, 
but I feel if I wait on dmd I'll be waiting forever. I'd like at 
least the basics to work.


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Atila Neves via Digitalmars-d-announce
On Wednesday, 12 December 2018 at 09:38:55 UTC, Sönke Ludwig 
wrote:

Am 11.12.2018 um 20:46 schrieb H. S. Teoh:
On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:

[...]

The main open point right now AFAICS is to make --parallel 
work with
the multiple-files-at-once build modes for machines that have 
enough
RAM. This is rather simple, but someone has to do it. But 
apart from

that, I think that the current state is relatively fine from a
performance point of view.


Wait, what does --parallel do if it doesn't compile multiple 
files at

once?


It currently only works when building with 
`--build-mode=singleFile`, so compiling individual files in 
parallel instead of compiling chunks of files in parallel, 
which would be the ideal.


If by "the ideal" you mean "compile the fastest", then you don't 
want to compile single files in parallel. I measured across 
multiple projects, and compiling per package (in the D sense, not 
the dub one) was fastest. Which is why it's the default with 
reggae.




Re: Blog post: What D got wrong

2018-12-12 Thread Atila Neves via Digitalmars-d-announce

On Tuesday, 11 December 2018 at 14:00:10 UTC, dayllenger wrote:
On Tuesday, 11 December 2018 at 13:42:03 UTC, Guillaume Piolat 
wrote:
One could say getters and particularly setters don't really 
deserve a nicer way to write them. It's a code stink, it 
deserves a long ugly name.  (10 years ago I would have been in the 
other camp)


Can you please explain it in more detail? I've never read such 
claims about getters and setters.


Tell, don't ask: https://martinfowler.com/bliki/TellDontAsk.html

Getters and setters break encapsulation - the client knows way 
too much about your struct/class. Whatever you were going to do 
with the data you got from the object, move it into a member 
function of that object's type.


Setters are like that as well, but worse since mutable state is 
the root of all evil. Personally, I cringe whenever I have to use 
`auto` instead of `const` for a variable declaration.




Re: Blog post: What D got wrong

2018-12-12 Thread Kagamin via Digitalmars-d-announce

On Tuesday, 11 December 2018 at 12:57:03 UTC, Atila Neves wrote:
@property is useful for setters. Now, IMHO setters are a code 
stink anyway but sometimes they're the way to go. I have no 
idea what it's supposed to do for getters (nor am I interested 
in learning or retaining that information) and never slap the 
attribute on.


Imagine you have void delegate() prop() and use the property 
without parentheses everywhere then suddenly m.prop() doesn't 
call the delegate. So it's mostly for getters and should be used 
only in edge cases, most code should be fine with optional parens.



inout
Template this can accomplish the same thing and is more useful 
anyway.


"Everything is a template" is a spiritual successor to 
"everything is an object" hype :)



Returning a reference
It’s practically pointless.


See 
https://github.com/dlang/druntime/blob/master/src/core/stdc/errno.d#L66
Also AFAIK alias this doesn't dereference pointers automatically, 
and retaining the pointer may not be desirable.
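
In spirit, the linked druntime code does something like the following (a 
simplified stand-in, not the real implementation): returning by ref lets 
callers both read and assign through the same call.

    // Simplified stand-in for the ref-returning pattern in the linked errno.d.
    ref int errnoLike()
    {
        static int value;   // stand-in for the real thread-local errno storage
        return value;
    }

    void main()
    {
        errnoLike() = 42;            // assignment through the returned reference
        assert(errnoLike() == 42);   // reads go through the same storage
    }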


I think there’s a general consensus that @safe, pure and 
immutable should be default.


I can agree there are at least 5 people holding that firm belief, 
but that's hardly a consensus.


I’ve lost count now of how many times I’ve had to write @safe 
@nogc pure nothrow const scope return. Really.


If immutable was the default, wouldn't you still need to write the 
const attribute everywhere, and @nogc, and nothrow? Strings are about 
the only relevant immutable data structure (and they are already 
immutable), everything else is inherently mutable except for use 
cases with genuine need for immutability like a shared cache of 
objects.


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Sönke Ludwig via Digitalmars-d-announce

Am 11.12.2018 um 20:46 schrieb H. S. Teoh:

On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:
[...]


The main open point right now AFAICS is to make --parallel work with
the multiple-files-at-once build modes for machines that have enough
RAM. This is rather simple, but someone has to do it. But apart from
that, I think that the current state is relatively fine from a
performance point of view.


Wait, what does --parallel do if it doesn't compile multiple files at
once?


It currently only works when building with `--build-mode=singleFile`, so 
compiling individual files in parallel instead of compiling chunks of 
files in parallel, which would be the ideal.

Then it requires a specific source layout, with incomplete /
non-existent configuration options for alternatives.  Which makes it
unusable for existing code bases.  Unacceptable.


You can define arbitrary import/source directories and list (or
delist) source files individually if you want. There are restrictions
on the naming of the output binary, though, is that what you mean?


Is this documented? I couldn't find any info on it the last time I
looked.


There are the three directives sourcePaths, sourceFiles and 
excludedSourceFiles (the latter two supporting wildcard expressions) to 
control the list of files. Once an explicit sourcePaths directive is 
given, the folder that is possibly detected by default ("source"/"src") 
is also skipped. They are documented in the package format specs ([1], [2]).




Also, you refer to "the output binary". Does that mean I cannot
generate multiple executables? 'cos that's a showstopper for me.


Compiling multiple executables currently either requires multiple 
invocations (e.g. with different configurations or sub packages 
specified), or a targetType "none" package that has one dependency per 
executable - the same configuration/architecture applies to all of them 
in that case. If they are actually build dependencies, a possible 
approach is to invoke dub recursively inside of a preBuildCommand.


But what I meant is that there is for example currently no way to 
customize the output binary base name ("targetName") and directory 
("targetPath") depending on the build type.



Worst of all, it does not support custom build actions, which is a
requirement for many of my projects.  It does not support polyglot
projects. It either does not support explicit control over exact
build commands, or any such support is so poorly documented it might
as well not exist.  This is not only unacceptable, it is a
show-stopper.


Do you mean modifying the compiler invocations that DUB generates or
adding custom commands (aka pre/post build/generate commands)?


Does dub support the following scenario?

- There's a bunch of .java files that have to be compiled with javac.
- But some of the .java files are generated by an external tool that
  must be run first, before the .java files are compiled.
- There's a bunch of .d files in two directories.
- The second directory contains .d files that need to be compiled
  into multiple executables, and they must be compiled with a local
  (i.e., non-cross) compiler.
- Some of the resulting executables must be run first in order to
  generate a few .d files in the first directory (in addition to
  what's already there).
- After the .d files are generated, the first directory needs to be
  compiled TWICE: once with a cross-compiler (LDC, targeting
  Arm/Android), once with the local D compiler. The first compilation
  must link with cross-compilation Android runtime libraries, and the
  second compilation must link with local X11 libraries.
  - (And obviously, the build products must be put in separate
    subdirectories to prevent stomping over each other.)
- After the .java and .d files are compiled, a series of tools must be
  invoked to generate an .apk file, which also includes a bunch of
  non-code files in resource subdirectories.  Then, another tool must be
  run to align and sign the .apk file.

And here's a critical requirement: any time a file is changed (it can be
a .java file, a .d file, or one of the resources that they depend on),
all affected build products must be correctly updated. This must be done
as efficiently as possible, because it's part of my code-compile-test
cycle, and if it requires more than a few seconds or recompiling the
entire codebase, it's a no-go.

If dub can handle this, then I'm suitably impressed, and retract most of
my criticisms against it. ;-)


This will currently realistically require invoking an external tool such 
as make through a pre/post-build command (although it may actually be 
possible to hack this together using sub packages, build commands, and 
string import paths for the file dependencies). Most notably, there is a 
directive missing to specify arbitrary files as build dependencies.


Another feature that should be there