Re: DMD 2.066 Alpha

2014-06-15 Thread Tove via Digitalmars-d-announce
On Friday, 13 June 2014 at 16:49:26 UTC, Andrei Alexandrescu 
wrote:


Virtual by default will not change. Being able to negate the 
final: label is nice to have but not a must. Adding a keyword 
for that doesn't scale - it would mean we'd need to add one 
keyword to undo each label.



Andrei


Just to try and establish a clear path forwards,
if a pull request existed which added support for...
final!true
final!false
... would it be accepted?

Or would a generic negate-x-DIP be required?
const!false
noexcept!false
etc.
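
For reference, a rough sketch of what that would look like at the usage
site; the !bool forms below are the hypothetical syntax from this
thread, not something current D accepts:

class Widget
{
final:                        // current D: everything below is final
    void foo() {}
    void bar() {}

    final!false void baz() {} // hypothetical: opt this one method back
                              // into virtual dispatch
}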


Re: dmd front end now switched to Boost license

2014-06-15 Thread Joakim via Digitalmars-d-announce

On Sunday, 15 June 2014 at 01:08:00 UTC, Leandro Lucarella wrote:

Joakim, on June 14 at 19:31 you wrote:
The frontend was dual-licensed under the Artistic license, which
also allows such proprietary use, so nothing has really changed.


Mmm, even if it is true that the Artistic license is a bit more
permissive than the GPL in some aspects, I think it is hardly suitable
for doing serious proprietary software (that you intend to sell).

From the Artistic license that was distributed with DMD:
You may not charge a fee for this Package itself. However, you may
distribute this Package in aggregate with other (possibly commercial)
programs as part of a larger (possibly commercial) software distribution
provided that you do not advertise this Package as a product of your
own.

It is a bit hairy; I don't think any companies would want to do
proprietary tools using the Artistic license :)

https://github.com/D-Programming-Language/dmd/blob/083271a415716cf3e35321f91826397d91c0a731/src/artistic.txt


I was referring to this clause from the Artistic license:

4. You may distribute the programs of this Package in object code or
executable form, provided that you do at least ONE of the following:

a) distribute a Standard Version of the executables and library files,
together with instructions (in the manual page or equivalent) on where
to get the Standard Version.

So you could have always distributed a modified, closed ldc with the
frontend under the Artistic license (it would have to be ldc, as the dmd
backend is proprietary), as long as you also provided an unmodified ldc
along with it.


I don't think the part of the Artistic license you excerpted 
would apply to such a modified program, but even if the 
advertising part applied, I doubt any commercial user would care. 
 Usually those who take your code _don't want_ to advertise where 
they got it from. ;)


I realize you prefer the LGPL, to force others to contribute back to
the frontend if they modify and distribute it, but the Boost license is
much simpler and as Walter points out, proprietary use can help D's
adoption.


Again, I think from the practical point of view it is the same. If you
use the Boost license and tons of proprietary tools come out CHANGING
the DMDFE and not contributing back, then the D community might get a
boost because they have better tools, but they are missing the
contributions, so it is hard to tell whether the balance would be
positive or negative. If they don't change the DMDFE (or contribute back
the changes), then using Boost or LGPL is the same, because it doesn't
matter.


Having better-quality paid tools would be a big boost, whether they
released their patches or not. You point out that commercial users could
always link against an LGPL frontend as a library and put their
proprietary modifications in their own separate library, but that can be
very inconvenient, depending on the feature.


Also, I've pointed out a new model on this forum before, where someone
could release a closed, paid D compiler but have a contract with their
customers that all source code for a particular binary will be released
within a year or two. This way you get the best of both worlds: revenue
from the closed-source patches, and the patches are open-sourced
eventually. Such mixed models or other experimentation are possible
under the freedom of more permissive licenses like Boost, but are
usually much harder to pull off with the LGPL, as you'd be forced to
separate all proprietary code from the LGPL frontend.


Re: DConf 2014 Day 1 Talk 5: Experience Report: Using D at Facebook and Beyond by Adam Simpkins

2014-06-15 Thread Jacob Carlborg via Digitalmars-d-announce

On 2014-06-12 19:28, Andrei Alexandrescu wrote:

https://news.ycombinator.com/newest (please upvote, things get buried
there quickly)

https://twitter.com/D_Programming/status/477139782334963712

https://www.facebook.com/dlang.org/posts/864887076858308

http://www.reddit.com/r/programming/comments/27za5z/dconf_2014_day_1_talk_5_experience_report_using_d/


A comment about DStep and C++: the long-term goal is to implement
support for C++, though it's not something that is being worked on
currently. Contributions are welcome.


--
/Jacob Carlborg


Re: dmd front end now switched to Boost license

2014-06-15 Thread via Digitalmars-d-announce

On Saturday, 14 June 2014 at 19:27:44 UTC, Nick Sabalausky wrote:
I don't think those are the only important criteria. The thing is, D's
licensing overall (DMDFE/DMDBE/LDC/GDC/Phobos) is kinda complicated. So
any simplification, as long as it doesn't restrict anyone, is a net
improvement, even if it isn't earth-shattering.


Indeed. Having a single license makes the project look focused 
rather than a conglomerate moving in different directions.


Re: DConf 2014 Day 1 Talk 4: Inside the Regular Expressions in D by Dmitry Olshansky

2014-06-15 Thread Dicebot via Digitalmars-d-announce

On Saturday, 14 June 2014 at 16:34:35 UTC, Dmitry Olshansky wrote:
Consider something like the REST API generator I have described during
DConf. Different code is generated in different contexts from the same
declarative description - both for the server and the client. Right now
the simple fact that you import the very same module from both gives a
solid 100% guarantee that API usage between those two programs stays in
sync.


But let's face it - it's a one-time job to get it right in your
favorite build tool. Then you have a fast and cached (re)build.
Comparatively, the costs of CTFE generation are paid in full during
_each_ build.


There is no such thing as a one-time job in programming unless you work
alone and abandon any long-term maintenance. As time goes on, any
mistake that can possibly happen will inevitably happen.


In your proposed scenario there will be two different generated files
imported by the server and the client respectively. A tiny typo in
writing your build script will result in a hard-to-detect run-time bug
while the code itself still happily compiles.


Or a link error, if we go a hybrid path where the imported module emits
declarations/hooks via CTFE to be linked to by the proper generated
code. This is something I'm thinking could be a practical solution.
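
For concreteness, a minimal sketch of that hybrid idea; every name here
(module name, endpoint list, hook signature) is made up for
illustration, and the generated code would have to define matching
extern (C) functions:

// api.d - the shared declarative description, imported by both programs
module api;

// stands in for the real declarative description of the REST API
enum endpoints = ["getUser", "putUser"];

// build extern declarations at compile time - declarations only
string declareHooks(string[] names)
{
    string code;
    foreach (n; names)
        code ~= "extern (C) string " ~ n ~ "_handler(string json);\n";
    return code;
}

// the definitions live in the module produced by the external generator,
// so if the description and the generated code drift apart the program
// fails to link instead of silently misbehaving at run time
mixin(declareHooks(endpoints));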


snip


What is the benefit of this approach over simply keeping all ctRegex
bodies in a separate package, compiling it as a static library and
referring to it from the actual app by its own unique symbol? This is
something that does not need any changes in the compiler or Phobos, just
a matter of project layout.
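
For the simple case, the layout described above might look like this;
the module and file names are illustrative assumptions:

// patterns.d - compiled separately (e.g. into libpatterns.a); the
// compile-time regex engine is generated when this file is built
module patterns;
import std.regex;

auto matchIdentifier(string s)
{
    auto re = ctRegex!`[A-Za-z_]\w*`;
    return matchFirst(s, re);
}

// app.d - refers to matchIdentifier only by its symbol and links
// against the library
import patterns;

void main()
{
    assert(!matchIdentifier("foo123 = 42").empty);
}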


It does not work for more complicated cases where you actually need
access to the generated sources (generating templates, for example).


You may keep the convenience, but losing the guarantees hurts a lot. To
be able to verify static correctness of your program / group of programs
the type system needs to be aware of how the generated code relates to
the original source.


The build system does it. We have this problem with all external deps
anyway (i.e. who verifies that the right version of libXYZ is linked and
not some other?)


It is somewhat worse because you don't routinely change external 
libraries, as opposed to local sources.



A huge mess to maintain. In my experience all build systems are
incredibly fragile beasts; trusting them with something that impacts
program correctness and won't be detected at compile time is just too
dangerous.


Could be, but we have dub, which should be simple and nice.
I had a very positive experience with scons and half-generated sources.


dub is terrible at defining any complicated build models. Pretty much
anything that is not a single-step compile-them-all approach can only be
done by calling an external shell script. If using external generators
is necessary, I will take make over anything else :)
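
For the record, hooking an external generator into dub today would
presumably look something like the dub.json below (assuming the
preBuildCommands directive; the generator tool and paths are made up),
which is exactly the shell-out being criticized:

{
    "name": "myapp",
    "preBuildCommands": [
        "./tools/gen_api descriptions/api.def > source/generated/api.d"
    ]
}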



snip


tl;dr: I believe that we should improve compiler technology to achieve
the same results instead of promoting temporary hacks as the true way to
do things. Relying on the build system is likely the most practical
solution today, but it is not a solution I am satisfied with and hardly
one I can accept as the accomplished target.


An imaginary compiler that continuously runs as a daemon/service, is
capable of JIT-ing and provides basic dependency tracking as part of the
compilation step should behave as well as any external solution, with
much better correctness guarantees and overall user experience out of
the box.


D port of docopt

2014-06-15 Thread Bob Tolbert via Digitalmars-d-announce

In order to learn D, I've worked up a port of the docopt
commandline parser (original in Python http://docopt.org).

https://github.com/rwtolbert/docopt.d

Since this is my first code in D, I apologize in advance for the
mix of Python and C++ idioms. Since this is ported from Python,
with the intention of staying compatible with future Python
versions, some of that is expected, but I look at this as a
chance to learn more about D.

It is also a pretty useful way to write commandline interfaces.
The included example that mimics the git CLI is pretty impressive.
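
For anyone who hasn't seen docopt before: the usage text itself is the
parser specification. A minimal sketch, assuming the D port mirrors the
Python entry point (a docopt() call taking the usage text and argv - see
the repository README for the exact signature):

import std.stdio;
import docopt;

// trimmed version of the canonical example from docopt.org
enum doc = "Naval Fate.

Usage:
  naval_fate ship new <name>...
  naval_fate ship <name> move <x> <y> [--speed=<kn>]
  naval_fate (-h | --help)

Options:
  -h --help     Show this screen.
  --speed=<kn>  Speed in knots [default: 10].
";

void main(string[] args)
{
    // assumed API, mirroring Python's docopt(doc, argv, help, version)
    auto arguments = docopt.docopt(doc, args[1 .. $], true, "Naval Fate 1.0");
    writeln(arguments);
}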

This is also my first submission as a dub project, so hopefully I
got that right as well.

Still needs more tests ported from Python, but it does pass the
entire functional test suite for the current Python version.

Regards,
Bob


DIP63 : operator overloading for raw templates

2014-06-15 Thread Dicebot via Digitalmars-d-announce

http://forum.dlang.org/post/nwzuvslpvshqmwbed...@forum.dlang.org


Re: Soon be using D with Google App Engine via Managed VMs

2014-06-15 Thread via Digitalmars-d-announce

On Thursday, 12 June 2014 at 15:23:12 UTC, Casey wrote:
I didn't see anything in the article, but can you still use 
CloudSQL and similar from inside of one of those containers 
without using Java/Go/whatever else is supported by App Engine?


CloudSQL can be used from anywhere, but the Datastore is limited 
to App Engine:


«You still have access to core App Engine services such as 
Datastore, Task Queues, and Memcache from within Managed VMs»


https://developers.google.com/cloud/managed-vms

This is apparently not in preview yet, so it is not for 
production use and you have to apply to get access.


Re: DConf 2014 Day 1 Talk 4: Inside the Regular Expressions in D by Dmitry Olshansky

2014-06-15 Thread Dmitry Olshansky via Digitalmars-d-announce

15-Jun-2014 20:21, Dicebot wrote:

On Saturday, 14 June 2014 at 16:34:35 UTC, Dmitry Olshansky wrote:

But let's face it - it's a one-time job to get it right in your
favorite build tool. Then you have a fast and cached (re)build.
Comparatively, the costs of CTFE generation are paid in full during
_each_ build.


There is no such thing as a one-time job in programming unless you work
alone and abandon any long-term maintenance. As time goes on, any
mistake that can possibly happen will inevitably happen.


The frequency of such an event is orders of magnitude smaller. Let's not
take arguments to the extreme, as then doing anything is futile due to
the potential for mistakes it introduces sooner or later.



In your proposed scenario there will be two different generated files
imported by the server and the client respectively. A tiny typo in
writing your build script will result in a hard-to-detect run-time bug
while the code itself still happily compiles.


Or a link error, if we go a hybrid path where the imported module emits
declarations/hooks via CTFE to be linked to by the proper generated
code. This is something I'm thinking could be a practical solution.

snip


What is the benefit of this approach over simply keeping all ctRegex
bodies in a separate package, compiling it as a static library and
referring to it from the actual app by its own unique symbol? This is
something that does not need any changes in the compiler or Phobos, just
a matter of project layout.


Automation. Dumping the body of ctRegex is manual work after all,
including putting it with the right symbol. In the proposed scheme it's
just a matter of copy-pasting a pattern after the initial setup has been
done.



It does not work for more complicated cases where you actually need
access to the generated sources (generating templates, for example).


Indeed, this is a limitation, and the import of the generated source
would be required.



You may keep the convenience, but losing the guarantees hurts a lot. To
be able to verify static correctness of your program / group of programs
the type system needs to be aware of how the generated code relates to
the original source.


The build system does it. We have this problem with all external deps
anyway (i.e. who verifies that the right version of libXYZ is linked and
not some other?)


It is somewhat worse because you don't routinely change external
libraries, as opposed to local sources.



But surely we have libraries that are built as separate projects and are
external dependencies, right? There is nothing new here, except that the
D source -> object -> lib pipeline is changed to generator -> generated
D file -> object file.



A huge mess to maintain. In my experience all build systems are
incredibly fragile beasts; trusting them with something that impacts
program correctness and won't be detected at compile time is just too
dangerous.


Could be, but we have dub, which should be simple and nice.
I had a very positive experience with scons and half-generated sources.


dub is terrible at defining any complicated build models. Pretty much
anything that is not a single-step compile-them-all approach can only be
done by calling an external shell script.


I'm not going to like dub then ;)


If using external generators is
necessary, I will take make over anything else :)


Then I understand your point about inevitable mistakes; it's all in the
tool.



snip


tl;dr: I believe that we should improve compiler technology to achieve
the same results instead of promoting temporary hacks as the true way to
do things. Relying on the build system is likely the most practical
solution today, but it is not a solution I am satisfied with and hardly
one I can accept as the accomplished target.
An imaginary compiler that continuously runs as a daemon/service, is
capable of JIT-ing and provides basic dependency tracking as part of the
compilation step should behave as well as any external solution, with
much better correctness guarantees and overall user experience out of
the box.


What I want to point out is that we should not mistake the goals and the
means to an end. No matter what we call it, CTFE code generation is just
a means to an end, with serious limitations (especially as it stands
today, in the real world).


Seamless integration is not about packing everything into a single
compiler invocation:

dmd src/*.d

Generation is generation; as long as it's fast and automatic it solves
the problem(s) metaprogramming was established to solve.


For instance, if the D compiler allowed external tools as plugins (just
an example to show the means vs ends distinction) with some form of the
following construct:

mixin(call_external_tool(args, 3, 14, 15, .92));

it would make any generation totally practical *today*. This was
proposed before, and dismissed out of fear of security risks, without
ever identifying the proper set of restrictions. After all, we already
have textual mixins with the same potential security risk - no problem.
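
For comparison, the closest thing available today without plugin
support: let the build run the tool up front and pull the result in with
a string import, so the use site keeps the mixin shape (the tool name
and file names below are made up):

// The build step runs the external tool first, e.g.
//   ./tools/gen_tables params.txt > gen/tables.d
// and the compiler is invoked with the string-import path:
//   dmd -Jgen app.d
module app;

// textual mixin of the pre-generated code
mixin(import("tables.d"));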


Let's focus on the fact that this has the benefits of:
- sane debugging of the plug-in (it's just a program with the usual symbols)
- fast, as the 

Re: dmd front end now switched to Boost license

2014-06-15 Thread Ben Boeckel via Digitalmars-d-announce
On Sun, Jun 15, 2014 at 02:20:11 +0200, Leandro Lucarella via 
Digitalmars-d-announce wrote:
 I just wanted to point out that there might be more ethical licenses to
 achieve the same effect (allowing companies to build proprietary tools
 on top of DMDFE).

There's the MPL, which is source-file-based copyleft (rather than
link-time copyleft).

--Ben


Re: D port of docopt

2014-06-15 Thread Soulsbane via Digitalmars-d-announce
Thanks for this. I haven't played with it a whole lot yet, but it looks
like it will work better for me than getopt does.


Thanks again.


Re: D port of docopt

2014-06-15 Thread Bob Tolbert via Digitalmars-d-announce

On Monday, 16 June 2014 at 00:40:25 UTC, Soulsbane wrote:
Thanks for this. I haven't played with it a whole lot yet, but it
looks like it will work better for me than getopt does.


Hope it works for you. Let me know if you have questions. While
there are most likely some command line interfaces it can't handle, I
continue to be impressed with all that it does do.

I need to port the rest of the examples over from Python, but in
reality they are just a big string and a bit of code to call the
parser.

Bob


Re: DMD 2.066 Alpha

2014-06-15 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 6/15/14, 12:30 AM, Tove wrote:

On Friday, 13 June 2014 at 16:49:26 UTC, Andrei Alexandrescu wrote:


Virtual by default will not change. Being able to negate the final:
label is nice to have but not a must. Adding a keyword for that
doesn't scale - it would mean we'd need to add one keyword to undo
each label.


Andrei


Just to try and establish a clear path forwards,
if a pull request existed which added support for...
final!true
final!false
... would it be accepted?

Or would a generic negate-x-DIP be required?
const!false
noexcept!false
etc.


I think we'd need an approved DIP. -- Andrei