Re: Dgame RC #1

2015-04-03 Thread Namespace via Digitalmars-d-announce

On Friday, 3 April 2015 at 04:55:42 UTC, Mike Parker wrote:

On Thursday, 2 April 2015 at 09:38:05 UTC, Namespace wrote:



Dgame is based on SDL 2.0.3 (as described in the installation 
tutorial), but tries to wrap any function call that was 
introduced after SDL 2.0.0:


static if (SDL_VERSION_ATLEAST(2, 0, 2))

so that Dgame should be usable with any SDL2.x version.

I will investigate which function is calling SDL_HasAVX.


None of that matters. This has nothing to do with what Dgame is 
calling, but with what Derelict is actually trying to load. 
SDL_HasAVX was added to the API in 2.0.2, so it does not exist in 
previous versions of SDL; therefore an exception will be thrown 
when Derelict tries to load an older version and that function is 
missing.


Dgame will load DerelictSDL2 as usual and then check whether 
the supported version is below 2.0.2. If so, DerelictSDL2 
will be reloaded with SharedLibVersion(MAX_SUPPORTED_VERSION).



That should work, right?


No, it won't. By default, Derelict attempts to load functions 
from the 2.0.2 API (which includes 2.0.3, since the API did not 
change). That means anything below 2.0.2 will *always* fail to 
load, because those versions are missing the functions added to 
the API in 2.0.2.


The right way to do this is to use the selective loading 
mechanism to disable exceptions for certain functions. With the 
1.9.x versions of DerelictSDL2, you no longer have to implement 
that manually. As I wrote above, you can do this:


DerelictSDL2.load(SharedLibVersion(2,0,0));

With that, you can load any version of SDL2 available on the 
system, from 2.0.0 on up. It uses selective loading internally. 
For example, 2.0.0 will load even though it is missing 
SDL_HasAVX and several other functions added in 2.0.1 and 
2.0.2. But you should only do this if you are absolutely sure 
that you are not calling any functions that were not present in 
2.0.0. For example, the SDL_GetPrefPath/SDL_GetBasePath 
functions were added in 2.0.1. If you require those and need 
nothing from 2.0.2, then you should do this:


DerelictSDL2.load(SharedLibVersion(2,0,1));

Now, 2.0.0 will fail to load, but 2.0.1 and higher will 
succeed. You can look at the functions allowSDL_2_0_0 and 
allowSDL_2_0_1 in sdl.d [1] to see exactly which functions were 
added in 2.0.1 and 2.0.2 so that you can determine if you 
require any of them. I also encourage you to go and do a diff 
of the SDL headers for each release to see things other than 
functions, like new constants, that were added in each release 
(and to protect against the possibility that I've made a 
mistake somewhere). That won't affect whether or not Derelict 
loads, but a new constant added in SDL 2.0.2 won't work with a 
function that existed in 2.0.0, for example.
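
A minimal sketch of that pattern (not Dgame's or Derelict's actual 
code; the module that provides SharedLibVersion is an assumption 
here): load with the 2.0.0 baseline, then check the run-time SDL 
version before touching anything newer.

import derelict.sdl2.sdl;
import derelict.util.loader : SharedLibVersion; // assumed location

void loadSDL2()
{
    // Loads any SDL 2.x; symbols added after 2.0.0 may stay null.
    DerelictSDL2.load(SharedLibVersion(2, 0, 0));

    SDL_version linked;
    SDL_GetVersion(&linked);

    // Only call 2.0.2+ additions (e.g. SDL_HasAVX) if present.
    if (linked.major == 2 && linked.minor == 0 && linked.patch >= 2)
    {
        const hasAVX = SDL_HasAVX();
    }
}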


Yes, you're right. I'll undo my changes and set SDL 2.0.2 as the 
basis for Dgame. Thank you for the explanation. :)


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:10:33 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately caught my eye as a huge no in the 
description. We must ban C-style separate compilation; there is 
simply no way to move forward otherwise. At the very least, not 
endorse it in any way.


I understand that. But:

1. One of D's advantages is fast compilation. I don't think that 
means we should compile everything all the time just because we 
can (it's fast anyway!)
2. There are measurable differences in compile time. While 
working on reggae I got much faster edit-compile-unittest cycles 
because of separate compilation
3. This is valuable feedback. I was wondering what everybody else 
would think. It could be configurable, your "not endorse it in 
any way" notwithstanding. I for one would rather have it compile 
separately
4. CTFE and memory consumption can go through the roof 
(anecdotally anyway, it's never been a problem for me) when 
compiling everything at once.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
Also I don't see any point in yet another meta build system. 
The very point of the initial discussion was getting a D-only 
cross-platform solution that won't require installing any 
additional software beyond a working D compiler.


I was also thinking of a binary backend (producing a binary 
executable that does the build, kinda like what ctRegex does but 
for builds), and also something that just builds it on the spot.


The thing is, I want to get feedback on the API first and 
foremost, and delegating the whole 
do-I-or-do-I-not-need-to-build-it logic to programs that already 
do that (and well) first was the obvious (for me) choice.


Also, Ninja is _really_ fast.


Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce
I wanted to work on this a little more before announcing it, but 
it seems I'm going to be busy working on trying to get 
unit-threaded into std.experimental so here it is:


http://code.dlang.org/packages/reggae

If you're wondering about the name, it's because it's supposed to 
build on dub.


You might wonder at some of the design decisions. Some of them 
are solutions to weird problems caused by writing build 
descriptions in a compiled language, others I'm not too sure of. 
Should compiler flags be an array of strings or a string? I got 
tired of typing square brackets so it's a string for now.


Please let me know if the API is suitable or not, preferably by 
trying to actually use it to build your software.
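
To give a flavour of the low-level API, a hand-written build 
description looks roughly like this. This is only a sketch: the 
reggaefile.d name and how the top-level target gets registered are 
assumptions on my part, and the Target calls mirror the examples 
quoted later in this thread.

// reggaefile.d (name assumed) -- hand-written build description
import reggae;

const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out",
                        Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out",
                        Target("src/maths.d"));
const app      = Target("myapp",   "dmd -of$out $in",
                        [mainObj, mathsObj]);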


Existing dub projects might work by just doing this from a build 
directory of your choice: reggae -b make /path/to/project. That 
should generate a Makefile (or the equivalent Ninja files if `-b 
ninja` is used) to do what `dub build` usually does. It _should_ 
work for all dub projects, but it doesn't right now. For at least 
a few projects it's due to bugs in `dub describe`. For others it 
might be bugs in reggae, I don't know yet. Any dub.json file 
that uses dub configurations extensively is likely not to work.


Features:

. Make and Ninja backends (tup will be the next one)
. Automatically imports dub projects and writes the reggae build 
configuration
. Access to all objects to be built with dub (including 
dependencies) when writing custom builds (reggae does this itself)

. Out-of-tree builds, like CMake
. Arbitrary build rules but pre-built ease-of-use higher level 
targets
. Separate compilation. One file changes, only one file gets 
rebuilt

. Automatic dependency detection for D, C, and C++ source files
. Can build itself (but includes too many object files, another 
`dub describe` bug)


There are several runnable examples in the features directory, in 
the form of Cucumber tests. They include linking D code to C++.


I submitted a proposal to talk about this at DConf but I'll be 
talking about testing instead. Maybe next year? Anyway, destroy!


Atila


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately caught my eye as a huge no in the 
description. We must ban C-style separate compilation; there is 
simply no way to move forward otherwise. At the very least, not 
endorse it in any way.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce
Also I don't see any point in yet another meta build system. The 
very point of the initial discussion was getting a D-only 
cross-platform solution that won't require installing any 
additional software beyond a working D compiler.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:17:50 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
Also I don't see any point in yet another meta build system. 
The very point of the initial discussion was getting a D-only 
cross-platform solution that won't require installing any 
additional software beyond a working D compiler.


I was also thinking of a binary backend (producing a binary 
executable that does the build, kinda like what ctRegex does 
but for builds), and also something that just builds it on the 
spot.


The thing is, I want to get feedback on the API first and 
foremost, and delegating the whole 
do-I-or-do-I-not-need-to-build-it logic to programs that 
already do that (and well) first was the obvious (for me) 
choice.


Also, Ninja is _really_ fast.


The thing is, it may actually affect the API. The way I had 
originally expected it, any legal D code would be allowed for 
build commands instead of a pure DSL approach. So instead of 
providing a high-level abstraction like this:


const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out",
                        Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out",
                        Target("src/maths.d"));
const app = Target("myapp", "dmd -of$out $in", [mainObj, mathsObj]);


... you instead define dependency building blocks in the D domain:

struct App
{
    enum path = "./myapp";
    alias deps = Depends!(mainObj, mathsObj);

    static void generate()
    {
        import std.process, std.exception;
        enforce(execute(["dmd", "-ofmyapp", deps[0].path,
            deps[1].path]).status == 0);
    }
}

And then provide higher-level helper abstractions on top of that, 
tuned for D projects. This is just random syntax I have invented 
as an example, of course. It is already possible to write decent 
cross-platform scripts in D - only a dependency-tracking library 
is missing. But of course that would make using other build 
systems as backends impossible.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:59:22 UTC, Atila Neves wrote:
Well, I took your advice (and one of my acceptance tests is 
based off of your simplified real-work example) and started 
with the low-level any-command-will-do API first. I built the 
high-level ones on top of that. It doesn't seem crazy to me 
that certain builds can only be done by certain backends. The 
fact that the make backend can track C/C++/D dependencies 
wasn't a given and the implementation is quite ugly.


In any case, the Target structs aren't high-level abstractions, 
they're just data. Data that can be generated by any code. Your 
example is basically how the `dExe` rule works: run dmd at 
run-time, collect dependencies and build all the `Target` 
instances. You could have a D backend that outputs (then 
compiles and runs) your example. The only problem I can see 
is execution speed.


Maybe I didn't include enough examples.

I also need to think of your example a bit more.


I may have misunderstood how it works, judging only by the 
provided examples. Give me a bit more time to investigate the 
actual sources and I may reconsider :)


Re: Digger 1.1

2015-04-03 Thread Robert M. Münch via Digitalmars-d-announce

On 2015-03-18 12:14:01 +, Vladimir Panteleev said:

I've pushed support for DMD bootstrapping, so if you need to build 
master now, build latest Digger from source. I'll make a binary release 
after 2.067 is out.


Any news on this?

And will there be COFF32 support as well?

--
Robert M. Münch
http://www.saphirion.com
smarter | better | faster



Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via 
Digitalmars-d-announce wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
 . Separate compilation. One file changes, only one file gets 
 rebuilt


This immediately has caught my eye as huge no in the 
description. We must ban C style separate compilation, there 
is simply no way to move forward otherwise. At the very least 
not endorse it in any way.


Why? Other than the -fversion=... stuff, what is really blocking 
this? I personally find unity builds to not be worth it, but I 
don't see anything blocking separate compilation for D if 
dependencies are set up properly.

--Ben


There are 2 big problems with C-style separate compilation:

1)

It complicates whole-program optimization possibilities. 
Old-school object files are simply not good enough to preserve 
the information necessary to produce optimized builds, and we are 
not in a position to create our own metadata + linker combo to 
circumvent that. This also applies to attribute inference, which 
has become a really important development direction for handling 
the growing attribute hell.

During the last D Berlin Meetup we had an interesting 
conversation about attribute inference with Martin Nowak, and 
dropping legacy C-style separate compilation seemed to be 
recognized as unavoidable for implementing anything decent in 
that domain.


2)

Ironically, it is just very slow. Those who come from the C world 
are used to relying on separate compilation to speed up rebuilds, 
but it doesn't work that way in D. It may look better if you 
change only 1 or 2 modules, but as the number of modified modules 
grows, an incremental rebuild quickly becomes _slower_ than a 
full program build with all files processed in one go. It can 
sometimes result in an order-of-magnitude slowdown (personal 
experience).


The difference from C is that repeated imports are very cheap in 
D (you don't copy-paste module content again and again like with 
headers), but at the same time semantic analysis of an imported 
module is more expensive (because D semantics are more 
complicated). When you do separate compilation you discard the 
already processed imports and repeat that work from the very 
beginning for each newly compiled file, accumulating a huge 
slowdown for the application in total.


To get the best compilation speed in D you want to process as 
many modules with shared imports at one time as possible. At the 
same time, for really big projects that stops being feasible at 
some point, especially if CTFE is heavily used and memory 
consumption explodes. In that case the best approach is partial 
separate compilation - decoupling parts of the program into 
static libraries and compiling the separate libraries in 
parallel, but still compiling each library in one go. That gets 
you parallelization without doing the same costly work again and 
again.
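
To make that scheme concrete, here is a rough sketch of one 
compilation unit per package, written with the Target notation 
used earlier in this thread; the package layout, library names 
and flags are invented for illustration:

const utilLib = Target("libutil.a",
    "dmd -lib -of$out $in",   // whole package compiled in one go
    [Target("src/util/strings.d"), Target("src/util/files.d")]);

const netLib = Target("libnet.a",
    "dmd -lib -of$out $in",
    [Target("src/net/client.d"), Target("src/net/server.d")]);

// The two libraries can be built in parallel; the final link
// pulls them in together with the application's own package.
const app = Target("myapp", "dmd -of$out $in",
    [Target("src/main.d"), utilLib, netLib]);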


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:40:42 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:17:50 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
Also I don't see any point in yet another meta build system. 
The very point of the initial discussion was getting a D-only 
cross-platform solution that won't require installing any 
additional software beyond a working D compiler.


I was also thinking of a binary backend (producing a binary 
executable that does the build, kinda like what ctRegex does 
but for builds), and also something that just builds it on the 
spot.


The thing is, I want to get feedback on the API first and 
foremost, and delegating the whole 
do-I-or-do-I-not-need-to-build-it logic to programs that 
already do that (and well) first was the obvious (for me) 
choice.


Also, Ninja is _really_ fast.


The thing is, it may actually affect the API. The way I had 
originally expected it, any legal D code would be allowed for 
build commands instead of a pure DSL approach. So instead of 
providing a high-level abstraction like this:


const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out",
                        Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out",
                        Target("src/maths.d"));
const app = Target("myapp", "dmd -of$out $in", [mainObj, mathsObj]);


... you instead define dependency building blocks in the D domain:

struct App
{
    enum path = "./myapp";
    alias deps = Depends!(mainObj, mathsObj);

    static void generate()
    {
        import std.process, std.exception;
        enforce(execute(["dmd", "-ofmyapp", deps[0].path,
            deps[1].path]).status == 0);
    }
}

And then provide higher-level helper abstractions on top of that, 
tuned for D projects. This is just random syntax I have invented 
as an example, of course. It is already possible to write decent 
cross-platform scripts in D - only a dependency-tracking library 
is missing. But of course that would make using other build 
systems as backends impossible.


Well, I took your advice (and one of my acceptance tests is based 
off of your simplified real-work example) and started with the 
low-level any-command-will-do API first. I built the high-level 
ones on top of that. It doesn't seem crazy to me that certain 
builds can only be done by certain backends. The fact that the 
make backend can track C/C++/D dependencies wasn't a given and 
the implementation is quite ugly.


In any case, the Target structs aren't high-level abstractions, 
they're just data. Data that can be generated by any code. Your 
example is basically how the `dExe` rule works: run dmd at 
run-time, collect dependencies and build all the `Target` 
instances. You could have a D backend that outputs (then compiles 
and runs) your example. The only problem I can see is execution 
speed.


Maybe I didn't include enough examples.

I also need to think of your example a bit more.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:22:42 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 17:10:33 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately caught my eye as a huge no in the 
description. We must ban C-style separate compilation; there 
is simply no way to move forward otherwise. At the very least, 
not endorse it in any way.


I understand that. But:

1. One of D's advantages is fast compilation. I don't think 
that means we should compile everything all the time just 
because we can (it's fast anyway!)
2. There are measurable differences in compile time. While 
working on reggae I got much faster edit-compile-unittest 
cycles because of separate compilation
3. This is valuable feedback. I was wondering what everybody 
else would think. It could be configurable, your "not endorse 
it in any way" notwithstanding. I for one would rather have it 
compile separately
4. CTFE and memory consumption can go through the roof 
(anecdotally anyway, it's never been a problem for me) when 
compiling everything at once.


See 
http://forum.dlang.org/post/nhaoahnqucqkjgdwt...@forum.dlang.org


tl;dr: separate compilation support is necessary, but not at the 
single-module level.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:55:00 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via 
Digitalmars-d-announce wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
 . Separate compilation. One file changes, only one file 
 gets rebuilt


This immediately caught my eye as a huge no in the 
description. We must ban C-style separate compilation, there 
is simply no way to move forward otherwise. At the very least, 
not endorse it in any way.


Why? Other than the -fversion=... stuff, what is really blocking 
this? I personally find unity builds to not be worth it, but I 
don't see anything blocking separate compilation for D if 
dependencies are set up properly.

--Ben


There are 2 big problems with C-style separate compilation:

1)

It complicates whole-program optimization possibilities. 
Old-school object files are simply not good enough to preserve 
the information necessary to produce optimized builds, and we 
are not in a position to create our own metadata + linker combo 
to circumvent that. This also applies to attribute inference, 
which has become a really important development direction for 
handling the growing attribute hell.

During the last D Berlin Meetup we had an interesting 
conversation about attribute inference with Martin Nowak, and 
dropping legacy C-style separate compilation seemed to be 
recognized as unavoidable for implementing anything decent in 
that domain.


2)

Ironically, it is just very slow. Those who come from the C 
world are used to relying on separate compilation to speed up 
rebuilds, but it doesn't work that way in D. It may look better 
if you change only 1 or 2 modules, but as the number of modified 
modules grows, an incremental rebuild quickly becomes _slower_ 
than a full program build with all files processed in one go. 
It can sometimes result in an order-of-magnitude slowdown 
(personal experience).


The difference from C is that repeated imports are very cheap in 
D (you don't copy-paste module content again and again like with 
headers), but at the same time semantic analysis of an imported 
module is more expensive (because D semantics are more 
complicated). When you do separate compilation you discard the 
already processed imports and repeat that work from the very 
beginning for each newly compiled file, accumulating a huge 
slowdown for the application in total.


To get the best compilation speed in D you want to process as 
many modules with shared imports at one time as possible. At the 
same time, for really big projects that stops being feasible at 
some point, especially if CTFE is heavily used and memory 
consumption explodes. In that case the best approach is partial 
separate compilation - decoupling parts of the program into 
static libraries and compiling the separate libraries in 
parallel, but still compiling each library in one go. That gets 
you parallelization without doing the same costly work again 
and again.


Interesting.

It's true that it's not always faster to compile each module 
separately; I already knew that. It seems to me, however, that 
when that's actually the case, the practical difference is 
negligible. Even if it's 10x slower, the linker will take longer 
anyway, because it'll all still be under a second. That's been my 
experience anyway, i.e. it's either faster or it doesn't make 
much of a difference.


All I know is I've seen a definite improvement in my 
edit-compile-unittest cycle by compiling modules separately.


How would the decoupling happen? Is the user supposed to 
partition the binary into suitable static libraries? Or is the 
system supposed to be smart enough to figure that out?


Atila




Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Jacob Carlborg via Digitalmars-d-announce

On 2015-04-03 20:06, Atila Neves wrote:


Interesting.

It's true that it's not always faster to compile each module 
separately; I already knew that. It seems to me, however, that 
when that's actually the case, the practical difference is 
negligible. Even if it's 10x slower, the linker will take longer 
anyway, because it'll all still be under a second. That's been my 
experience anyway, i.e. it's either faster or it doesn't make 
much of a difference.


I just tried compiling one of my projects. It has a makefile that does 
separate compilation and a shell script I use for unit testing which 
compiles everything in one go. The makefile takes 5.3 seconds, not 
including linking since it builds a library. The shell script takes 1.3 
seconds, which includes compiling unit tests and linking as well.


--
/Jacob Carlborg


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread weaselcat via Digitalmars-d-announce

On Friday, 3 April 2015 at 19:07:09 UTC, Jacob Carlborg wrote:

On 2015-04-03 20:06, Atila Neves wrote:


Interesting.

It's true that it's not always faster to compile each module 
separately; I already knew that. It seems to me, however, that 
when that's actually the case, the practical difference is 
negligible. Even if it's 10x slower, the linker will take longer 
anyway, because it'll all still be under a second. That's been my 
experience anyway, i.e. it's either faster or it doesn't make 
much of a difference.


I just tried compiling one of my projects. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The makefile 
takes 5.3 seconds, not including linking since it builds a 
library. The shell script takes 1.3 seconds, which includes 
compiling unit tests and linking as well.


change one file and see which one is faster with an incremental 
build.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Jacob Carlborg via Digitalmars-d-announce

On 2015-04-03 19:03, Atila Neves wrote:

I wanted to work on this a little more before announcing it, but it
seems I'm going to be busy working on trying to get unit-threaded into
std.experimental so here it is:

http://code.dlang.org/packages/reggae


One thing I noticed immediately (unless I'm mistaken): compiling a D 
project without dependencies is too complicated. It should just be:


$ cd my_d_project
$ reggae

--
/Jacob Carlborg


Re: Loading of widgets from DML markup and DML Editor in DlangUI

2015-04-03 Thread Bruno Deligny via Digitalmars-d-announce
If you are interested, we are working on a GUI system inspired by 
QtQuick/QMLEngine:


https://github.com/D-Quick/DQuick


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 19:08:58 UTC, weaselcat wrote:
I just tried compiling one of my projects. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The makefile 
takes 5.3 seconds, not including linking since it builds a 
library. The shell script takes 1.3 seconds, which includes 
compiling unit tests and linking as well.


change one file and see which one is faster with an incremental 
build.


I don't care if an incremental build is 10x faster if the full 
build still stays at ~1 second. However, I do care (and consider 
it unacceptable) if support for incremental builds makes the full 
build 10 seconds long.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/3/15 12:07 PM, Jacob Carlborg wrote:

On 2015-04-03 20:06, Atila Neves wrote:


Interesting.

It's true that it's not always faster to compile each module 
separately; I already knew that. It seems to me, however, that when 
that's actually the case, the practical difference is negligible. Even 
if it's 10x slower, the linker will take longer anyway, because it'll 
all still be under a second. That's been my experience anyway, i.e. 
it's either faster or it doesn't make much of a difference.


I just tried compiling one of my projects. It has a makefile that does
separate compilation and a shell script I use for unit testing which
compiles everything in one go. The makefile takes 5.3 seconds, not
including linking since it builds a library. The shell script takes 1.3
seconds, which includes compiling unit tests and linking as well.


Truth be told that's 5.3 seconds for an entire build so the comparison 
is only partially relevant. -- Andrei




Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 18:06:42 UTC, Atila Neves wrote:
All I know is I've seen a definite improvement in my 
edit-compile-unittest cycle by compiling modules separately.


How would the decoupling happen? Is the user supposed to 
partition the binary into suitable static libraries? Or is the 
system supposed to be smart enough to figure that out?


Ideally both. The build system should be smart enough to group 
into static libraries automatically if the user doesn't care 
(Andrei's suggestion of one package per library makes sense), but 
the option of explicitly defining compilation units is still 
necessary, of course.


Re: Digger 1.1

2015-04-03 Thread Vladimir Panteleev via Digitalmars-d-announce

On Friday, 3 April 2015 at 16:43:38 UTC, Robert M. Münch wrote:

On 2015-03-18 12:14:01 +, Vladimir Panteleev said:

I've pushed support for DMD bootstrapping, so if you need to 
build master now, build latest Digger from source. I'll make a 
binary release after 2.067 is out.


Any news on this?


There is a preview binary release with this implemented.


And will there be COFF32 support as well?


Shouldn't be too hard to add.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread weaselcat via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:55:00 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via 
Digitalmars-d-announce wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
 . Separate compilation. One file changes, only one file 
 gets rebuilt


This immediately caught my eye as a huge no in the 
description. We must ban C-style separate compilation, there 
is simply no way to move forward otherwise. At the very least, 
not endorse it in any way.


Why? Other than the -fversion=... stuff, what is really blocking 
this? I personally find unity builds to not be worth it, but I 
don't see anything blocking separate compilation for D if 
dependencies are set up properly.

--Ben


There are 2 big problems with C-style separate compilation:

1)

It complicates whole-program optimization possibilities. 
Old-school object files are simply not good enough to preserve 
the information necessary to produce optimized builds, and we 
are not in a position to create our own metadata + linker combo 
to circumvent that. This also applies to attribute inference, 
which has become a really important development direction for 
handling the growing attribute hell.


Not sure about other people, but I do not care about 
whole-program optimization during an edit-compile-run cycle. I 
just want it to compile as fast as possible, and if I change one 
or two files I don't want to have to recompile the entire codebase.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/3/15 11:06 AM, Atila Neves wrote:


It's true that it's not always faster to compile each module 
separately; I already knew that. It seems to me, however, that when 
that's actually the case, the practical difference is negligible. Even 
if it's 10x slower, the linker will take longer anyway, because it'll 
all still be under a second. That's been my experience anyway, i.e. 
it's either faster or it doesn't make much of a difference.


Whoa. The difference is much larger (= day and night) on at least a 
couple of projects at work.



All I know is I've seen a definite improvement in my
edit-compile-unittest cycle by compiling modules separately.

How would the decoupling happen? Is the user supposed to partition the
binary into suitable static libraries? Or is the system supposed to be
smart enough to figure that out?


Smarts would be nice, but as a first approximation one package = one 
compilation unit is a great policy.



Andrei



Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/3/15 10:10 AM, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:

. Separate compilation. One file changes, only one file gets rebuilt


This immediately caught my eye as a huge no in the description. We
must ban C-style separate compilation; there is simply no way to move
forward otherwise. At the very least, not endorse it in any way.


Agreed. D build style should be one invocation per package. -- Andrei


Re: Dgame RC #1

2015-04-03 Thread Craig Dillabaugh via Digitalmars-d-announce

One small note about the tutorials. In the tutorial on
Game Loop and Event handling:

http://rswhite.de/dgame5/?page=tutorial&tut=handle_events

In the first example, I believe you are missing an import for 
Dgame.Window.Event. It shows up in the second example, so no big 
deal, but I figured I should let you know.


Are the tutorials on GitHub too?

Craig


OpenVG bindings

2015-04-03 Thread ddos via Digitalmars-d-announce

Hi folks,

Today I've created my first dlang library ^_^ a binding to the 
OpenVG library standard. The referenced implementation is ShivaVG, 
which allows drawing vector graphics within an OpenGL context 
(similar to cairo).

A small demo application is included, using derelict gl3 and glfw3

https://github.com/oggs91/OpenVG_D
http://code.dlang.org/packages/dopenvg


Re: OpenVG bindings

2015-04-03 Thread Rikki Cattermole via Digitalmars-d-announce

On 4/04/2015 11:53 a.m., ddos wrote:

Hi folks,

Today I've created my first dlang library ^_^ a binding to the OpenVG
library standard. The referenced implementation is ShivaVG, which allows
drawing vector graphics within an OpenGL context (similar to cairo).
A small demo application is included, using derelict gl3 and glfw3

https://github.com/oggs91/OpenVG_D
http://code.dlang.org/packages/dopenvg


Could you please add an example using Devisualization.Window?
That way it's one less external-to-D library as a dependency.

https://github.com/Devisualization/window