Re: Isn't it about time for D3?

2017-06-10 Thread James Hofmann via Digitalmars-d

On Saturday, 10 June 2017 at 23:30:18 UTC, Liam McGillivray wrote:
I realize that there are people who want to continue using D as 
it is, but those people may continue to use D2. Putting the 
breaking changes in a separate branch ensures that DM won't 
lose current clients as they can just continue using D2 as they 
are. Even after D3 is stabilized, D2 could continue to be 
supported until nobody wants it.


Here is my suggestion for how a migration to D3 could be done 
well, if it were to happen:


First, define a "Small D2". The goal of this language is not 
really to be much smaller (it could keep most of the language, 
in fact), but to sweep all of the changes that break with D2 
into a form that's clearly less flexible while still being 
palatable for the majority of modules. It's allowed to be 
intentionally clumsy and regressive in some ways so that it can 
be forward-looking later, but not so much that people don't 
want to write in it.


Small D2 is given the freedom to do things like regress standard 
library functions back into hardcoded compiler behaviors, to be 
more restrictive, to allow less ambiguity, and to generally 
follow existing idioms while breaking the things that needed 
breaking and leaving out the parts that didn't work. The new 
features of Small D2 would be "one-trick-pony" features that 
bottleneck what can be expressed: a "library" that is actually a 
compiler hack, or "new syntax" that is actually fake and only 
allows expressions that are already possible in other ways.
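

As a concrete illustration of the "fake syntax" idea, here is a 
minimal sketch; the "let" template is hypothetical, invented for 
this example:

    // A "fake" Small D2 feature: a library `let` that looks like
    // new binding syntax but merely expands, via a string mixin,
    // to a plain D2 declaration.
    enum let(string name, string expr) =
        "auto " ~ name ~ " = " ~ expr ~ ";";

    void main()
    {
        import std.stdio : writeln;
        mixin(let!("x", "6 * 7")); // expands to: auto x = 6 * 7;
        writeln(x);                // prints 42
    }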


Most importantly, Small D2 retains compatibility with regular 
("big") D2 modules, so codebases can be migrated into Small D2 
piecemeal, while leaving a small remainder to be migrated in a 
larger way when D3 rolls out. In essence, Small D2 allows the 
underlying formulation of D to be challenged, but it actually 
relies on regular D2 constructs internally, so iteration on 
design can go faster than if we "did it for real". And because 
Small D2 gives people a taste of new stuff, it will have some 
instant appeal: People should want to write new code in it, even 
knowing that it's a hack and less powerful.


Small D2 would be the debate platform for deciding what the 
"core" of D is. Then D3 is Small D2 with everything "done for 
real" this time. Small D2 code migrates forward seamlessly and 
continues operating as if nothing happened; everything outside 
it is considered breaking.


Re: Mockup of my doc dream ideas

2015-12-24 Thread James Hofmann via Digitalmars-d

On Friday, 25 December 2015 at 05:06:47 UTC, Adam D. Ruppe wrote:
Next, observe the highlighted words... and go ahead and try to 
click on them. Oh snap! They are links to the language features 
used. This is the web, let's link all over the place. I want to 
have relevant conceptual overviews available one click away on 
every function.


Before I get to remarks about this demo, I'd like to connect the 
D doc situation with the current state of technical writing as I 
understand it (it's a field I'm moving towards professionally). 
These are the trends I've identified in my survey of the 
landscape:


1. There's more code than ever, and more of a need for code to 
be well-documented at a conceptual, hand-holding level as well 
as at the minimal "turn comments into docs" level. "User 
manuals" are out - everyone wants easy UX - but "API docs" are 
in.
2. A shift away from proprietary tools like Word or Adobe 
FrameMaker towards a heterogeneous, open-format ecosystem - 
e.g. conversion tools like pandoc are important now.
3. "Topic-based authoring". This is mainly led by the DITA 
format, which imposes a particular *model* for how the 
documentation is laid out and how pieces of information may be 
related, vs. the more free-form linkage of a wiki or the linear 
text of traditional books. You typically see these topic-based 
systems in big enterprisey application software, but the 
general idea of fitting the text into a certain container is 
similar to the API-doc tradition of following the 
class-and-method structure of the code.
4. A default approach that uses lighter markup like Markdown or 
wiki text in order to minimize the barriers to entry. "Heavy" 
formats like DocBook add friction to the prose-writing process, 
even though they're necessary once you pile on enough features. 
Authors may write in one and convert to the other.


The ideal documentation system for D addresses each of these: 
dedicated authors who are primarily adding prose text, a variety 
of output formats, models for defining code/data/text 
relationships, and low-friction inputs.


The mockup is good in that it cleans up the messy look and feel 
of the current presentation, and suggests the potential for 
interactivity. My main concern from a presentation standpoint is 
high latency, which in these interactive help systems typically 
comes in the form of slow page loads every time I try to click on 
something. My ideal system would stay one step ahead of me or 
allow me to stay on the same page as often as possible. Options 
for offline docs also usually fix the problem.


There is also a pretty good conceptual base in the existing ddoc 
system with respect to its goal of extending documentation 
comments. Its defaults are biased towards HTML presentation, 
which is OK, albeit not as trendy as Markdown.


What ddoc misses, though, is a strong capability to invert the 
semantics towards prose-first. For example, if I wanted to write 
a very extensive tutorial that discusses a large codebase, or 
how the codebase changes over time, I'd write prose first but 
reference an annotated version of the codebase so that I could 
call out specific parts of it. This could be realized with a 
crude, hand-rolled system that extracts the code and inlines it, 
but that would most likely give up the semantic possibilities; 
I'd end up with only some syntax highlighting, unable to let the 
reader click on the source and see the file in context, link 
function-name callouts to their API documentation, etc.
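

To show what the crude version looks like, here is a minimal 
sketch of such a hand-rolled extractor; the marker-comment 
convention and the file path are hypothetical, invented for this 
example:

    // Pull a named snippet out of a D source file so a tutorial
    // can inline it. All semantics beyond the raw text are lost.
    import std.algorithm.searching : countUntil;
    import std.array : join;
    import std.file : readText;
    import std.stdio : writeln;
    import std.string : splitLines;

    string extractSnippet(string source, string name)
    {
        auto lines = source.splitLines;
        auto begin = lines.countUntil("// snippet:" ~ name);
        auto end   = lines.countUntil("// endsnippet:" ~ name);
        if (begin < 0 || end <= begin)
            return "";
        return lines[begin + 1 .. end].join("\n");
    }

    void main()
    {
        auto code = readText("examples/parser.d"); // hypothetical
        writeln(extractSnippet(code, "tokenize"));
    }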


One way to start seriously addressing all of these issues is to 
export an annotated, pre-highlighted D source file with 
metadata, which could then be slurped up and processed again 
into an interactive API doc, a prose text, presentation slides, 
etc. This work can be seen as a two-parter: parsing the code and 
emitting useful, easily processed data, and then making the 
default presentation of that data good, so that minimal effort 
is still rewarded. It's conceptually similar to the problem 
space of IDE integration, where different IDEs may want to query 
the source in different ways.
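

To make the exported data concrete, here is a minimal sketch of 
what one record might look like, built with std.json; the field 
names and values are hypothetical, not an existing format:

    import std.json : JSONValue;
    import std.stdio : writeln;

    void main()
    {
        // One annotation record for a single symbol. A real
        // exporter would emit one per declaration, plus span and
        // highlight data for the source text itself.
        auto ann = JSONValue([
            "name": JSONValue("std.algorithm.iteration.map"),
            "kind": JSONValue("template"),
            "file": JSONValue("std/algorithm/iteration.d"),
            "line": JSONValue(499), // hypothetical line number
            "doc":  JSONValue("Lazily applies fun to each element."),
        ]);
        writeln(ann.toPrettyString);
    }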


Semantic documentation source is also one of the rare situations 
where the XML ecosystem is at its best - XSLT basically exists to 
declaratively transform semantic markup into formatted 
presentation, as is done, for example, with DocBook. It wouldn't 
solve any *particular* problem to target XML, but it would expand 
people's ability to dream up and implement presentation-side 
solutions.


Re: I hate new DUB config format

2015-11-30 Thread James Hofmann via Digitalmars-d

Although I admit to coming in late to a big bikeshed-fest, I have 
some opinions on configuration file formats from having seen 
younger, non-technical end users try to configure their own game 
servers. The support cost of misconfiguration due to syntax error 
is enormous. Gob-stoppingly huge. It is day after day of


Q: hey it's broke fix it
A: you forgot to add a double quote in your config file

And so when the file format is pressed into the role of primary 
UI, touched directly by hundreds or thousands of people who want 
to write more than a few trivial lines, relying only on JSON - 
which biases towards parser and programmer friendliness, not 
towards forgiving syntax - is not the right trade-off for total 
human effort and the stress levels of project maintainers. Those 
files are source code and need the additional care and forgiving 
structure of a source-code language. If you want 
externally-generated configurations, then JSON is the right 
move, but it is not a complete design - it's passing the buck to 
users.
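

As a tiny illustration of how unforgiving JSON is in this role, 
here is a minimal D sketch, assuming nothing beyond std.json:

    import std.json : JSONException, parseJSON;
    import std.stdio : writeln;

    void main()
    {
        // One missing quote after "myapp" - exactly the class of
        // mistake that generates the support traffic above.
        auto broken = `{ "name": "myapp, "dependencies": {} }`;
        try
        {
            auto cfg = parseJSON(broken);
            writeln("parsed: ", cfg);
        }
        catch (JSONException e)
        {
            // The terse parser error is opaque to a non-programmer.
            writeln("config rejected: ", e.msg);
        }
    }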


For similar reasons there are a lot of interfaces and formats for 
writing documents and nobody is entirely happy with any of them. 
It's easy to write simple things in Markdown variants, but 
complex material pushes against the feature boundaries. You can 
do pretty much everything with DocBook or TeX, but they're a 
chore. Word processing tools can smooth out the problem of 
discovering features, but again restrict your control and emit 
weird markup in the process. The happy medium workflow tends to 
involve writing in a "light" format and then transferring content 
to the "heavy" one to merge it and add finishing touches.


A year or two ago I spent a lot of time thinking about 
source-vs-serialization trade-offs. When you get bi-directional 
serialization and ensure that pretty-printed output looks like 
the original input, you lose some freedom on the user's end: 
they are no longer allowed to make mistakes or change up their 
style partway through. Sometimes you want that and sometimes you 
don't, and it really does depend on the context you're putting 
that format into.


If you look at Lisp-family languages, for example, they take an 
extreme posture on bi-directional behavior in that "code is data 
and data is code", but that also means their ecosystem is 
defined around having s-expression-friendly editors and a coding 
style that biases a bit more towards powerful language-extension 
tools. That's another valid way to go with configuration 
formats, of course. It would make sense to jump all the way over 
to a script interpreter if, as was mentioned earlier in this 
thread, SDL were to start being extended to do 
"programming-like" things.


FWIW, I'm tempted to take the side of "make JS the default, 
compile existing SDL and JSON to JS when run, add compilers for 
TOML or YAML if there's demand". If you make code your lowest 
common denominator, nothing else matters, and JS is the de-facto 
lowest common denominator of code today. Someone presented with 
a config whose syntax they don't know can tell Dub to port it to 
JS and edit that instead, so over time all configs end up being 
a blob of JS code, in the same way that the "light"/"heavy" 
markup situation is resolved by gradually converting everything 
into the heavy format even if it didn't start there. That is OK. 
Dub might run a bit slower, and it raises some security issues, 
but the world is unlikely to blow up because someone wrote 
"clever" JS in their Dub config.


Also, people will see the option of coding JS and go, "Now I can 
write a build system on top of Dub, and it can use my own config 
format, way better than SDL or YAML or TOML! Everyone's gonna 
love this!" The D and Dub maintainers smile innocently and say 
nothing...


Re: Our template emission strategy is broken

2015-11-11 Thread James Hofmann via Digitalmars-d

On Wednesday, 11 November 2015 at 17:19:31 UTC, David Nadlinger 
wrote:
Of course, many of the problems could have probably been 
avoided if there was an iron-clad rule that the module 
dependency graph must remain acyclic from the beginning of 
development (at least at the level of units of compilation). 
But how could they have known how bad it would get otherwise? I 
don't think this is reflected in our documentation anywhere, at 
least not in a prominent place.


 — David


There is some literature on whether this kind of rule, enforced 
at compile time, benefits software architecture in general, 
based on comparing F# projects with similar C# projects. 
(Answer: probably yes - there's a relationship between cyclic 
dependencies and other accidental coupling, and a case can be 
made for it both theoretically and statistically.)


http://evelinag.com/blog/2014/06-09-comparing-dependency-networks/#.VkP6mL_eNYU

Looking at what D does now, at least according to "The D 
Programming Language" (2010), it tries to allow any ordering but 
throws an exception at runtime when ambiguous cases are 
detected. So there's already some precedent for avoiding cyclic 
dependencies simply to avoid those errors. The coupling argument 
and the compile-times argument just add more urgency to it.
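

The ambiguous case shows up, for instance, with module 
constructors; a minimal sketch (file names hypothetical):

    // a.d - imports b and has state to initialize at startup
    module a;
    import b;
    static this() { /* set up a's module state */ }

    // b.d - imports a and also has startup initialization
    module b;
    import a;
    static this() { /* set up b's module state */ }

    // druntime cannot pick a safe order in which to run the two
    // constructors, so the program aborts at startup with a
    // cyclic-dependency error before main() is ever reached.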


Nim enforced the same dependency rule as F# the last time I 
looked, but I think that position was softening towards 
"optional" due to some community pressure. I don't think 
anything makes this architectural style actually impractical, 
and it might help to have the compiler nudge people towards it 
with warnings - although the same kind of community pressure 
would arise if it did. After all, nobody likes to be told that 
they are writing Bad Code :)


Re: Make all new symbols go through std.experimental?

2015-10-28 Thread James Hofmann via Digitalmars-d

On Wednesday, 28 October 2015 at 00:28:51 UTC, Andrei 
Alexandrescu wrote:

What say you?

Andrei


I think most people like the idea of having new things 
categorized, but encounter difficulty agreeing on what categories 
to use. Everyone will make broad assumptions that fit their 
comfort zone. Word choices like "experimental" or "testing" could 
influence people's impressions of stability, and so could the 
method of friction that is added to prevent accidental use. The 
point is to be a little bit scary - but are we scaring off the 
right people when we use these words and techniques?


I don't have much D history, but I would approach the issue from 
the ground up and reconsider the options for introducing new 
things. I thought of these ways that friction could be added 
(considering both library and language features):


1. Rebuild the compiler with additional flags.
2. Make it available only in one or more separate library 
versions (e.g. a dub package that replaces the "stable" std).
3. Run the compiler with additional flags.
4. Import a named package like std.experimental.

There is one key difference between the first three and the 
fourth: if you have modified the compiler, your build flags, or 
your library environment, you are more likely to be working on a 
personal machine with personal projects, and can afford to 
tinker with a rapidly changing situation. With 4, people will be 
able to use it injudiciously in production by copy-pasting 
source code from someone's example - no warnings will appear as 
a result.
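

A sketch of why method 4 carries no friction, using 
std.experimental.allocator as the example (assuming a compiler 
release that ships it):

    // Nothing in the build distinguishes this from a stable
    // import, so copy-pasted example code compiles silently -
    // no flag, no warning, no environment change.
    import std.experimental.allocator : dispose, make, theAllocator;
    import std.stdio : writeln;

    void main()
    {
        auto p = theAllocator.make!int(42);
        scope(exit) theAllocator.dispose(p);
        writeln(*p);
    }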


Knowing that, we have to ask some questions about workflows to 
settle what the role of std.experimental is. As noted in 
previous replies, friction appears _after_ a symbol is moved out 
of experimental, when what we want is for graduation out of 
experimental to appear seamless. On the other hand, it's very 
troublesome to have working code break after toggling a flag to 
turn on the experimental things; flags should feel "precise" and 
not disturb existing code. Environmental changes add friction 
around initial use: when code doesn't "just work" because it 
requires an environment change, its user is likely to reconsider 
it.


Perhaps that means a rule like: if it's entirely new, use a 
build flag or expose it in an "unstable" library version; if it 
modifies or refines an old, stable symbol, use std.experimental, 
and deprecate the old API as confidence grows. That encourages 
people to reach for experimental often, but to start 
manipulating the rest of the environment only if they want to be 
truly adventurous. (This doesn't exactly solve the "post-hoc 
friction" problem, but it scores well on accident prevention.)


There's probably much more to consider, but I will leave my 
thoughts there.