CG: a note about articulations on EventChord (issue 5696080)

2012-02-26 Thread janek . lilypond

Reviewers:

Message:
This is a bit vague and perhaps wordy, but I found it really helpful
when David was changing EventChord.
I'm not sure whether this text reflects how things work now (with
respect to David's changes) - hopefully someone knowledgeable will
confirm.

Description:
CG: a note about articulations on EventChord

Explains a bit about iterators.

Please review this at http://codereview.appspot.com/5696080/

Affected files:
  M Documentation/contributor/programming-work.itexi


Index: Documentation/contributor/programming-work.itexi
diff --git a/Documentation/contributor/programming-work.itexi b/Documentation/contributor/programming-work.itexi
index 1ff0247a731a2fd725ae08507a12fca73fa9a3cd..0897ca125211074cb7d986a1a457cc826520898f 100644

--- a/Documentation/contributor/programming-work.itexi
+++ b/Documentation/contributor/programming-work.itexi
@@ -1567,6 +1567,9 @@ Iterators are routines written in C++ that process music expressions
 and sent the music events to the appropriate engravers and/or
 performers.

+See a short example discussing iterators and their duties in
+@ref{Articulations on EventChord}.
+

 @node Engraver tutorial
 @section Engraver tutorial
@@ -2248,6 +2251,7 @@ would become zero as items are moved to other homes.
 * Spacing algorithms::
 * Info from Han-Wen email::
 * Music functions and GUILE debugging::
+* Articulations on EventChord::
 @end menu

 @node Spacing algorithms
@@ -2653,3 +2657,30 @@ The breakpoint failing may have to do with the call sequence.  See
 @file{parser.yy}, run_music_function().  The function is called directly from
 C++, without going through the GUILE evaluator, so I think that is why
 there is no debugger trap.
+
+@node Articulations on EventChord
+@subsection Articulations on EventChord
+
+From David Kastrup's email
+@uref{http://lists.gnu.org/archive/html/lilypond-devel/2012-02/msg00189.html}:
+
+LilyPond's typesetting does not act on music expressions and music
+events.  It acts exclusively on stream events.  It is the act of
+iterators to convert a music expression into a sequence of stream events
+played in time order.
+
+The EventChord iterator is pretty simple: it just takes its elements
+field when its time comes up, turns every member into a StreamEvent and
+plays that through the typesetting process.  The parser currently
+appends all postevents belonging to a chord at the end of elements,
+and thus they get played at the same point of time as the elements of
+the chord.  Due to this design, you can add per-chord articulations or
+postevents or even assemble chords with a common stem by using parallel
+music providing additional notes/events: the typesetter does not see a
+chord structure or postevents belonging to a chord, it just sees a
+number of events occurring at the same point of time in a Voice context.
+
+So all one needs to do is let the EventChord iterator play articulations
+after elements, and then adding to articulations in EventChord is
+equivalent to adding them to elements (except in cases where the order
+of events matters).



___
lilypond-devel mailing list
lilypond-devel@gnu.org
https://lists.gnu.org/mailman/listinfo/lilypond-devel


Re: articulations on EventChord

2012-02-09 Thread Janek Warchoł
2012/2/8 David Kastrup d...@gnu.org:
 Uh, you just copied and pasted everything including the speculative
 parts and those not particular to event chords.

Oops, I thought that once you let EventChord replay articulations,
everything from that e-mail would become true, so I decided to push
directly to reduce fuss :/

 So I am taking the liberty of removing that commit (while that
 is still feasible) and would ask you to create an issue instead.

Done: http://code.google.com/p/lilypond/issues/detail?id=2306

thanks for correcting me,
Janek



Re: articulations on EventChord

2012-02-08 Thread Janek Warchoł
2012/2/7 David Kastrup d...@gnu.org:

 I have just been rewriting a snippet that used to add text spanners to
 elements working purely on EventChord.  It now has to add to
 articulations on a NoteEvent, and to elements on EventChord.

 As an element of courtesy to programmers, I would like to allow people
 to add to articulations on EventChord.  That way, you can do
 extract-named-music on both EventChord and NoteEvent, and just plaster
 your stuff into 'articulations' without worrying which of the two you
 got.

Sounds good to me.

 As long as this is just user generated content, there is not much more
 to do rather than let the event chord iterator replay articulations
 after elements.

I don't understand.  Do you mean that if an articulation is applied to
an EventChord, it would be automatically applied to the EventChord's
elements (moved to the elements level)?  Or that articulations applied
to the whole EventChord will be applied on top of articulations applied
to the elements?
(If I got this completely wrong, you don't have to waste a lot of time
explaining.)

thanks,
Janek



Re: articulations on EventChord

2012-02-08 Thread David Kastrup
Janek Warchoł janek.lilyp...@gmail.com writes:

 2012/2/7 David Kastrup d...@gnu.org:

 As long as this is just user generated content, there is not much
 more to do rather than let the event chord iterator replay
 articulations after elements.

 I don't understand.  Do you mean that if an articulation is applied to
 an EventChord, it would be automatically applied to the EventChord's
 elements (moved to the elements level)?  Or that articulations applied
 to the whole EventChord will be applied on top of articulations applied
 to the elements?  (If I got this completely wrong, you don't have to
 waste a lot of time explaining.)

LilyPond's typesetting does not act on music expressions and music
events.  It acts exclusively on stream events.  It is the act of
iterators to convert a music expression into a sequence of stream events
played in time order.

The EventChord iterator is pretty simple: it just takes its elements
field when its time comes up, turns every member into a StreamEvent and
plays that through the typesetting process.  The parser currently
appends all postevents belonging to a chord at the end of elements,
and thus they get played at the same point of time as the elements of
the chord.  Due to this design, you can add per-chord articulations or
postevents or even assemble chords with a common stem by using parallel
music providing additional notes/events: the typesetter does not see a
chord structure or postevents belonging to a chord, it just sees a
number of events occurring at the same point of time in a Voice context.

So all one needs to do is let the EventChord iterator play articulations
after elements, and then adding to articulations in EventChord is
equivalent to adding them to elements (except in cases where the order
of events matters).
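
The mechanism described above can be modeled with a short sketch.
This is a hypothetical Python model, not LilyPond code (the real
iterators are C++ classes, and none of these names exist in the source
tree); it only illustrates how replaying 'articulations' after
'elements' makes the two fields equivalent at the stream-event level:

```python
# Hypothetical Python model of the EventChord iterator described above.
# Real LilyPond iterators are C++; all names here are illustrative only.

class StreamEvent:
    """An event played through the typesetting process at one moment."""
    def __init__(self, name, moment):
        self.name = name
        self.moment = moment

class EventChord:
    """A music expression holding notes (and parser-appended postevents)
    in 'elements', plus a separate 'articulations' field."""
    def __init__(self, elements, articulations=None):
        self.elements = list(elements)
        self.articulations = list(articulations or [])

def iterate_chord(chord, moment):
    # Turn every member into a StreamEvent at the same point of time,
    # playing 'articulations' after 'elements': adding to either field
    # then has the same effect (unless event order matters).
    events = [StreamEvent(e, moment) for e in chord.elements]
    events += [StreamEvent(a, moment) for a in chord.articulations]
    return events

chord = EventChord(["NoteEvent c", "NoteEvent e", "TextScriptEvent"],
                   articulations=["ArticulationEvent staccato"])
stream = iterate_chord(chord, moment=0)
# All four events share one moment; no chord structure survives.
assert all(e.moment == 0 for e in stream)
```

Since every resulting event carries the same moment, the typesetter
sees only simultaneous events in a Voice, never the chord itself.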

Feel free to add this information into CG when you find a nice place...

-- 
David Kastrup



Re: articulations on EventChord

2012-02-08 Thread Janek Warchoł
2012/2/8 David Kastrup d...@gnu.org:
 LilyPond's typesetting does not act on music expressions and music
 events.  It acts exclusively on stream events.  It is the act of
 iterators to convert a music expression into a sequence of stream events
 played in time order.

 The EventChord iterator is pretty simple: it just takes its elements
 field when its time comes up, turns every member into a StreamEvent and
 plays that through the typesetting process.  The parser currently
 appends all postevents belonging to a chord at the end of elements,
 and thus they get played at the same point of time as the elements of
 the chord.  Due to this design, you can add per-chord articulations or
 postevents or even assemble chords with a common stem by using parallel
 music providing additional notes/events: the typesetter does not see a
 chord structure or postevents belonging to a chord, it just sees a
 number of events occurring at the same point of time in a Voice context.

 So all one needs to do is let the EventChord iterator play articulations
 after elements, and then adding to articulations in EventChord is
 equivalent to adding them to elements (except in cases where the order
 of events matters).

Ah, so when a music function encounters an EventChord, it won't have
to find its elements?  It will simply apply articulations to whatever
it received, be it a NoteEvent or an EventChord, and not worry about
the details?  Seems perfect!

 Feel free to add this information into CG when you find a nice place...

Pushed as 1ba879b62380c43246fa74e7e2b80b6fa0fde754
Many thanks!
Janek



Re: articulations on EventChord

2012-02-08 Thread David Kastrup
Janek Warchoł janek.lilyp...@gmail.com writes:

 2012/2/8 David Kastrup d...@gnu.org:
 LilyPond's typesetting does not act on music expressions and music
 events.  It acts exclusively on stream events.  It is the act of
 iterators to convert a music expression into a sequence of stream events
 played in time order.

 The EventChord iterator is pretty simple: it just takes its elements
 field when its time comes up, turns every member into a StreamEvent and
 plays that through the typesetting process.  The parser currently
 appends all postevents belonging to a chord at the end of elements,
 and thus they get played at the same point of time as the elements of
 the chord.  Due to this design, you can add per-chord articulations or
 postevents or even assemble chords with a common stem by using parallel
 music providing additional notes/events: the typesetter does not see a
 chord structure or postevents belonging to a chord, it just sees a
 number of events occurring at the same point of time in a Voice context.

 So all one needs to do is let the EventChord iterator play articulations
 after elements, and then adding to articulations in EventChord is
 equivalent to adding them to elements (except in cases where the order
 of events matters).

 Ah, so when a music function encounters an EventChord, it won't have
 to find its elements?  It will simply apply articulations to whatever
 it received, be it a NoteEvent or an EventChord, and not worry about
 the details?  Seems perfect!

 Feel free to add this information into CG when you find a nice place...

 Pushed as 1ba879b62380c43246fa74e7e2b80b6fa0fde754
 Many thanks!

Uh, you just copied and pasted everything, including the speculative
parts and those not particular to event chords.  This is too
unstructured as content and not likely to be all that helpful.  I will
follow up with the code turning this into more of a reality eventually,
but since I don't really have the time to review the text properly at
this point in time, I would prefer it to be pasted into an issue
report rather than the CG, to be dealt with in a proper review.  So I
am taking the liberty of removing that commit (while that is still
feasible) and would ask you to create an issue instead.

Thanks

-- 
David Kastrup




articulations on EventChord

2012-02-07 Thread David Kastrup

I have just been rewriting a snippet that used to add text spanners to
elements working purely on EventChord.  It now has to add to
articulations on a NoteEvent, and to elements on EventChord.

As an element of courtesy to programmers, I would like to allow people
to add to articulations on EventChord.  That way, you can do
extract-named-music on both EventChord and NoteEvent, and just plaster
your stuff into 'articulations' without worrying which of the two you
got.

As long as this is just user generated content, there is not much more
to do rather than let the event chord iterator replay articulations
after elements.

It might make sense at some point of time to actually let the parser
generate this arrangement as well.  However, that would be a change of
format that should likely be left for another time.

What's your take on this?  Just replaying articulations is not likely to
cause any incompatibilities (though of course snippets making use of
that trick generate incompatible music streams in the process).
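
The convenience being proposed can be sketched as follows, again as a
hypothetical Python model (the actual code would be Scheme/C++, and
'add_articulation' and the Music class are invented names): a helper
appends to 'articulations' without caring whether it was handed a
NoteEvent or an EventChord, which is exactly what becomes safe once the
chord iterator replays articulations after elements.

```python
# Sketch of the proposed courtesy: append to 'articulations' uniformly.
# Hypothetical model only; the real music objects are Scheme/C++ and
# 'add_articulation' is an invented name.

class Music:
    def __init__(self, name, elements=None):
        self.name = name
        self.elements = list(elements or [])  # chord members, if any
        self.articulations = []               # attached postevents

def add_articulation(music, art):
    # No need to inspect 'elements': once the chord iterator replays
    # 'articulations' after 'elements', this works identically for a
    # bare NoteEvent and for an EventChord.
    music.articulations.append(art)
    return music

note = add_articulation(Music("NoteEvent"), "staccato")
chord = add_articulation(
    Music("EventChord", [Music("NoteEvent"), Music("NoteEvent")]),
    "staccato")
assert note.articulations == chord.articulations == ["staccato"]
```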

-- 
David Kastrup

