[NTG-context] cow-font

2020-12-22 Thread Peter Münster
Hi,

How could I make use of the cow-font please?

I've tried this test-file:

--8<---cut here---start->8---
\setupbodyfont[cow]
\starttext
test
\stoptext
--8<---cut here---end--->8---

But the page is empty.

With version 2017.08.29 it worked well...
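
(One quick check, as a sketch: verify that the font files are still found by
the font database at all. Assuming the current mtxrun font tools, and guessing
that the cow font is still named "koeieletters", something like this should
list it:

mtxrun --script fonts --reload
mtxrun --script fonts --list --all --pattern="*koeie*"

If nothing shows up, the font simply isn't found, which would be one possible
explanation for the empty page.)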

TIA for any help,
-- 
   Peter
___
If your question is of interest to others as well, please add an entry to the 
Wiki!

maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://context.aanhet.net
archive  : https://bitbucket.org/phg/context-mirror/commits/
wiki : http://contextgarden.net
___


[NTG-context] issue with \cite in ConTeXt ver: 2020.12.22 22:14 LMTX fmt: 2020.12.22

2020-12-22 Thread Alan Bowen
\cite no longer works as expected

MWE:

\startsetups[tightspace]
  \spaceskip 0.5\interwordspace plus .5\interwordstretch minus \interwordshrink
\stopsetups

\def\dostartbibitem[#1][#2]%
{\doifsomethingelse{#2}%
 {\startBibItem[reference={#1},title={#2}]}%
 {\startBibItem[reference={#1},title={#1}]}%
}
\def\startbibitem{\dodoubleempty\dostartbibitem}
\def\stopbibitem{\stopBibItem}

\definedescription[BibItem][
width=broad,
margin=1.5pc,
indenting={no},
indentnext=no,
alternative=hanging,
hang=1,
headcommand=\gobbleoneargument,
align=right,
before={\directsetup{tightspace}\bgroup\language[packed]},
after={\egroup},
]

\definereferenceformat [cite] [type=title,left={},right={}]
\definereferenceformat [bibpage] [type=page]

\starttext
Now \cite[test 2020] on page
\page
\startbibitem[test 2020]
\input ward
\stopbibitem
\stoptext

Instead of “Now test 2020 on page”, I get “Now 248>BibItem:referencetext on
page” and no sign of an internal link.
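
(A quick way to narrow this down, as a diagnostic sketch rather than a fix:
check whether the reference itself still resolves, independently of the
type=title formatting, with the standard referencing commands:

Now \at[test 2020] on page
Now \in[test 2020] on page

If those still produce working links, the regression is probably in
\definereferenceformat with type=title rather than in the reference itself.)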

Alan


[NTG-context] new font trickery

2020-12-22 Thread Hans Hagen

Hi,


I finally decided to start an experiment that I had on my todo list for 
a while: dynamically scaling fonts. Below is an example of usage (plus 
some comment). In that example some 2*200 different font sizes are used 
which in mkiv demands 400 font definitions. This costs time and memory. 
The example below runs (on my old laptop) in less than 2 seconds and 
only uses two instances. I bet that our CJK users will love it.


I post it because:

(1) I need to discuss the impact with Wolfgang ... how do we integrate this? 
We can for instance define some sizes (\tfa \tfb ...) differently.


(2) I want to see if average performance improves on huge documents with 
many fonts / sizes (that is for Massimiliano to test).


(3) Are there side effects? It does work for math (rather neat trickery) 
but maybe we need an additional configuration for that.


(4) I considered several variants but for now use a low level \glyphscale 
command that takes a number (in good old TeX tradition a scale of 1.0 is 
entered as 1000); a minimal sketch follows after point (7).


(5) The implementation can be improved a bit (performance-wise). There 
is a bit more overhead involved but usually I can compensate for that.


(6) It will not be backported to MKIV so one can only test in LMTX. 
It's a bit of a mix between engine and context features.


(7) There can be bugs (unforeseen side effects, or typos in the somewhat 
quick patches in the source).
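
A minimal sketch of just the \glyphscale convention from point (4), using
nothing beyond what the full example below already shows:

\starttext
normal bodyfont size
\start
    \glyphscale 1500\relax
    \setupinterlinespace
    one and a half times as large, still the same font instance
\stop
\stoptext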


Consider it an experiment, but so far I'm rather satisfied,

Hans



 EXAMPLE 

\setuplayout[middle]

\setupbodyfont[pagella,10pt]

\setupalign[verytolerant,stretch]

\setupwhitespace[big]

\starttext

\startbuffer
\definescaledfont[bfe][scale=2000,style=bf]

\setuphead[chapter][style=\bfe]

\dostepwiserecurse {10} {2020} {10} {
\title{Here we go #1!}
\start
\glyphscale#1\relax
\setupinterlinespace
\samplefile{ward}%
\bf
\samplefile{ward}
\par
\stop
\page
}
\stopbuffer

\getbuffer

\title{Scaled fonts}

Although \CONTEXT\ is quite efficient with fonts there is always room for
improvement. However, after years of finetuning the font mechanisms there was
not that much room left. This made me think of a different approach to scaling.
Nowadays fonts seldom come in design sizes. Also, in \CONTEXT\ \MKIV\ and
therefore \LMTX\ we always had so called dynamic features: apply additional
features locally. Although that comes with a small performance penalty, it
saves additional font instances. It is a good approach for the occasional
small stretch of glyphs, like small capped logos and such.

We can now also do dynamic font scaling, which means that we don't need to
define a new font instance when the same feature set is used. Of course, in
addition to this scaling one can still use the dynamic features. This means
that for instance chapter titling can use the bodyfont instance and just apply
additional scaling. Although for a normal run the number of loaded fonts is
normally small, and the number of instances also isn't that impressive, in a
large document you can end up with a few dozen. That number can now be reduced
to half a dozen.

Of course there can be side effects, which is why it's currently tagged as
experimental. There is also a small performance hit because we now need to
track the scaling, but that is gained back because we load fewer fonts and
have fewer glyph runs.

It even works in math, although some different trickery is needed there.

\typebuffer

\stoptext







-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Hyphenation pattern for Esperanto

2020-12-22 Thread Alain Delmotte

  
  
Hi Hans and Mojca,

I attach a file containing the list of patterns for Esperanto and the
definition/translation of the general words for titles, glossaries, ...

In fact there is a file for Esperanto in:

C:\Users\Alain\context-lmlx\tex\texmf-context\tex\context\patterns\mkiv\lang-eo.lua

So it has been in mkiv.

Now regarding the labels, how could I provide the translations? Should I add a
line with a translation for each term (or at least the most important ones; I
have seen that the French labels are not all translated)?

Thanks, regards,
Alain
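
(For reference: the labels live in lang-txt.lua, where each entry maps language
codes to translated strings. The sketch below shows roughly what adding an
Esperanto line per term might look like; the key names, the nesting and the
Esperanto word itself are assumptions for illustration, not copied from the
actual file:

chapter = {
    labels = {
        en = "Chapter",
        eo = "Ĉapitro", -- hypothetical Esperanto translation
    },
},

So presumably one added eo entry per term; partial coverage seems acceptable,
as the French labels show.)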
  
On 14/10/2020 at 18:40, Hans Hagen wrote:

On 10/14/2020 5:36 PM, Alain Delmotte wrote:

Dear Mojca,

I just responded to the message without noticing that I was only answering to
you; I'll take care in the future.

I'll check if the two lists are the same, after expanding the condensed one.
And then I'll go to Hans.
  
  
As Mojca pointed out, we need to set up a language:

% Artificial Languages: Esperanto

\installlanguage
  [\s!esperanto]
  [%\c!spacing=\v!packed,
   %\c!leftsentence=\emdash,
   %\c!rightsentence=\emdash,
   %\c!leftsubsentence=\emdash,
   %\c!rightsubsentence=\emdash,
   %\c!leftquote=\lowerleftdoubleninequote,
   %\c!rightquote=\upperrightdoublesixquote,
   %\c!leftquotation=\lowerleftdoubleninequote,
   %\c!rightquotation=\upperrightdoublesixquote,
   %\c!date={\v!year,~m.,\space,\v!month,\space,\v!day,~d.},
   \s!patterns=eo,
   \s!lefthyphenmin=2,
   \s!righthyphenmin=2]

apart from labels as in lang-txt.lua, because otherwise it's not that useful
to have it, I guess.

Hans
  
Alain

On 14/10/2020 at 13:16, Mojca Miklavec wrote:

Dear Alain,

Yes, I'm aware of the differences, but ConTeXt is not ready to read patterns
in the cryptic format. (If the two sets of patterns are not equivalent,
there's something that needs fixing.)

Please be proactive on the list (chatting offline with me won't help that
much) and just try to convince Hans to add those patterns, or ask him what you
need to do in order to end up with patterns supported in ConTeXt out of the
box.

Mojca

On Mon, 12 Oct 2020 at 21:50, Alain Delmotte wrote:

Hi Mojca,

The patterns in TeXLive, as I described (from hyph-eo.tex), make the list a
little bit more compact: instead of 1a2do, 1a2doj, 1a2dojn, 1a2don one writes
\nom{1a2d}; otherwise it looks the same (I didn't check the details).

So I'll wait for the demand of Hans!!

Alain

On 12/10/2020 at 16:38, Mojca Miklavec wrote:

Dear Alain,

On Mon, 12 Oct 2020 at 13:44, Alain Delmotte wrote:

I'd like to have hyphenation for Esperanto; it doesn't exist in ConTeXt but
exists in TeX/LaTeX. Would it be difficult to create the Esperanto file for
ConTeXt?

The plain text version is already in the correct form:

https://github.com/hyphenation/tex-hyphen/blob/master/hyph-utf8/tex/generic/hyph-utf8/patterns/txt/hyph-eo.pat.txt

so it should be just a matter of Hans including the patterns in the
distribution.

Usually he asks for translations of basic strings (like "Chapter" etc.) when
adding support for a new language ;)

Mojca

Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Hans Hagen

On 12/22/2020 7:32 PM, Neven Sajko wrote:

On Tue, 22 Dec 2020 at 18:15, Hans Hagen  wrote:


On 12/22/2020 6:57 PM, Neven Sajko wrote:

Oops, I forgot to attach the scriptlet before.



[...] I guess it could also be more performant, because Lua would
conceivably spend less time managing huge tables.


Now that I think about this some more, it doesn't actually make sense.
However I'm still interested in whether it is really necessary to have
that many globals exposed.

most of what you see in that generated file is either unicode data or
font resources ... all needed (and geared for performance)


OK, now that I think just about the real global variables (instead of
the "recursive" globals): would it make sense to transfer all the
non-Lua-default globals into two tables, one for Lua(Meta)Tex, and
another table for ConTeXt, so those would be the only two additional
global variables?


well, and then if you load some library, that would add some global again 
... so it's a chicken-and-egg problem.



I'm not proposing you do it, since it seems like it could be a lot of
work, I'm just wondering what you think about that, because it seems
like things would be much tidier like that (less chance of
accidentally accessing a global in Lua code, etc.).

You can create your own instance:

\definenamedlua[mylua]

\startmyluacode
    global.context("USER 1")
    context.par()
    context("USER 2")
    context.par()
    if characters then
        context("ACCESS directly")
    elseif global.characters then
        context("ACCESS via global")
    else
        context("NO ACCESS at all")
    end
    context.par()
    if bogus then
        context("ACCESS directly")
    elseif global.bogus then
        context("ACCESS via global")
    else
        context("NO ACCESS at all")
    end
    context.par()
\stopmyluacode


I admit that I never run into conflicts. OK, I never load libraries, mostly 
because we have plenty of helpers on board. Changing the approach now would 
invalidate a lot of user code (and also mean a lot of work, plus probably make 
us run into the > 200 locals issue due to aliasing).


Hans

-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Neven Sajko
On Tue, 22 Dec 2020 at 18:15, Hans Hagen  wrote:
>
> On 12/22/2020 6:57 PM, Neven Sajko wrote:
> > Oops, I forgot to attach the scriptlet before.
> >
> >
> >> [...] I guess it could also be more performant, because Lua would
> >> conceivably spend less time managing huge tables.
> >
> > Now that I think about this some more, it doesn't actually make sense.
> > However I'm still interested in whether it is really necessary to have
> > that many globals exposed.
> most of what you see in that generated file is either unicode data or
> font resources ... all needed (and geared for performance)

OK, now that I think just about the real global variables (instead of
the "recursive" globals): would it make sense to transfer all the
non-Lua-default globals into two tables, one for Lua(Meta)Tex, and
another table for ConTeXt, so those would be the only two additional
global variables?

I'm not proposing you do it, since it seems like it could be a lot of
work, I'm just wondering what you think about that, because it seems
like things would be much tidier like that (less chance of
accidentally accessing a global in Lua code, etc.).

Thanks,
Neven


Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Hans Hagen

On 12/22/2020 7:13 PM, Neven Sajko wrote:

Thank you very much for your answers!


you can run:

s-inf-01.mkiv
s-inf-03.mkiv
s-inf-05.mkiv

I need to update them, but they show the picture.

Hans

-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Hans Hagen

On 12/22/2020 6:57 PM, Neven Sajko wrote:

Oops, I forgot to attach the scriptlet before.



[...] I guess it could also be more performant, because Lua would
conceivably spend less time managing huge tables.


Now that I think about this some more, it doesn't actually make sense.
However I'm still interested in whether it is really necessary to have
that many globals exposed.
most of what you see in that generated file is either unicode data or 
font resources ... all needed (and geared for performance)


Hans

-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Neven Sajko
Thank you very much for your answers!

Neven


Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Hans Hagen

On 12/22/2020 6:36 PM, Neven Sajko wrote:

Hello again,

While learning about how to drive TeX through Lua, I decided to
recursively list all Lua global variables (actually this is traversing
the _G table) in the LMTX environment, half to learn more Lua, half
for getting to know ConTeXt better.

I was quite surprised by the huge size of the environment, a file that
contains the listing of all the globals is 42 MB long! I wonder if it
would be possible to reduce the exposed globals by replacing some of
them by getter and setter-like functions? That seems like it would be


not sure what you refer to, but even then you need to 'get' and 'set' them 
someplace, which then involves tables ... it's just a large system and 
that won't change


also, when you run such tests, don't include the characters.* tables, as most 
of that is data, and if you do it from a tex run you also see fonts and 
their data (which then means shared tables too)



much nicer and less error prone - because Lua is so dynamic, polluting
the global namespace seems even more dangerous than in C-like
languages. I guess it could also be more performant, because Lua would
conceivably spend less time managing huge tables.


Wait, you traverse global tables, so their entries are *not* global.

\starttext

\startluacode
    context.starttabulate { "|T|T|" }
    for k, v in table.sortedhash(_G) do
        context.NC() context(type(v))
        context.NC() context(k)
        context.NC() context.NR()
    end
    context.stoptabulate()
\stopluacode

\stoptext

These are global. Many come from Lua itself, then there are libraries that 
come with luametatex. The rest is context specific, and again some are just 
helper modules. I noticed some 6 stray locals that I fixed.



There were also some global variables with suspicious random variation
in values between runs of ConTeXt: I ran my Lua script like so
multiple times (attached, in case someone is interested in it):

 context s.lua
 rm s.tuc s.pdf s.log

And I found that the values of some variables unpredictably and randomly vary:


Maybe weak tables?


The variable resolvers.suffixmap.lua sometimes has the value
"scripts", and sometimes "lua". I think this means that files with the
file name suffix ".lua" are sometimes classified as general scripts
and sometimes as Lua scripts. This seems like it could even be a bug?


could be, but probably more a side effect ... part of that resolver stuff 
is there for usage in tds and could be simplified in the meantime ... if 
there are hashes they can differ per document



The variables storage.tofmodules and storage.toftables are also
interesting: they vary from run to run like this:

tofmodules: 0.175483 0.149536 0.150493 0.150005

toftables: 0.008407 0.008118 0.008395 0.008116

I'd like to know what their purpose is, if it's not too involved to explain?

timers, so indeed they can differ per run

Hans

-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Neven Sajko
Oops, I forgot to attach the scriptlet before.


> [...] I guess it could also be more performant, because Lua would
> conceivably spend less time managing huge tables.

Now that I think about this some more, it doesn't actually make sense.
However I'm still interested in whether it is really necessary to have
that many globals exposed.

Thanks,
Neven


s.lua.gz
Description: application/gzip


[NTG-context] Snooping around in LMTX: Questions about Lua global variables

2020-12-22 Thread Neven Sajko
Hello again,

While learning about how to drive TeX through Lua, I decided to
recursively list all Lua global variables (actually this is traversing
the _G table) in the LMTX environment, half to learn more Lua, half
for getting to know ConTeXt better.
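
For the record, a minimal sketch of such a traversal (plain Lua, not the
attached s.lua; the 'seen' table guards against cycles, since _G contains
self-references):

local seen = {}

local function dump(t, prefix)
    if seen[t] then
        return
    end
    seen[t] = true
    for k, v in pairs(t) do
        local name = prefix and (prefix .. "." .. tostring(k)) or tostring(k)
        print(name, type(v))
        if type(v) == "table" then
            dump(v, name)
        end
    end
end

dump(_G)

When run via "context", the print output ends up on the console and in the log.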

I was quite surprised by the huge size of the environment, a file that
contains the listing of all the globals is 42 MB long! I wonder if it
would be possible to reduce the exposed globals by replacing some of
them by getter and setter-like functions? That seems like it would be
much nicer and less error prone - because Lua is so dynamic, polluting
the global namespace seems even more dangerous than in C-like
languages. I guess it could also be more performant, because Lua would
conceivably spend less time managing huge tables.

There were also some global variables with suspicious random variation
in values between runs of ConTeXt: I ran my Lua script like so
multiple times (attached, in case someone is interested in it):

context s.lua
rm s.tuc s.pdf s.log

And I found that the values of some variables unpredictably and randomly vary:

The variable resolvers.suffixmap.lua sometimes has the value
"scripts", and sometimes "lua". I think this means that files with the
file name suffix ".lua" are sometimes classified as general scripts
and sometimes as Lua scripts. This seems like it could even be a bug?

The variables storage.tofmodules and storage.toftables are also
interesting: they vary from run to run like this:

tofmodules: 0.175483 0.149536 0.150493 0.150005

toftables: 0.008407 0.008118 0.008395 0.008116

I'd like to know what their purpose is, if it's not too involved to explain?

Thanks,
Neven


[NTG-context] LMTX, PGF/TikZ absolute positioning

2020-12-22 Thread Jean-Philippe Rey
Dear list,

I have a problem with LMTX when I try to place a tikz drawing at an absolute 
location on the page. Here is a minimal failing example.

===
\usemodule[tikz]
\starttext
\starttikzpicture[remember picture, overlay]
\draw (current page.center) circle [radius=20mm];
\stoptikzpicture
\stoptext
===

The error is about \hoffset being undefined (see attached log file). Everything 
works fine with MkIV.
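
One untested workaround sketch (an assumption, not a confirmed fix): since the
log reports \hoffset as undefined, providing dummy dimension registers before
loading the module might at least get past that error:

===
\ifdefined\hoffset \else \newdimen\hoffset \fi
\ifdefined\voffset \else \newdimen\voffset \fi
\usemodule[tikz]
\starttext
\starttikzpicture[remember picture, overlay]
\draw (current page.center) circle [radius=20mm];
\stoptikzpicture
\stoptext
===

Whether the circle then ends up at the right absolute position under LMTX is a
separate question.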

Thank you for any help.

-- 
Jean-Philippe Rey
jean-philippe@centralesupelec.fr
91192 Gif-sur-Yvette Cedex - France
Empreinte PGP : 807A 5B2C 69E4 D4B5 783A 428A 1B5E E83E 261B BF51


mfe-hoffset.log
Description: Binary data


Re: [NTG-context] Regarding XML export and EPUB

2020-12-22 Thread Henning Hraban Ramm


> On 21.12.2020 at 23:31, Andres Conrado Montoya wrote:
> 
> Hello, list. 
> I've been experimenting with the export scripts and the instructions and 
> manuals you can find in:
> 
> https://wiki.contextgarden.net/XML
> https://wiki.contextgarden.net/Export
> https://wiki.contextgarden.net/Epub
> https://wiki.contextgarden.net/ePub
> 
> This with the expectation of being able to make an epub file from a context 
> document that epubcheck can accept. I see there is experimental support 
> for epub, and I have played around with the export options a bit. However, 
> what I would really like to know is: is it possible, and if so, how can you 
> map context's elements so they get translated to specific html tags? I mean, 
> the current export output uses divs with custom attributes and classes, or 
> custom tags; but I would like to map lists to ul, list-items to li, headings 
> to h1, h2, h3, paragraphs to p, etc. In the manuals I see that you can do 
> the opposite: map xml/html tags to context's elements; I wonder if you can 
> go the other way around, and where a good place to start reading about it 
> would be. Apologies if I missed something obvious. 

There’s no built-in mechanism.

I’m using XSLT to transform ConTeXt’s exported XML to the HTML I want.
Especially with references (footnotes etc.) it’s not trivial.

Hraban