Re: [NTG-context] Getting the filename which is being typeset

2020-12-03 Thread Axel Kielhorn

> On 01.12.2020 at 18:23, Wolfgang Schuster wrote:
> 
> The following example shows the output of all commands:
> 
> \starttext
> \starttabulate [|T|T|]
> \NC \type{\jobname}   \NC \jobname   \NC\NR
> \NC \type{\jobfilename}   \NC \jobfilename   \NC\NR
> \NC \type{\jobfilesuffix} \NC \jobfilesuffix \NC\NR
> \NC \type{\inputfilename} \NC \inputfilename \NC\NR
> \NC \type{\inputfilebarename} \NC \inputfilebarename \NC\NR
> \NC \type{\inputfilesuffix}   \NC \inputfilesuffix   \NC\NR
> \NC \type{\outputfilename}\NC \outputfilename\NC\NR
> \NC \type{\operatingsystem}   \NC \operatingsystem   \NC\NR
> \stoptabulate
> \stoptext

On macOS 10.13 the result of \operatingsystem is „osx-ppc“.
That was a surprise. :-)

Greetings
Axel

___
If your question is of interest to others as well, please add an entry to the 
Wiki!

maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://context.aanhet.net
archive  : https://bitbucket.org/phg/context-mirror/commits/
wiki : http://contextgarden.net
___


[NTG-context] Is anyone having issues with magazines in LMTX?

2020-12-03 Thread Jairo A. del Rio
Hi, list!

When I want to compile magazines from source (such as mag-1102.mkiv), LMTX
gives weird output (grayish background and misaligned titles). Compilation
with MkIV gives normal results. My ConTeXt version is 2020.12.03 19:02.
Thank you in advance.

Cordially,

Jairo :)


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Wolfgang Schuster

Andres Conrado Montoya wrote on 03.12.2020 at 21:58:
Hi Pablo, regarding 1): TeX assumes that each character is a variable in
math mode. You need to tell it that each "word" is, well, a "word":


\starttext
   \startitemize[packed]
   \item Valor actual neto: $\text{\em beneficios} - \text{\em costes}$.
   % you can use typographical variations inside \text: \em, \bf, etc.
   \item Beneficio por euro: $\frac{\text{\em beneficios}}{\text{\em costes}}$.
   \item Beneficio obtenido el primer año.
   \stopitemize
\stoptext


There are additional commands for italic, bold, etc. text:

\starttext

\startlines
$\mathtexttf{upright text}$
$\mathtextit{italic text}$
$\mathtextsl{slanted text}$
$\mathtextbf{bold text}$
$\mathtextbi{bolditalic text}$
$\mathtextbs{boldslanted text}$
\stoplines

\startlines
$\mathwordtf{upright word}$
$\mathwordit{italic word}$
$\mathwordsl{slanted word}$
$\mathwordbf{bold word}$
$\mathwordbi{bolditalic word}$
$\mathwordbs{boldslanted word}$
\stoplines

\stoptext

Wolfgang


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Pablo Rodriguez
On 12/3/20 10:00 PM, Wolfgang Schuster wrote:
> Pablo Rodriguez wrote on 03.12.2020 at 21:52:
>> On 12/3/20 8:48 PM, Pablo Rodriguez wrote:
>>> [...]
>>> 3. Packed works fine with fractions, but packed columns is wrong with
>>> them. It adds too much space.
>> After clearing the issue with text, it seems that fractions add too much
>> space when in a list with columns.
>> [...]
>> I don’t know whether this is a bug.
>
> Itemize uses the mixed columns mechanism when you enable columns, with
> grid snapping enabled as the default setting. To get rid of the extra
> line you can either disable the grid setting for itemize with
>
>      \setupitemize[grid=no]

Many thanks for your reply, Wolfgang.

This solves the issue.

> or replace the mixed columns mechanism with a simpler column
> mechanism which doesn't produce an extra line; to do this, add this at
> the beginning of your document
>
>      \enableexperiments[itemize.columns]

This avoids the extra line, but the columns are wrong: three columns with
the option "four", and with the option "two" it splits the items into
three for the first column and one for the second column.

Many thanks for your help,

Pablo
--
http://www.ousia.tk


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Wolfgang Schuster

Pablo Rodriguez wrote on 03.12.2020 at 21:52:

On 12/3/20 8:48 PM, Pablo Rodriguez wrote:

[...]
3. Packed works fine with fractions, but packed columns is wrong with
them. It adds too much space.

After clearing the issue with text, it seems that fractions add too much
space when in a list with columns.

   \showframe\showgrid
   \starttext
   list:

   \startitemize[packed, columns, four]
   \item $\text{1} - \text{2}$
   \item item
   \item $\frac{1}{2}$
   \item item
   \stopitemize

   list:

   \startitemize[columns, four]
   \item $\text{1} - \text{2}$
   \item item
   \item $\frac{1}{2}$
   \item item
   \stopitemize
   \stoptext

I don’t know whether this is a bug.


Itemize uses the mixed columns mechanism when you enable columns, with grid
snapping enabled as the default setting. To get rid of the extra line you
can either disable the grid setting for itemize with

    \setupitemize[grid=no]

or replace the mixed columns mechanism with a simpler column mechanism
which doesn't produce an extra line; to do this, add this at the beginning
of your document:

    \enableexperiments[itemize.columns]

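For reference, a minimal sketch of Pablo's four-column test case with the
first option applied (swap the \setupitemize line for
\enableexperiments[itemize.columns] before \starttext to try the second
option instead):

\setupitemize[grid=no]

\showframe\showgrid
\starttext

\startitemize[packed, columns, four]
\item $\text{1} - \text{2}$
\item item
\item $\frac{1}{2}$
\item item
\stopitemize

\stoptext
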
Wolfgang



Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Andres Conrado Montoya
Hi Pablo, regarding 1): TeX assumes that each character is a variable in math
mode. You need to tell it that each "word" is, well, a "word":

\starttext
  \startitemize[packed]
  \item Valor actual neto: $\text{\em beneficios} - \text{\em costes}$.
  % you can use typographical variations inside \text: \em, \bf, etc.
  \item Beneficio por euro: $\frac{\text{\em beneficios}}{\text{\em costes}}$.
  \item Beneficio obtenido el primer año.
  \stopitemize
\stoptext

2) will solve itself after this. As for 3), I don't know.



-- 
Andrés Conrado Montoya
Andi Kú
andresconr...@gmail.com
http://sesentaycuatro.com
http://messier87.com
http://chiquitico.org

The ends do not justify the means, because the true measure of our character
is given by the means we are willing to use, not by the ends we proclaim.


“You develop an instant global consciousness, a people orientation, an
intense dissatisfaction with the state of the world, and a compulsion to do
something about it. From out there on the moon, international politics look
so petty. You want to grab a politician by the scruff of the neck and drag
him a quarter of a million miles out and say, ‘Look at that, you son of a
bitch.’” — Apollo 14 astronaut Edgar Mitchell


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Pablo Rodriguez
On 12/3/20 8:48 PM, Pablo Rodriguez wrote:
> [...]
> 3. Packed works fine with fractions, but packed columns is wrong with
> them. It adds too much space.

After clearing the issue with text, it seems that fractions add too much
space when in a list with columns.

  \showframe\showgrid
  \starttext
  list:

  \startitemize[packed, columns, four]
  \item $\text{1} - \text{2}$
  \item item
  \item $\frac{1}{2}$
  \item item
  \stopitemize

  list:

  \startitemize[columns, four]
  \item $\text{1} - \text{2}$
  \item item
  \item $\frac{1}{2}$
  \item item
  \stopitemize
  \stoptext

I don’t know whether this is a bug.

Many thanks for your help,

Pablo
--
http://www.ousia.tk


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Pablo Rodriguez
On 12/3/20 9:23 PM, Hans Åberg wrote:
>> [...]
>> Again, formulas are totally foreign to my work with ConTeXt.
>
> When you do not use \text in math mode, it means implicit
> multiplication, which has special spacing. Operator names use text
> kerning. For example $sin x$ is implicit multiplication, and $\sin x$
> is the function "sin" applied to the variable x.

Many thanks for your explanation, Hans.

Now it’s clear to me that math and text modes are different in practice.

Many thanks for your help,

Pablo
--
http://www.ousia.tk


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Pablo Rodriguez
On 12/3/20 8:55 PM, Otared Kavian wrote:
> Hi Pablo,
>
> You may use \text in math formulas as in:

Many thanks for your reply, Otared.

\text is what I needed to fix the issue with ligatures and spacing.

Many thanks for your help,

Pablo
--
http://www.ousia.tk


Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Hans Åberg

> On 3 Dec 2020, at 20:48, Pablo Rodriguez  wrote:
> 
> math is all Greek to me and I’m experiencing issues when typesetting the
> most basic inline formulas.
…
> The issues I’m experiencing are:
> 
> 1. Kernings around ligatures (fi) are wrong.
> 
> 2. Is there a way to enable ligatures?
> 
> 3. Packed works fine with fractions, but packed columns is wrong with
> them. It adds too much space.
> 
> Again, formulas are totally foreign to my work with ConTeXt.

When you do not use \text in math mode, it means implicit multiplication, which
has special spacing. Operator names use text kerning. For example $sin x$ is
implicit multiplication, and $\sin x$ is the function "sin" applied to the
variable x.

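A small sketch (plain ConTeXt, untested) that shows the difference when
typeset:

\starttext
\startlines
$sin x$ (the variables $s$, $i$, $n$ and $x$ multiplied: implicit multiplication)
$\sin x$ (the operator name, set upright with operator spacing)
$\text{beneficios} - \text{costes}$ (whole words set as text, with kerning and ligatures)
\stoplines
\stoptext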



Re: [NTG-context] basic issues with formulas

2020-12-03 Thread Otared Kavian
Hi Pablo,

You may use \text in math formulas as in:

 \startitemize[packed]
 \item Valor actual neto: $\text{beneficios} - \text{costes}$.
 \item Beneficio por euro: $\frac{\text{beneficios}}{\text{costes}}$.
 \item Beneficio obtenido el primer año.
 \stopitemize

Best regards: Otared

> On 3 Dec 2020, at 20:48, Pablo Rodriguez  wrote:
> 
> Dear list,
> 
> math is all Greek to me and I’m experiencing issues when typesetting the
> most basic inline formulas.
> 
>  \starttext
>  \startitemize[packed]
>  \item Valor actual neto: $beneficios - costes$.
>  \item Beneficio por euro: $\frac{beneficios}{costes}$.
>  \item Beneficio obtenido el primer año.
>  \stopitemize
> 
>  \startitemize[packed]
>  \item Valor actual neto: $beneficios - costes$.
>  \item Beneficio por euro: $beneficios - costes$.
>  \item Beneficio obtenido el primer año.
>  \stopitemize
> 
>  \startitemize[packed, columns, two]
>  \item Valor actual neto: $beneficios - costes$.
>  \item Beneficio por euro: $\frac{beneficios}{costes}$.
>  \item Beneficio obtenido el primer año.
>  \stopitemize
> 
>  \startitemize[packed, columns, two]
>  \item Valor actual neto: $beneficios - costes$.
>  \item Beneficio por euro: $beneficios - costes$.
>  \item Beneficio obtenido el primer año.
>  \stopitemize
>  \stoptext
> 
> The issues I’m experiencing are:
> 
> 1. Kernings around ligatures (fi) are wrong.
> 
> 2. Is there a way to enable ligatures?
> 
> 3. Packed works fine with fractions, but packed columns is wrong with
> them. It adds too much space.
> 
> Again, formulas are totally foreign to my work with ConTeXt.
> 
> I don’t know whether they are bugs in latest (2020.01.30 14:13) or there
> is a way to avoid the issues.
> 
> Many thanks for your help,
> 
> Pablo
> --
> http://www.ousia.tk


[NTG-context] basic issues with formulas

2020-12-03 Thread Pablo Rodriguez
Dear list,

math is all Greek to me and I’m experiencing issues when typesetting the
most basic inline formulas.

  \starttext
  \startitemize[packed]
  \item Valor actual neto: $beneficios - costes$.
  \item Beneficio por euro: $\frac{beneficios}{costes}$.
  \item Beneficio obtenido el primer año.
  \stopitemize

  \startitemize[packed]
  \item Valor actual neto: $beneficios - costes$.
  \item Beneficio por euro: $beneficios - costes$.
  \item Beneficio obtenido el primer año.
  \stopitemize

  \startitemize[packed, columns, two]
  \item Valor actual neto: $beneficios - costes$.
  \item Beneficio por euro: $\frac{beneficios}{costes}$.
  \item Beneficio obtenido el primer año.
  \stopitemize

  \startitemize[packed, columns, two]
  \item Valor actual neto: $beneficios - costes$.
  \item Beneficio por euro: $beneficios - costes$.
  \item Beneficio obtenido el primer año.
  \stopitemize
  \stoptext

The issues I’m experiencing are:

1. Kernings around ligatures (fi) are wrong.

2. Is there a way to enable ligatures?

3. Packed works fine with fractions, but packed columns is wrong with
them. It adds too much space.

Again, formulas are totally foreign to my work with ConTeXt.

I don’t know whether they are bugs in the latest (2020.01.30 14:13) or whether
there is a way to avoid the issues.

Many thanks for your help,

Pablo
--
http://www.ousia.tk


Re: [NTG-context] Bug (?) using \externalfigure with layers and backgrounds

2020-12-03 Thread Jairo A. del Rio
Oh, I see. Nice, thank you a lot.

Jairo :)

On Thu, Dec 3, 2020 at 12:19, Wolfgang Schuster
(wolfgang.schuster.li...@gmail.com) wrote:

> Jairo A. del Rio wrote on 03.12.2020 at 18:14:
> > Hi everyone. The following
> >
> > \setuppagenumbering[state=stop]
> > \definelayer[mybg]
> > [
> > repeat=yes,
> > x=0mm, y=0mm,
> > width=\paperwidth, height=\paperheight,
> > ]
> >
> \setlayer[mybg]{\externalfigure[cow][width=\paperwidth,height=\paperheight]}
> > \setupbackgrounds[page][background=mybg,state=repeat]
> > \setupindenting[yes,small]
> > \startsetups background:default
> > \setupbackgrounds[page][background=mybg,state=repeat]
> > \stopsetups
> > \startsetups background:void
> > \setupbackgrounds[page][background=]
> > \stopsetups
> > \starttext
> > \dorecurse{7}{\input knuth}
> > \page
> > \setups[background:void]
> > \dorecurse{8}{\input tufte}
> > \page
> > \setups[background:default]
> > \dorecurse{7}{\input knuth}
> > \page
> > \stoptext
> >
> > works with a local image (say, if "cow" is in the same folder as the
> > main file), but it doesn't find sample images (cow, hacker, etc.). Is
> > that a bug or an intended result? Thank you in advance.
>
> This is normal; to load figures from the tex directory use
> \setupexternalfigures[location=default].
>
> More information is on the wiki:
> https://wiki.contextgarden.net/Command/setupexternalfigure
>
> Wolfgang
>
>


Re: [NTG-context] Bug (?) using \externalfigure with layers and backgrounds

2020-12-03 Thread Wolfgang Schuster

Jairo A. del Rio wrote on 03.12.2020 at 18:14:

Hi everyone. The following

\setuppagenumbering[state=stop]
\definelayer[mybg]
[
repeat=yes,
x=0mm, y=0mm,
width=\paperwidth, height=\paperheight,
]
\setlayer[mybg]{\externalfigure[cow][width=\paperwidth,height=\paperheight]}
\setupbackgrounds[page][background=mybg,state=repeat]
\setupindenting[yes,small]
\startsetups background:default
\setupbackgrounds[page][background=mybg,state=repeat]
\stopsetups
\startsetups background:void
\setupbackgrounds[page][background=]
\stopsetups
\starttext
\dorecurse{7}{\input knuth}
\page
\setups[background:void]
\dorecurse{8}{\input tufte}
\page
\setups[background:default]
\dorecurse{7}{\input knuth}
\page
\stoptext

works with a local image (say, if "cow" is in the same folder as the 
main file), but it doesn't find sample images (cow, hacker, etc.). Is 
that a bug or an intended result? Thank you in advance.


This is normal, to load figures from the tex directory use 
\setupexternalfigures[location=default].


More information are on the wiki: 
https://wiki.contextgarden.net/Command/setupexternalfigure

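A minimal sketch of a complete file with that setting, using the sample
figure names (cow, hacker) which ship with ConTeXt:

\setupexternalfigures[location=default]

\starttext
\externalfigure[cow][width=3cm]
\externalfigure[hacker][width=3cm]
\stoptext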

Wolfgang



[NTG-context] Bug (?) using \externalfigure with layers and backgrounds

2020-12-03 Thread Jairo A. del Rio
Hi everyone. The following

\setuppagenumbering[state=stop]
\definelayer[mybg]
[
repeat=yes,
x=0mm, y=0mm,
width=\paperwidth, height=\paperheight,
]
\setlayer[mybg]{\externalfigure[cow][width=\paperwidth,height=\paperheight]}
\setupbackgrounds[page][background=mybg,state=repeat]
\setupindenting[yes,small]
\startsetups background:default
\setupbackgrounds[page][background=mybg,state=repeat]
\stopsetups
\startsetups background:void
\setupbackgrounds[page][background=]
\stopsetups
\starttext
\dorecurse{7}{\input knuth}
\page
\setups[background:void]
\dorecurse{8}{\input tufte}
\page
\setups[background:default]
\dorecurse{7}{\input knuth}
\page
\stoptext

works with a local image (say, if "cow" is in the same folder as the main
file), but it doesn't find sample images (cow, hacker, etc.). Is that a bug
or an intended result? Thank you in advance.

Best regards,

Jairo


Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Stephen Gaito
Hans,

Again many thanks for your thoughts! (See below)

On Thu, 3 Dec 2020 13:15:28 +0100
Hans Hagen  wrote:

> On 12/3/2020 12:15 PM, Taco Hoekwater wrote:
> > 
> >   
> >> On 3 Dec 2020, at 11:35, Stephen Gaito 
> >> wrote:
> >>
> >> Hans,
> >>
> >> As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb
> >> of DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb
> >> of CPU cache...
> >>
> >> ... all well past its use by date for single threaded ConTeXt. ;-(
> >>
> >> So one way to get better performance for ConTeXt is to invest in a
> >> new ultra fast processor. Which will cost a lot, and use a lot of
> >> power which has to be cooled, which uses even more power  
> > 
> > Startup time can be improved quite a bit with an SSD. Even a cheap
> > SATA SSD is already much faster than a traditional harddisk.
> > Doesn’t help with longer documents, but it could be a fairly cheap
> > upgrade.  
> 
> also, an empty context run
> 
> \starttext
> \stoptext
> 
> only takes 0.490 seconds on my machine, which means:
> 
> - starting mtxrun, which includes quite a bit of lua plus loading the
> file database etc
> - loading mtx-context that itself does some checking
> - and then launches the engine (it could be integrated but then we
> run into issues when we have fatal errors as well as initializations
> so in the end it doesn't pay off at all)
> - the tex runs means: loading the format and initializing hundreds of 
> lua scripts including all kind of unicode related stuff
> 
> so, the .5 sec is quite acceptable to me and i know that when i would
> have a more recent machine it would go down to half of that
> 

I will agree that this is acceptable for the complexity ConTeXt
represents... ConTeXt has a complex task... it *will* have to take some
time... that is OK.

> now, making a tex run persistent is not really a solution: one has to
> reset all kinds of counters, dimensions etc., wipe node and token
> space, etc., and one would also have to reset the pdf output which
> includes all kinds of housekeeping states ... adding all kinds of
> resetters and hooks for that (plus all the garbage collection needed)
> will never pay back and a 'wipe all and reload' is way more efficient
> than that

I also agree, keeping a pool of "warm" running ConTeXts, as you are
essentially describing, would be nice... but I suspect the complexity
does preclude this approach. Keep it Simple... Simply killing and
restarting ConTeXt as a new process is OK.

> 
> of course, when i ever run into a scenario where I have to create
> tens of thousands of one/few page docs very fast i might add some
> 'reset the pdf state' because that is kind of doable with some extra
> code but to be honest, no one ever came up with a project that had
> any real demands on the engine that could not be met (the fact that
> tex is a good solution for rendering doesn't mean that there is
> demand for it ... it is seldom on the radar of those who deal with
> that, who then often prefer some pdf library, also because quality
> doesn't really matter)
> 
> these kinds of performance things are demand-driven (read: i need a
> pretty good reason to spend time on it)

Understood.

> 
> > I can’t comment on how to speed up the rest of what you are doing,
> > but generally multi-threading TeX typesetting jobs is so hard as to
> > be impossible in practice. About the only step that can be split off
> > is the generation of the PDF, and even there the possible gain is
> > quite small (as you noticed already).  
> 
> indeed, see above
> 
> > Typesetting is a compilation job, so the two main ways to speed
> > things along are
> > 
> > 1) split the source into independent tasks, like in a code compiler
> > that splits code over separate .c / .cpp / .m / .p etc. files,
> > and then combine the results (using e.g. mutool)
> > 
> > 2) precompile recurring stuff (in TeX, that would mean embedding
> > separately generated pdfs or images)  
> right
> 
> (and we are old enough and have been around long enough to have some
> gut feeling about that)
> 

I have a deep respect for both your vision, and experience in this
matter.

However, way back when you started ConTeXt, very few people would
have said it was possible, or worthwhile, to embed Lua in TeX...

Given that Lua *is* now embedded inside ConTeXt, I am simply making a
*crude* attempt to see if I can parallelize the overall ConTeXt
production cycle (without changing ConTeXt-LMTX itself).

"Fools rush in..."

> Hans
> 
> ps. When it comes to performance of tex, lua, context etc it is no 
> problem, when googling a bit, to run into 'nonsense' arguments of why 
> something is slow ... so don't take it for granted, just ask here on 
> this list
> 

I am not distracted by this noise... Complex things take time... what I
am attempting is complex... and requires the best tool available... and
ConTeXt is the *best* tool available!  Many many thanks for your vision
and work!

> 

Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Stephen Gaito
Taco,

Thanks for your comments... see below...

On Thu, 3 Dec 2020 12:15:46 +0100
Taco Hoekwater  wrote:

> > On 3 Dec 2020, at 11:35, Stephen Gaito 
> > wrote:
> > 
> > Hans,
> > 
> > As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb
> > of DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb
> > of CPU cache...
> > 
> > ... all well past its use by date for single threaded ConTeXt. ;-(
> > 
> > So one way to get better performance for ConTeXt is to invest in a
> > new ultra fast processor. Which will cost a lot, and use a lot of
> > power which has to be cooled, which uses even more power  
> 
> Startup time can be improved quite a bit with an SSD. Even a cheap
> SATA SSD is already much faster than a traditional harddisk. Doesn’t
> help with longer documents, but it could be a fairly cheap upgrade.
> 
> I can’t comment on how to speed up the rest of what you are doing,
> but generally multi-threading TeX typesetting jobs is so hard as to
> be impossible in practice. About the only step that can be split off
> is the generation of the PDF, and even there the possible gain is 
> quite small (as you noticed already).
> 
> Typesetting is a compilation job, so the two main ways to speed things
> along are
> 
> 1) split the source into independent tasks, like in a code compiler
>that splits code over separate .c / .cpp / .m / .p etc. files,
>and then combine the results (using e.g. mutool)
> 

This is basically my approach... *However*, while the dependency graph
for a standard compilation has been engineered to be an acyclic tree,
for a ConTeXt "compilation", the "*.tex" file has a cyclic dependency
on the (generated) "*.tuc" file.

Basically my parallelization "build manager" has to unroll or otherwise
reimplement the mtx-context.lua

```
  for currentrun=1,maxnofruns do
...
  end
```

loop until the `multipass_changed(oldhash,newhash)` returns `false`.

Followed by a "pdf-linking" stage (possibly implemented, as you suggest,
by `mutool`) 

Not great but it might work... (and I am willing to give it a
try)...

> 2) precompile recurring stuff (in TeX, that would mean embedding
>separately generated pdfs or images)
> 
> Best wishes,
> Taco
> 
> 
> 
> 
> 


Re: [NTG-context] Parallelizing typesetting of large documents with lots of cross-references

2020-12-03 Thread Hans Hagen

On 12/3/2020 12:04 PM, Stephen Gaito wrote:


> - very large (1,000+ pages),


not that large, literate code is often verbatim so that doesn't take 
much runtime either



> - highly cross-referenced documents,


ok, that demands runs


> - with embedded literate-programmed code (which needs
>   concurrent compiling and execution),


you only need to process those snippets when something has changed and 
there are ways in context to deal with that (like \typesetbuffer and 
such which only processes when something changed between runs)
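
A minimal sketch of that buffer approach (assuming \typesetbuffer is fed a
buffer which holds a complete sub-document; the buffer name here is
arbitrary):

\startbuffer[snippet]
\starttext
Some literate-programmed snippet, typeset in its own run.
\stoptext
\stopbuffer

\starttext
The snippet is only reprocessed when its content has changed:

\typesetbuffer[snippet]
\stoptext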



> - containing multiple MetaFun graphics,


those don't take time assuming efficient metapost code

Hans


-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Hans Hagen

On 12/3/2020 12:15 PM, Taco Hoekwater wrote:




On 3 Dec 2020, at 11:35, Stephen Gaito  wrote:

Hans,

As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb of
DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb of CPU
cache...

... all well past its use by date for single threaded ConTeXt. ;-(

So one way to get better performance for ConTeXt is to invest in a new
ultra fast processor. Which will cost a lot, and use a lot of power
which has to be cooled, which uses even more power


Startup time can be improved quite a bit with an SSD. Even a cheap SATA
SSD is already much faster than a traditional harddisk. Doesn’t help
with longer documents, but it could be a fairly cheap upgrade.


also, an empty context run

\starttext
\stoptext

only takes 0.490 seconds on my machine, which means:

- starting mtxrun, which includes quite a bit of lua plus loading the
file database etc
- loading mtx-context that itself does some checking
- and then launches the engine (it could be integrated but then we run
into issues when we have fatal errors as well as initializations so in 
the end it doesn't pay off at all)
- the tex runs means: loading the format and initializing hundreds of 
lua scripts including all kind of unicode related stuff


so, the .5 sec is quite acceptable to me and i know that when i would
have a more recent machine it would go down to half of that


now, making a tex run persistent is not really a solution: one has to
reset all kinds of counters, dimensions etc., wipe node and token space,
etc., and one would also have to reset the pdf output which includes all
kinds of housekeeping states ... adding all kinds of resetters and hooks
for that (plus all the garbage collection needed) will never pay back
and a 'wipe all and reload' is way more efficient than that


of course, when i ever run into a scenario where I have to create tens
of thousands of one/few page docs very fast i might add some 'reset the 
pdf state' because that is kind of doable with some extra code but to be 
honest, no one ever came up with a project that had any real demands on 
the engine that could not be met (the fact that tex is a good solution 
for rendering doesn't mean that there is demand for it ... it is seldom 
on the radar of those who deal with that, who then often prefer some pdf 
library, also because quality doesn't really matter)


these kinds of performance things are demand-driven (read: i need a
pretty good reason to spend time on it)



I can’t comment on how to speed up the rest of what you are doing,
but generally multi-threading TeX typesetting jobs is so hard as to
be impossible in practice. About the only step that can be split off
is the generation of the PDF, and even there the possible gain is
quite small (as you noticed already).


indeed, see above


Typesetting is a compilation job, so the two main ways to speed things
along are

1) split the source into independent tasks, like in a code compiler
that splits code over separate .c / .cpp / .m / .p etc. files,
and then combine the results (using e.g. mutool)

2) precompile recurring stuff (in TeX, that would mean embedding
separately generated pdfs or images)

right

(and we are old enough and have been around long enough to have some gut 
feeling about that)


Hans

ps. When it comes to performance of tex, lua, context etc it is no 
problem, when googling a bit, to run into 'nonsense' arguments of why 
something is slow ... so don't take it for granted, just ask here on 
this list


-
  Hans Hagen | PRAGMA ADE
  Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
   tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-


Re: [NTG-context] Parallelizing typesetting of large documents with lots of cross-references

2020-12-03 Thread Taco Hoekwater


> On 3 Dec 2020, at 12:04, Stephen Gaito  wrote:
> 
> 1. Are there any other known attempts to parallelize context?

Not that I know of, except for the tricks I mentioned in my earlier mail today.

> 2. Are there any other obvious problems with my approach?

The big problem with references is that changed / resolved references can 
change other (future) references because the typeset length can be different,
shifting a following reference to another page, which in turn can push
another reference to yet another page, perhaps changing a page break, et 
cetera. 

That is why the meta manual needs five runs; otherwise a max of two runs would
always be enough (assuming no outside processing like generating a bibliography
or index is needed). So your --once approach may fail in some cases, sorry.

Actually, the meta manual really *needs* only four runs. The last run is the one
that verifies that the .tuc file has not changed (that is why a ConTeXt document
with no cross-references at all uses two runs, and is one of the reasons for
the existence of the --once switch).

Depending on your docs, you may be able to skip a run by using --runs yourself.

Best wishes,
Taco





Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Taco Hoekwater


> On 3 Dec 2020, at 11:35, Stephen Gaito  wrote:
> 
> Hans,
> 
> As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb of
> DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb of CPU
> cache...
> 
> ... all well past its use by date for single threaded ConTeXt. ;-(
> 
> So one way to get better performance for ConTeXt is to invest in a new
> ultra fast processor. Which will cost a lot, and use a lot of power
> which has to be cooled, which uses even more power

Startup time can be improved quite a bit with an SSD. Even a cheap SATA
SSD is already much faster than a traditional harddisk. Doesn’t help
with longer documents, but it could be a fairly cheap upgrade.

I can’t comment on how to speed up the rest of what you are doing,
but generally multi-threading TeX typesetting jobs is so hard as to
be impossible in practice. About the only step that can be split off
is the generation of the PDF, and even there the possible gain is 
quite small (as you noticed already).

Typesetting is a compilation job, so the two main ways to speed things
along are

1) split the source into independent tasks, like in a code compiler
   that splits code over separate .c / .cpp / .m / .p etc. files,
   and then combine the results (using e.g. mutool)

2) precompile recurring stuff (in TeX, that would mean embedding
   separately generated pdfs or images)

Best wishes,
Taco







[NTG-context] Parallelizing typesetting of large documents with lots of cross-references

2020-12-03 Thread Stephen Gaito
Hello,

This email is largely a simple notification of one "Fool's" dream...

("Only Fools rush in where Angels fear to tread").

I am currently attempting to create "a" (crude) "tool" with which I can
typeset:

- very large (1,000+ pages),
- highly cross-referenced documents,
- with embedded literate-programmed code (which needs
  concurrent compiling and execution),
- containing multiple MetaFun graphics,

all based upon ConTeXt-LMTX.

"In theory", it should be possible to typeset individual "sub-documents"
(any section which is known to start on a page boundary rather than
inside a page), and then re-combine the individual PDFs back into one
single PDF for the whole document (complete with control over the page
numbering).

The inherent problem is that the *whole* of a ConTeXt document depends
upon cross-references from *everywhere* else in the document. TeX and
ConTeXt "solve" this problem by using a multi-pass approach (in, for
example, 5 passes for the `luametatex` document).

Between each pass, ConTeXt saves this multi-pass data (page
numbers and cross-references) in the `*.tuc` file.

Clearly any parallelization approach needs to have a process which
coordinates the update and re-distribution of any changes in this
multi-pass data obtained by typesetting each "sub-document".

My current approach is to have a federation of Docker/Podman "pods".
Each "pod" would have a number of ConTeXt workers, as well as
(somewhere in the federation) a Lua based Multi-Pass-Data-coordinator.

All work would be coordinated by messages sent and received over a
corresponding federation of [NATS servers](https://nats.io/). (Neither
[Podman](https://podman.io/) pods nor NATS message coordination are
problems at the moment).


**The real problem**, for typesetting a ConTeXt document, is the design
of the critical process which will act as a
"Multi-Pass-Data-coordinator".


All ConTeXt sub-documents would be typeset in "once" mode using the
latest complete set of "Multi-Pass-Data" obtained from the central
coordinator. Then, once each typesetting run is complete, the resulting
"Multi-Pass-Data" would be sent back to the coordinator to be used to
update the coordinator's complete set of "Multi-Pass-Data" ready for
any required next typesetting pass.

(From the `context --help`:
>mtx-context | --once only run once (no multipass data file is produced)
I will clearly have to patch(?) the mtx-context.lua script to allow
multipass data to be produced... this is probably not a problem).

(There would also be a number of additional processes/containers for
dependency analysis, build sequencing, compilation of code,
execution or interpretation of the code, stitching the PDFs back into
one PDF, etc -- these processes are also not the really critical
problem at the moment).


QUESTIONS:

1. Are there any other known attempts to parallelize context?

2. Are there any other obvious problems with my approach?

3. Is there any existing documentation on the contents of the `*.tuc`
   file?

4. If there is no such documentation, is there any naming pattern of
   the Lua functions which get/set this multi-pass information that I
   should be aware of?


Many thanks for all of the very useful comments so far...

Regards,

Stephen Gaito


Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Stephen Gaito
Hans,

As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb of
DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb of CPU
cache...

... all well past its use by date for single threaded ConTeXt. ;-(

So one way to get better performance for ConTeXt is to invest in a new
ultra fast processor. Which will cost a lot, and use a lot of power
which has to be cooled, which uses even more power

Alternatively, for the same costs (or less), I can buy cheaper slower
processors but have lots of threads (a cluster of Raspberry Pi 4 8Gb
cards)...

Alas this requires finding some way to parallelize ConTeXt

(Fools rush in where Angels fear to tread ;-(

Regards,

Stephen Gaito 

On Wed, 2 Dec 2020 14:04:18 +0100
Hans Hagen  wrote:

> On 12/2/2020 10:40 AM, Stephen Gaito wrote:
> 
> > Many thanks for your swift and helpful comments.
> > 
> > After some *very crude* tests using the `luametatex` and
> > `luametafun` documents, I find that while I *can* stop effective
> > processing at various points in the LuaMetaTeX pipeline, the time
> > difference overall is not really significant enough to bother with
> > this approach.
> > 
> > The principal problem is, as you suggested below, "stopping" the
> > pipeline at the PDF stage (using for example the
> > `pre_output_filter`) corrupted the `*.tuc` data which is for my
> > purposes, critical.
> > 
> > Your comment was:
> >   
> >> but keep in mind that multipass data is flushed as part of the
> >> shipout (because it is often location and order bound)  
> > 
> > For the record, using the `append_to_vlist_filter` callback, I did
> > manage to drastically reduce the "pages" (which were all blank, not
> > surprisingly).
> > 
> > However, on my elderly desktop from 2008, both callbacks
> > essentially cut only 6-8 seconds out of 18 seconds, for the
> > `luametatex` document, and 190 seconds, for the `luametafun`
> > document.  
> 
> hm, on my 2013 laptop the luametatex manual needs 10 sec (i have all
> the fonts, so that includes a bunch) and a metafun manual should do
> about 20
> 
> a test on am M1 mini needs half those times as reported yesterday
> 
> i bet that on a modern desktop the luatex manual will do < 5 sec
> 
> > In the case of the `luametafun` document, it is the MetaFun/MetaPost
> > processing which, of course, is taking a long time (as it should,
> > the graphics computations represent important but complex
> > computations).  
> 
> One run or many due to xref? Maybe your machine has no significant
> cpu cache? Do you run from disk or ssd? How much memory?
> 
> > My ultimate goal is to parallelize the production of large, heavily
> > cross-referenced, ConTeXt documents... more on this in a future
> > email...  
> Hans
> 
> -
>Hans Hagen | PRAGMA ADE
>Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -



Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 2/2

2020-12-03 Thread Stephen Gaito
Hans,

If my only constraints were ease of programming and moderate
performance, I would completely agree that using mostly Lua plus
(possibly) some C code for some targeted stuff that is really slow in
Lua is the correct solution... we are actually in agreement.

Unfortunately, I have the *non-functional* requirement to *prove* the
code's correctness... this is the heart of what I have to write about.

There is no getting out of this requirement...

So, some day it would be very useful to be able to directly embed a
Lua-wrapped ANSI-C shared library inside the new LuaMetaTeX...

However, at the moment, as part of my parallelization attempts I can
interact with my ANSI-C code over a network, so I will use this
approach for the near to medium term.

Regards,

Stephen Gaito



On Wed, 2 Dec 2020 14:17:54 +0100
Hans Hagen  wrote:

> On 12/2/2020 11:43 AM, Stephen Gaito wrote:
> 
> > Again, to my knowledge, Lua v5.4 has only one implementation (though
> > this implementation *can* be compiled for a very wide range of
> > CPU's).  
> 
> Lua has not many demands ... it can even run on tiny cpu's. It's all 
> rather plain C code. (And in luametatex we have no c++ ... all is
> just C.)
> 
> > Finally, the computational complexity of my proof engine, will be
> > comparable to MetaFun/MetaPost... which I suspect you would not
> > consider implementing in pure Lua. Some things are faster in C.  
> 
> Hard to say ... I think that the parser / expansion machinery in mp
> is the bottleneck here (no fun to do that in lua). Redoing it in Lua
> also is asking for compatibility issues. (btw, extensions are done in
> lua anyway, as is the mp backend)
> 
> > So yes I do need to implement it in ANSI-C wrapped in Lua (so that
> > it can be used from *inside* ConTeXt).  
> 
> I would have to see the 'kind of code involved' in order to comment
> on that.
> 
> An option is to do most in lua and maybe some helpers for critical
> code in C.
>   Hans
> 
> 
> -
>Hans Hagen | PRAGMA ADE
>Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -
