Re: [NTG-context] What’s wrong with installation?

2022-03-08 Thread Stephen Gaito via ntg-context
Hans, (and all)

I can confirm that the ConTeXt Garden download for Intel/XUbuntu
21.10

http://lmtx.pragma-ade.nl/install-lmtx/context-linux-64.zip

works again!

Many thanks for your swift solution!

And as always **many many thanks for a wonderful tool!**

Regards,

Stephen Gaito

On Tue, 8 Mar 2022 14:03:54 +0100
Hans Hagen via ntg-context  wrote:

> On 3/8/2022 12:55 PM, Stephen Gaito via ntg-context wrote:
> > Hans,
> > 
> > I am having the same problem on an XUbuntu 21.10 :  
> hm, maybe the server has some issue ... i rebooted the machine
> 
> Hans
> 
> 
> -
>Hans Hagen | PRAGMA ADE
>Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -
> ___
> If your question is of interest to others as well, please add an
> entry to the Wiki!
> 
> maillist : ntg-context@ntg.nl /
> http://www.ntg.nl/mailman/listinfo/ntg-context webpage  :
> http://www.pragma-ade.nl / http://context.aanhet.net archive  :
> https://bitbucket.org/phg/context-mirror/commits/ wiki :
> http://contextgarden.net
> ___



Re: [NTG-context] What’s wrong with installation?

2022-03-08 Thread Stephen Gaito via ntg-context
Hans,

I am having the same problem on an XUbuntu 21.10 :
---
user@host:/tmp/testContext$ wget 
https://lmtx.pragma-ade.nl/install-lmtx/context-linux-64.zip
--2022-03-08 11:42:14--  https://lmtx.pragma-ade.nl/install-lmtx/context-linux-64.zip
Resolving lmtx.pragma-ade.nl (lmtx.pragma-ade.nl)... 213.125.29.165
Connecting to lmtx.pragma-ade.nl (lmtx.pragma-ade.nl)|213.125.29.165|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1538088 (1.5M) [application/zip]
Saving to: ‘context-linux-64.zip’

context-linux-64.zip 100%[===================>]   1.47M  1.57MB/s    in 0.9s

2022-03-08 11:42:16 (1.57 MB/s) - ‘context-linux-64.zip’ saved [1538088/1538088]

user@host:/tmp/testContext$ ls
context-linux-64.zip
user@host:/tmp/testContext$ unzip context-linux-64.zip 
Archive:  context-linux-64.zip
   creating: bin/
  inflating: bin/mtxrun  
  inflating: bin/mtx-install.lua 
  inflating: bin/mtxrun.lua  
  inflating: install.sh  
  inflating: installation.pdf
user@host:/tmp/testContext$ ls
bin  context-linux-64.zip  installation.pdf  install.sh
user@host:/tmp/testContext$ sh install.sh 
mtxrun  | forcing cache reload
resolvers   | resolving | looking for regular 'texmfcnf.lua' on given path 
'/home/stg/texmf/web2c' from specification 'home:texmf/web2c'
resolvers   | resolving | looking for regular 'texmfcnf.lua' on given path 
'/tmp/texmf-local/web2c' from specification 'selfautoparent:/texmf-local/web2c'
resolvers   | resolving | looking for regular 'texmfcnf.lua' on given path 
'/tmp/texmf-context/web2c' from specification 
'selfautoparent:/texmf-context/web2c'
resolvers   | resolving | looking for regular 'texmfcnf.lua' on given path 
'/tmp/texmf-dist/web2c' from specification 'selfautoparent:/texmf-dist/web2c'
resolvers   | resolving | looking for regular 'texmfcnf.lua' on given path 
'/tmp/texmf/web2c' from specification 'selfautoparent:/texmf/web2c'
resolvers   | resolving | looking for fallback 'contextcnf.lua' on given 
path '/home/stg/texmf/web2c' from specification 'home:texmf/web2c'
resolvers   | resolving | looking for fallback 'contextcnf.lua' on given 
path '/tmp/texmf-local/web2c' from specification 
'selfautoparent:/texmf-local/web2c'
resolvers   | resolving | looking for fallback 'contextcnf.lua' on given 
path '/tmp/texmf-context/web2c' from specification 
'selfautoparent:/texmf-context/web2c'
resolvers   | resolving | looking for fallback 'contextcnf.lua' on given 
path '/tmp/texmf-dist/web2c' from specification 
'selfautoparent:/texmf-dist/web2c'
resolvers   | resolving | looking for fallback 'contextcnf.lua' on given 
path '/tmp/texmf/web2c' from specification 'selfautoparent:/texmf/web2c'
resolvers   | resolving |
resolvers   | resolving | warning: no lua configuration files found
resolvers   | resolving | no texmf paths are defined (using TEXMF)
resolvers   | resolving |
mtxrun  | the resolver databases are not present or outdated
mtx-install | provide valid server and instance

cp: cannot stat '/tmp/testContext/tex/texmf-linux-64/bin/mtxrun': No such file 
or directory
cp: cannot stat 
'/tmp/testContext/tex/texmf-context/scripts/context/lua/mtxrun.lua': No such 
file or directory
cp: cannot stat 
'/tmp/testContext/tex/texmf-context/scripts/context/lua/mtx-install.lua': No 
such file or directory

If you want to run ConTeXt everywhere, you need to adapt the path, like:

  export PATH=/tmp/testContext/tex/texmf-linux-64/bin:$PATH

If you run from an editor you can specify the full path to mtxrun:

  /tmp/testContext/tex/texmf-linux-64/bin/mtxrun --autogenerate --script 
context --autopdf ...

The following settings were used:

  server   : lmtx.contextgarden.net,lmtx.pragma-ade.com,lmtx.pragma-ade.nl
  instance : install-lmtx
  extras   : 
  ownpath  : /tmp/testContext
  platform : linux-64

---

On Tue, 8 Mar 2022 11:09:33 +0100
Hans Hagen via ntg-context  wrote:

> On 3/8/2022 10:06 AM, Ursula Hermann via ntg-context wrote:
> > Dear List,
> > I can't make the installation context-mswin.zip, too.  
> that's 32 bit windows ... are you sure you don't need 64 bit?
> 
> -
>Hans Hagen | PRAGMA ADE
>Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -

Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Stephen Gaito
Hans,

Again many thanks for your thoughts! (See below)

On Thu, 3 Dec 2020 13:15:28 +0100
Hans Hagen  wrote:

> On 12/3/2020 12:15 PM, Taco Hoekwater wrote:
> > 
> >   
> >> On 3 Dec 2020, at 11:35, Stephen Gaito 
> >> wrote:
> >>
> >> Hans,
> >>
> >> As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb
> >> of DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb
> >> of CPU cache...
> >>
> >> ... all well past its use by date for single threaded ConTeXt. ;-(
> >>
> >> So one way to get better performance for ConTeXt is to invest in a
> >> new ultra fast processor. Which will cost a lot, and use a lot of
> >> power which has to be cooled, which uses even more power  
> > 
> > Startup time can be improved quite a bit with an SSD. Even a cheap
> > SATA SSD is already much faster than a traditional harddisk.
> > Doesn’t help with longer documents, but it could be a fairly cheap
> > upgrade.  
> 
> also, an empty context run
> 
> \starttext
> \stoptext
> 
> only takes 0.490 seconds on my machine, which means:
> 
> - starting mtxrun, which includes quite a bit of lua plus loading the
> file database etc
> - loading mtx-context that itself does some checking
> - and then launches the engine (it could be integrated but then we
> run into issues when we have fatal errors as well as initializations
> so in the end it doesn't pay off at all)
> - the tex run means: loading the format and initializing hundreds of 
> lua scripts including all kind of unicode related stuff
> 
> so, the .5 sec is quite acceptable to me and i know that when i would 
> have a more recent machine it would go down to half of that
> 

I will agree that this is acceptable for the complexity ConTeXt
represents... ConTeXt has a complex task... it *will* have to take some
time... that is OK.

> now, making a tex run persistent is not really a solution: one has to 
> reset all kinds of counters, dimensions etc, wipe node and token
> space, etc, and one would also have to reset the pdf output which
> includes all kind of housekeeping states ... adding all kind of
> resetters and hooks for that (plus all the garbage collection needed)
> will never pay back and a 'wipe all and reload' is way more efficient
> than

I also agree, keeping a pool of "warm" running ConTeXts, as you are
essentially describing, would be nice... but I suspect the complexity
does preclude this approach. Keep it Simple... Simply killing and
restarting ConTeXt as a new process is OK.

> 
> of course, when i ever run into a scenario where I have to create
> tens of thousands of one/few page docs very fast i might add some
> 'reset the pdf state' because that is kind of doable with some extra
> code but to be honest, no one ever came up with a project that had
> any real demands on the engine that could not be met (the fact that
> tex is a good solution for rendering doesn't mean that there is
> demand for it ... it is seldom on the radar of those who deal with
> that, who then often prefer some pdf library, also because quality
> doesn't really matter)
> 
> these kind of performance things are demand driven (read: i need a 
> pretty good reason to spend time on it)

Understood.

> 
> > I can’t comment on how to speed up the rest of what you are doing,
> > but generally multi-threading TeX typesetting jobs is so hard as to
> > be impossible in practise. About the only step that can be split off
> > is the generation of the PDF, and even there the possible gain is
> > quite small (as you noticed already).  
> 
> indeed, see above
> 
> > Typesetting is a compilation job, so the two main ways to speed
> > things along are
> > 
> > 1) split the source into independent tasks, like in a code compiler
> > that splits code over separate .c / .cpp / .m / .p etc. files,
> > and then combine the results (using e.g. mutool)
> > 
> > 2) precompile recurring stuff (in TeX, that would mean embedding
> > separately generated pdfs or images)  
> right
> 
> (and we are old enough and have been around long enough to have some
> gut feeling about that)
> 

I have a deep respect for both your vision, and experience in this
matter.

However, way back when you started ConTeXt, very few people would
have said it was possible (or worthwhile) to embed Lua in TeX

Given that Lua *is* now embedded inside ConTeXt, I am simply making a
*crude* attempt to see if I can parallelize the overall ConTeXt
production cycle (without changing ConTeXt-LMTX itself).

"Fools rush in..."

> Hans
> 
> ps. When it comes to p

Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Stephen Gaito
Taco,

Thanks for your comments... see below...

On Thu, 3 Dec 2020 12:15:46 +0100
Taco Hoekwater  wrote:

> > On 3 Dec 2020, at 11:35, Stephen Gaito 
> > wrote:
> > 
> > Hans,
> > 
> > As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb
> > of DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb
> > of CPU cache...
> > 
> > ... all well past its use by date for single threaded ConTeXt. ;-(
> > 
> > So one way to get better performance for ConTeXt is to invest in a
> > new ultra fast processor. Which will cost a lot, and use a lot of
> > power which has to be cooled, which uses even more power  
> 
> Startup time can be improved quite a bit with an SSD. Even a cheap
> SATA SSD is already much faster than a traditional harddisk. Doesn’t
> help with longer documents, but it could be a fairly cheap upgrade.
> 
> I can’t comment on how to speed up the rest of what you are doing,
> but generally multi-threading TeX typesetting jobs is so hard as to
> be impossible in practise. About the only step that can be split off
> is the generation of the PDF, and even there the possible gain is 
> quite small (as you noticed already).
> 
> Typesetting is a compilation job, so the two main ways to speed things
> along are
> 
> 1) split the source into independent tasks, like in a code compiler
>that splits code over separate .c / .cpp / .m / .p etc. files,
>and then combine the results (using e.g. mutool)
> 

This is basically my approach... *However*, while the dependency graph
for a standard compilation has been engineered to be an acyclic tree,
for a ConTeXt "compilation", the "*.tex" file has a cyclic dependency
on the (generated) "*.tuc" file.

Basically my parallelization "build manager" has to unroll or otherwise
reimplement the mtx-context.lua

```
  for currentrun=1,maxnofruns do
...
  end
```

loop until `multipass_changed(oldhash,newhash)` returns `false`.

This would be followed by a "pdf-linking" stage (possibly implemented,
as you suggest, with `mutool`).

Not great but it might work... (and I am willing to give it a
try)...
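The convergence driver described above can be sketched in Lua. Everything here is hypothetical scaffolding: `converge` and `run_context` are made-up names, with `run_context` standing in for whatever actually invokes `context --once` on a sub-document and hashes the resulting `*.tuc` file; only the loop shape is borrowed from mtx-context.lua.

```lua
-- Hypothetical sketch of the build manager's outer loop: rerun ConTeXt
-- until the hash of the multipass (*.tuc) data stops changing, mirroring
-- the `for currentrun=1,maxnofruns do ... end` loop in mtx-context.lua.

local maxnofruns = 9  -- an assumed upper bound, in the spirit of mtx-context.lua

local function converge(run_context)
  local oldhash = nil
  for currentrun = 1, maxnofruns do
    local newhash = run_context(currentrun)  -- run once, hash the *.tuc
    if newhash == oldhash then
      return currentrun                      -- multipass data stabilized
    end
    oldhash = newhash
  end
  return maxnofruns                          -- gave up without convergence
end
```

With a runner whose `*.tuc` hash stabilizes on the second real pass, `converge` stops on the third call (the one that confirms nothing changed).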

> 2) precompile recurring stuff (in TeX, that would mean embedding
>separately generated pdfs or images)
> 
> Best wishes,
> Taco
> 
> 
> 
> 
> 



[NTG-context] Parallelizing typesetting of large documents with lots of cross-references

2020-12-03 Thread Stephen Gaito
Hello,

This email is largely a simple notification of one "Fool's" dream...

("Only Fools rush in where Angels fear to tread").

I am currently attempting to create "a" (crude) "tool" with which I can
typeset:

- very large (1,000+ pages),
- highly cross-referenced documents,
- with embedded literate-programmed code (which needs
  concurrent compiling and execution),
- containing multiple MetaFun graphics,

all based upon ConTeXt-LMTX.

"In theory", it should be possible to typeset individual "sub-documents"
(any section which is known to start on a page boundary rather than
inside a page), and then re-combine the individual PDFs back into one
single PDF for the whole document (complete with control over the page
numbering).
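The recombination step could, following the `mutool` suggestion made elsewhere in this thread, look roughly like this (all file names here are hypothetical):

```shell
# Stitch the per-section PDFs back into a single document with MuPDF's
# mutool; output page order follows the argument order.
mutool merge -o whole-document.pdf \
  front-matter.pdf section-01.pdf section-02.pdf back-matter.pdf
```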

The inherent problem is that the *whole* of a ConTeXt document depends
upon cross-references from *everywhere* else in the document. TeX and
ConTeXt "solve" this problem by using a multi-pass approach (in, for
example, 5 passes for the `luametatex` document).

Between each pass, ConTeXt saves this multi-pass data (page
numbers and cross-references) in the `*.tuc` file.

Clearly any parallelization approach needs to have a process which
coordinates the update and re-distribution of any changes in this
multi-pass data obtained by typesetting each "sub-document".

My current approach is to have a federation of Docker/Podman "pods".
Each "pod" would have a number of ConTeXt workers, as well as
(somewhere in the federation) a Lua based Multi-Pass-Data-coordinator.

All work would be coordinated by messages sent and received over a
corresponding federation of [NATS servers](https://nats.io/). (Neither
[Podman](https://podman.io/) pods nor NATS message coordination are
problems at the moment).


**The real problem**, for typesetting a ConTeXt document, is the design
of the critical process which will act as a
"Multi-Pass-Data-coordinator".


All ConTeXt sub-documents would be typeset in "once" mode using the
latest complete set of "Multi-Pass-Data" obtained from the central
coordinator. Then, once each typesetting run is complete, the resulting
"Multi-Pass-Data" would be sent back to the coordinator to be used to
update the coordinator's complete set of "Multi-Pass-Data" ready for
any required next typesetting pass.
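A minimal sketch of the coordinator's merge step, assuming each worker hands back its sub-document's multipass data as a flat Lua table of key/value pairs. This is a deliberate simplification: real `*.tuc` data is nested, so a real merge would recurse, and `merge_multipass` is a hypothetical name.

```lua
-- Hypothetical merge step for the Multi-Pass-Data coordinator: fold one
-- worker's multipass data into the master table and report whether
-- anything changed (i.e. whether another typesetting pass is needed).

local function merge_multipass(master, incoming)
  local changed = false
  for key, value in pairs(incoming) do
    if master[key] ~= value then
      master[key] = value
      changed = true
    end
  end
  return changed
end
```

The coordinator would keep calling workers until every `merge_multipass` in a full round returns `false`.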

(From `context --help`:
> mtx-context | --once   only run once (no multipass data file is produced)
I will clearly have to patch(?) the mtx-context.lua script to allow
multipass data to be produced... this is probably not a problem).

(There would also be a number of additional processes/containers for
dependency analysis, build sequencing, compilation of code,
execution or interpretation of the code, stitching the PDFs back into
one PDF, etc -- these processes are also not the really critical
problem at the moment).


QUESTIONS:

1. Are there any other known attempts to parallelize context?

2. Are there any other obvious problems with my approach?

3. Is there any existing documentation on the contents of the `*.tuc`
   file?

4. If there is no such documentation, is there any naming pattern of
   the Lua functions which get/set this multi-pass information that I
   should be aware of?


Many thanks for all of the very useful comments so far...

Regards,

Stephen Gaito


Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-03 Thread Stephen Gaito
Hans,

As I said my desktop is elderly... it has a 2.8GHz processor, 16Gb of
DDR3 memory, and a couple of old SATA1 hard disks, and only 3Mb of CPU
cache...

... all well past its use by date for single threaded ConTeXt. ;-(

So one way to get better performance for ConTeXt is to invest in a new
ultra fast processor. Which will cost a lot, and use a lot of power
which has to be cooled, which uses even more power

Alternatively, for the same costs (or less), I can buy cheaper slower
processors but have lots of threads (a cluster of Raspberry Pi 4 8Gb
cards)...

Alas this requires finding some way to parallelize ConTeXt

(Fools rush in where Angels fear to tread ;-(

Regards,

Stephen Gaito 

On Wed, 2 Dec 2020 14:04:18 +0100
Hans Hagen  wrote:

> On 12/2/2020 10:40 AM, Stephen Gaito wrote:
> 
> > Many thanks for your swift and helpful comments.
> > 
> > After some *very crude* tests using the `luametatex` and
> > `luametafun` documents, I find that while I *can* stop effective
> > processing at various points in the LuaMetaTeX pipeline, the time
> > difference overall is not really significant enough to bother with
> > this approach.
> > 
> > The principal problem is, as you suggested below, "stopping" the
> > pipeline at the PDF stage (using for example the
> > `pre_output_filter`) corrupted the `*.tuc` data which is for my
> > purposes, critical.
> > 
> > Your comment was:
> >   
> >> but keep in mind that multipass data is flushed as part of the
> >> shipout (because it is often location and order bound)  
> > 
> > For the record, using the `append_to_vlist_filter` callback, I did
> > manage to drastically reduce the "pages" (which were all blank, not
> > surprisingly).
> > 
> > However, on my elderly desktop from 2008, both callbacks
> > essentially cut only 6-8 seconds out of 18 seconds, for the
> > `luametatex` document, and 190 seconds, for the `luametafun`
> > document.  
> 
> hm, on my 2013 laptop the luametatex manual needs 10 sec (i have all
> the fonts, so that includes a bunch) and a metafun manual should do
> about 20
> 
> a test on am M1 mini needs half those times as reported yesterday
> 
> i bet that on a modern desktop the luatex manual will do < 5 sec
> 
> > In the case of the `luametafun` document, it is the MetaFun/MetaPost
> > processing which, of course, is taking a long time (as it should,
> > the graphics computations represent important but complex
> > computations).  
> 
> One run or many due to xref? Maybe your machine has no significant
> cpu cache? Do you run from disk or ssd? How much memory?
> 
> > My ultimate goal is to parallelize the production of large, heavily
> > cross-referenced, ConTeXt documents... more on this in a future
> > email...  
> Hans
> 
> -
>Hans Hagen | PRAGMA ADE
>Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -



Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 2/2

2020-12-03 Thread Stephen Gaito
Hans,

If my only constraints were ease of programming and moderate
performance, I would completely agree that using mostly Lua plus
(possibly) some C code for some targeted stuff that is really slow in
Lua is the correct solution... we are actually in agreement.

Unfortunately, I have the *non-functional* requirement to *prove* the
code's correctness... this is the heart of what I have to write about.

There is no getting out of this requirement

So, some day it would be very useful to be able to directly embed a
Lua wrapped ANSI-C shared library inside the new LuaMetaTex

However, at the moment, as part of my parallelization attempts, I can
interact with my ANSI-C code over a network, so I will use this
approach for the near to medium term.

Regards,

Stephen Gaito



On Wed, 2 Dec 2020 14:17:54 +0100
Hans Hagen  wrote:

> On 12/2/2020 11:43 AM, Stephen Gaito wrote:
> 
> > Again, to my knowledge, Lua v5.4 has only one implementation (though
> > this implementation *can* be compiled for a very wide range of
> > CPU's).  
> 
> Lua has not many demands ... it can even run on tiny cpu's. It's all 
> rather plain C code. (And in luametatex we have no c++ ... all is
> just C.)
> 
> > Finally, the computational complexity of my proof engine, will be
> > comparable to MetaFun/MetaPost... which I suspect you would not
> > consider implementing in pure Lua. Some things are faster in C.  
> 
> Hard to say ... I think that the parser / expansion machinery in mp
> is the bottleneck here (no fun to do that in lua). Redoing it in Lua
> also is asking for compatibility issues. (btw, extensions are done in
> lua anyway, as is the mp backend)
> 
> > So yes I do need to implement it in ANSI-C wrapped in Lua (so that
> > it can be used from *inside* ConTeXt).  
> 
> I would have to see the 'kind of code involved' in order to comment
> on that.
> 
> An option is to do most in lua and maybe some helpers for critical 
> code in C.
>   Hans
> 
> 
> -
>Hans Hagen | PRAGMA ADE
>Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -



Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 2/2

2020-12-02 Thread Stephen Gaito
Hans,

Many thanks for your comments... see below.

On Mon, 30 Nov 2020 19:31:55 +0100
Hans Hagen  wrote:

> On 11/30/2020 10:51 AM, Stephen Gaito wrote:
> > Hello (again),
> > 
> > This email is further to my previous "Using ConTeXt-LMTX for modern
> > Mathematically-Literate-Programming 1/2" email...
> > 
> > My ultimate goal in using ConTeXt-LMTX as a
> > Mathematically-Literate-Programming tool, is to actually write a
> > kernel "Mathematical Language" in ANSI-C (wrapped in Lua) which is
> > then imported back into ConTeXt-LMTX as a standard Lua module (with
> > an ANSI-C shared library).
> 
> Just curious: do you think that using c instead of lua for that has 
> advantages?

This is a very good and important question. One I have asked myself
repeatedly.

My ultimate goal is to write a small mathematical kernel in ANSI-C,
which is, using [Frama-C](https://frama-c.com/), proven *correct*.

To my knowledge, Lua has no similar tool for correctness proofs.

Equally importantly, there are a very wide range of very different
compilers which compile ANSI-C for an equally very wide range of CPU's. 

Again, to my knowledge, Lua v5.4 has only one implementation (though
this implementation *can* be compiled for a very wide range of CPU's).

The problem here is that Mathematicians are inherently very
conservative about the concept of "proof" (it has taken well over 2,000
hard years to develop our current understanding). My kernel will be an
extensible "proof" engine. For mathematicians to trust it, this proof
engine must itself be proven correct (or as correct as currently
possible). It must also be simple enough to *see* that it is correct
(hence the Literate-Programming approach), *and* (since I can not even
hope to prove the compilers are *correct*), there must be many
*different* compiler implementations (to show that the results are not
artefacts of one particular implementation).

Finally, the computational complexity of my proof engine, will be
comparable to MetaFun/MetaPost... which I suspect you would not
consider implementing in pure Lua. Some things are faster in C.

So yes I do need to implement it in ANSI-C wrapped in Lua (so that it
can be used from *inside* ConTeXt).

Since this is a mathematical tool, "embedding" it in ConTeXt is ideal.

As a mathematician writes, what they write gets proof-checked
automatically... in the document they are writing, and by the
typesetting tool they are using for the finished PDF. :-)

ConTeXt (via LuaMetaTex) makes this possible in a way native TeX/LaTeX
never could.

So once again, many many thanks for the vision to create such a
flexible tool!

> 
> > This would allow the output of "code" in my "Mathematical Language"
> > to be directly embedded/typeset in the output of my Mathematical
> > document.
> > 
> > (The ultimate goal is to ensure that there is NO wishful thinking
> > that the code is "correct" ("just trust me")... all results would be
> > directly visible in the PDF).
> > 
> > Alas, while, for other reasons, trying to use the Lua-CJSON Lua
> > module from within ConTeXt-LMTX (which also makes use of a shared
> > library written in C), I find that the current ConTeXt-LMTX is
> > missing (among potentially others) the `lua_checkstack` symbol:
> 
> could be .. we don't use it
> 
> >> ...Xt/tex/texmf-context/tex/context/base/mkiv/l-package.lua:333:
> >> error loading module 'cjson' from file
> >> '/usr/local/lib/lua/5.4/cjson.so':
> >> /usr/local/lib/lua/5.4/cjson.so: undefined symbol: lua_checkstack
> > 
> > even when using the ConTeXt/LuaMetaTeX `--permitloadlib` switch.
> > 
> > (Note that this Lua-CJSON module does work with the native 5.4 Lua).
> 
> why not use the built-in helpers

The test, which triggered the error message (above), was to prove that
I could send [NATS](https://nats.io/) messages from *inside* ConTeXt.

"Out of the box", the [Lua-NATS](https://github.com/DawnAngel/lua-nats)
requires:

- luasocket (which LuaMetaTex provides, many many thanks!)

- lua-cjson (which is an external shared library and is what I was
  testing)

Fortunately, I found a couple of pure Lua JSON tools which I could get
Lua-NATS to use with a one line change. (And, for the record, I *can*
send and receive messages from a NATS server from inside ConTeXt :-)  

If I find I need to make changes to the Lua-NATS code, I will probably
use LuaMetaTeX's internal JSON implementation as you suggest below
(again many thanks for embedding a JSON implementation).

Using Lua-NATS is part of my larger goal to parallelize the typesetting
of large documents using ConTeXt (more on this in another email).
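As an illustration of the pure-Lua route, here is a toy JSON encoder plus a NATS `PUB` frame built around it. The encoder is deliberately minimal (flat tables with string/number values only) and is neither lua-cjson nor any of the real pure-Lua JSON libraries; the subject name `jobs` and the payload fields are invented for the example.

```lua
-- Toy stand-in for a JSON encoder: flat tables, string/number values.
local function encode(t)
  local parts = {}
  for k, v in pairs(t) do
    local value = type(v) == "string" and string.format("%q", v) or tostring(v)
    parts[#parts + 1] = string.format("%q:%s", k, value)
  end
  return "{" .. table.concat(parts, ",") .. "}"
end

-- A NATS PUB frame: subject, payload byte count, CRLF, payload, CRLF
-- (per the NATS client protocol).
local payload = encode { job = "typeset", doc = "section-01.tex" }
local frame   = string.format("PUB jobs %d\r\n%s\r\n", #payload, payload)
```

The `frame` string is what would be written to the NATS server socket (via luasocket, which LuaMetaTeX provides).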

> 
> \usemodule[json]
> 
> \starttext
> 
>

Re: [NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-12-02 Thread Stephen Gaito
Hans,

Many thanks for your swift and helpful comments.

After some *very crude* tests using the `luametatex` and `luametafun`
documents, I find that while I *can* stop effective processing at
various points in the LuaMetaTeX pipeline, the time difference overall
is not really significant enough to bother with this approach.

The principal problem is, as you suggested below, "stopping" the
pipeline at the PDF stage (using for example the `pre_output_filter`)
corrupted the `*.tuc` data which is for my purposes, critical.

Your comment was: 

> but keep in mind that multipass data is flushed as part of the
> shipout (because it is often location and order bound)

For the record, using the `append_to_vlist_filter` callback, I did
manage to drastically reduce the "pages" (which were all blank, not
surprisingly).

However, on my elderly desktop from 2008, both callbacks essentially cut
only 6-8 seconds out of 18 seconds, for the `luametatex` document, and
190 seconds, for the `luametafun` document.
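For reference, the callback experiment could look roughly like this in a plain LuaTeX/LuaMetaTeX setup (inside ConTeXt the hook would go through ConTeXt's own callback layer rather than `callback.register` directly); the callback signature follows the LuaTeX manual, and this sketch only runs inside the engine, not in stand-alone Lua:

```lua
-- Sketch: discard typeset page material by intercepting the
-- append_to_vlist_filter callback. Returning nil instead of the box
-- means nothing is appended to the main vertical list, so the pages
-- come out (nearly) blank; flushing the node frees its memory.

callback.register("append_to_vlist_filter",
  function(box, locationcode, prevdepth, mirrored)
    node.flush_node(box)    -- throw away the material just built
    return nil, prevdepth   -- append nothing, keep \prevdepth sane
  end)
```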

In the case of the `luametafun` document, it is the MetaFun/MetaPost
processing which, of course, is taking a long time (as it should, the
graphics computations represent important but complex computations).

My ultimate goal is to parallelize the production of large, heavily
cross-referenced, ConTeXt documents... more on this in a future email...

Again, many thanks for your comments!

Regards,

Stephen Gaito

On Mon, 30 Nov 2020 19:59:07 +0100
Hans Hagen  wrote:

> On 11/30/2020 10:51 AM, Stephen Gaito wrote:
> > Hello,
> > 
> > I am slowly working on a Mathematical problem requiring underlying
> > computation.
> > 
> > As Mathematicians (myself included) are rather "conservative", I
> > need to discuss each "chunk" of code with the full set of
> > Mathematical notation.
> > 
> > A couple of years ago I started using ConTeXt-MKIV as a
> > Mathematically-Literate-Programming tool by using its excellent Lua
> > interface to capture the code and dump it to disk for external
> > compilation.
> > 
> > I am now revisiting my original design and want to redo my tools
> > using ConTeXt-LMTX.
> > 
> > I would *like* to be able to "stop" the ConTeXt typesetting at
> > various points for differing purposes:
> > 
> > 1. After all macro expansions (and hence after *my* calls into Lua)
> > but before line/paragraph/page layout begins.
> 
> maybe something
> 
> \startmystuff
> 
> \stopmystuff
> 
> and then you can hook something into \startmystuff and \stopmystuff
> 
> > 2. After line/paragraph/page layout but before PDF generation.
> 
> pdf is generated per page, if needed one can kick in a shipout
> overload
> 
> but keep in mind that multipass data is flushed as part of the
> shipout (because it is often location and order bound)
> 
> > 3. After all PDF generated (ie. a "normal" "full" ConTeXt run).
> > 
> > Stopping after all macro expansions would allow my code generation
> > builds to proceed without the un-needed page setting or PDF
> > generation.
> 
> hm, the problem is always in the 'state' of all kind of variables
> 
> > Stopping after the line/paragraph/page layout would allow multiple
> > "faster(?)" ConTeXt runs while the "*.tuc" file converges to a
> > complete set of page numbers and cross references (etc). Then, once
> > the "*.tuc" file has converged, a full ConTeXt run with PDF output
> > could be done.
> 
> not sure what you mean here ... what is fast? or: how slow is it now? 
> what is the bottleneck? can you cache data that didn't change?
> 
> a large document is normally split up in sections that can be
> processed independent
> 
> \starttext
>  \dorecurse{1}{\samplefile{ward}\par}
> \stoptext
> 
> runs on my 2013 laptop at over 65 pages per second
> 
> quite often performance is hit by inefficient styling and such ..
> it's no problem to bring a tex system a grinding halt
> 
> > I am very aware that *internally* ConTeXt is probably structured as
> > a tight pipeline with each of the "traditional" TeX stages "Mouth",
> > "Stomach", "page setting", PDF generation tightly "chained"...
> > This means that there is no "one" place in the code where all macro
> > expansions have completed but before the page setting "starts", or
> > similarly, after the page setting has finished but before the PDF
> > generation "starts".
> 
> yes and often something is left over for a next page so it's kind of
> fluid
> 
> > 
> > QUESTION: Is it possible to use the new LuaMetaTeX callbacks (found
> > in chapter 10 of the "LuaMetaTeX Reference Manual") to "suppress"
> > any further computation at various points in the ConTeXt pipeline?

[NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2

2020-11-30 Thread Stephen Gaito
Hello,

I am slowly working on a Mathematical problem requiring underlying
computation.

As Mathematicians (myself included) are rather "conservative", I need
to discuss each "chunk" of code with the full set of Mathematical
notation.

A couple of years ago I started using ConTeXt-MKIV as a
Mathematically-Literate-Programming tool by using its excellent Lua
interface to capture the code and dump it to disk for external
compilation.

I am now revisiting my original design and want to redo my tools using
ConTeXt-LMTX.

I would *like* to be able to "stop" the ConTeXt typesetting at various
points for differing purposes:

1. After all macro expansions (and hence after *my* calls into Lua)
   but before line/paragraph/page layout begins.

2. After line/paragraph/page layout but before PDF generation.

3. After all PDF is generated (i.e. a "normal", "full" ConTeXt run).

Stopping after all macro expansions would allow my code generation
builds to proceed without the unneeded page setting or PDF generation.

Stopping after the line/paragraph/page layout would allow multiple
"faster(?)" ConTeXt runs while the "*.tuc" file converges to a complete
set of page numbers and cross references (etc). Then, once the "*.tuc"
file has converged, a full ConTeXt run with PDF output could be
done.

I am very aware that *internally* ConTeXt is probably structured as a
tight pipeline, with each of the "traditional" TeX stages "Mouth",
"Stomach", "page setting", and PDF generation tightly "chained"...
This means that there is no "one" place in the code where all macro
expansions have completed but before the page setting "starts", or
similarly, after the page setting has finished but before the PDF
generation "starts".


QUESTION: Is it possible to use the new LuaMetaTeX callbacks (found in
chapter 10 of the "LuaMetaTeX Reference Manual") to "suppress" any
further computation at various points in the ConTeXt pipeline?


For example, could I use one of the "*_linebreak_filter"s (or the
"append_to_vlist_filter") to "return" an empty value and hence reduce
further computation downstream in the pipeline?

Could I use the "pre_output_filter" to "return" an empty value and
hence "stop" PDF generation?

(I realize that these callbacks *are* currently a fast-moving target. I
am happy to follow their changes; equally, I would be testing their
usefulness and/or impact.)


ALTERNATIVE QUESTION: Would it be possible to provide official
ConTeXt-LMTX "modes" which suppress further computation at these points?


This alternative, while requiring some more work on the ConTeXt-LMTX
side, would ensure less direct external dependence on the LuaMetaTeX
callbacks, and would almost certainly be welcomed by the ConTeXt
community.
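Such an interface could plausibly be built on ConTeXt's existing mode
mechanism; a rough sketch follows, where the mode name "codeonly" is
invented for illustration:

```
% hypothetical: "codeonly" is an invented mode name, enabled either in
% the document as below or on the command line (context --mode=codeonly)
\enablemode[codeonly]

\startmode[codeonly]
    % hooks that capture and dump code chunks, while suppressing
    % further typesetting, would be activated here
\stopmode

\startnotmode[codeonly]
    % normal full typesetting setup
\stopnotmode
```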


QUESTION: Are the "stages" I have identified major, computationally
expensive, "steps" in the overall ConTeXt "computation"?


Many thanks for an excellent tool!

Regards,

Stephen Gaito
___
If your question is of interest to others as well, please add an entry to the 
Wiki!

maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://context.aanhet.net
archive  : https://bitbucket.org/phg/context-mirror/commits/
wiki : http://contextgarden.net
___


[NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 2/2

2020-11-30 Thread Stephen Gaito
Hello (again),

This email is further to my previous "Using ConTeXt-LMTX for modern
Mathematically-Literate-Programming 1/2" email...

My ultimate goal in using ConTeXt-LMTX as a
Mathematically-Literate-Programming tool, is to actually write a
kernel "Mathematical Language" in ANSI-C (wrapped in Lua) which is then
imported back into ConTeXt-LMTX as a standard Lua module (with an ANSI-C
shared library).

This would allow the output of "code" in my "Mathematical Language" to
be directly embedded/typeset in the output of my Mathematical document.

(The ultimate goal is to ensure that there is NO wishful thinking that
the code is "correct" ("just trust me")... all results would be
directly visible in the PDF).

Alas, while trying (for other reasons) to use the Lua-CJSON module
from within ConTeXt-LMTX (a module which also makes use of a shared
library written in C), I find that the current ConTeXt-LMTX is missing
the `lua_checkstack` symbol (among potentially others):

> ...Xt/tex/texmf-context/tex/context/base/mkiv/l-package.lua:333: error
> loading module 'cjson' from file '/usr/local/lib/lua/5.4/cjson.so':
> /usr/local/lib/lua/5.4/cjson.so: undefined symbol: lua_checkstack

even when using the ConTeXt/LuaMetaTeX `--permitloadlib` switch.

(Note that this Lua-CJSON module does work with native Lua 5.4.)

(The ConTeXt I am using identifies itself as: ConTeXt  ver: 2020.11.25
19:18 LMTX  fmt: 2020.11.25  int: english/english)

I note that the output of `luametatex --help` includes the following
statement:

> Loading libraries is blocked unless one explicitly permits it (will
> be in 2.08+):
>
>  --permitloadlib permit loading of external libraries (coming)
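For reference, once the switch is functional the intended invocation
would presumably be along these lines (the flag name is taken from the
--help output quoted above; the file name, and whether the context
runner passes the flag through to luametatex, are assumptions):

```
context --permitloadlib mydocument.tex
```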


QUESTIONS:

1. Is this an oversight and `--permitloadlib` is meant to be working
   now?

2. Is this a trivial fix (and might be fixed soon -- time permitting)?

3. Is this a rather complex refactoring of the code/build system (and
   hence might take some time before a fix can be rolled
   out)?

4. Is this a case of "the `lua_checkstack` symbol will never be part of
   luametatex"?


Any of the above scenarios is OK (though scenario 4 would be a
disappointment as it means no shared library lua modules could be
used in ConTeXt)... 

... it would however be useful to have an idea of which scenario is
most likely.

Again, many thanks for a wonderful (and stable) tool!

Regards,

Stephen Gaito



[NTG-context] Purpose/use of loadsetup interface *.xml files

2017-02-23 Thread Stephen Gaito
Hello,

Problem:
--------

I am having trouble getting a simple module's documentation to
load setup information.

Question:
---------

Where can I find up-to-date documentation on the load setup
interface file use and required format?

Background:
-----------

I have a current standalone ConTeXt installation with only luatex(mkiv)
and all modules (updated yesterday).

I intend to write a number of ConTeXt modules to assist in my work.

My first module is ConTests, a simple wrapper around Lua's lunatest
framework (GitHub: stephengaito/ConTests).

As I work I like to write unit-tests and documentation
concurrently with the code itself (as much as possible).

I am basing my documentation effort on t-fancybreak.mkvi. As such I
have written a setup interface file t-contests.xml.

My directory structure is:

  t-contests/tex/context/interface/third/t-contests.xml
  t-contests/tex/context/third/ConTests/t-contests.mkiv
  t-contests/tex/context/third/ConTests/t-contests.lua

I am building my module documentation using:

  mtxrun --script modules --process t-contests.mkiv

(inside the t-contests/tex/context/third/ConTests directory).

The resulting t-contests-mkiv.pdf file has 'MISSING SETUP' where I
would expect the \showsetup{loadSuite} output to be placed.

The mtxrun log output has the line:

  xml > core > load error in [id: setups:, file: .xml]:  empty xml file

I interpret this to mean that my interface file was not found (or
possibly not correctly specified).

The relevant contents of t-contests.mkiv are:

%M \usemodule[contests]
%M \loadsetups[t-contests.xml]

\writestatus{loading}{ConTeXt User Module / ConTests}

%D \subject{Implementation}

\unprotect

\ctxloadluafile{t-contests}

%D \macros{loadSuite}
%D
%D \showsetup{loadSuite}

\def\loadSuite#1{\directlua{thirddata.contests.lunatest.loadSuite('#1.lua')}}

\protect \endinput
-
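For context, the \ctxloadluafile{t-contests} line above loads the Lua
side of the module, which defines the thirddata.contests.lunatest
table that \loadSuite calls into. A hypothetical sketch of that file
(the suite-loading logic is an assumption about how the module could
wrap lunatest, not lunatest's documented API):

```
-- hypothetical sketch of t-contests.lua
thirddata          = thirddata          or { }
thirddata.contests = thirddata.contests or { }

local contests    = thirddata.contests
contests.lunatest = contests.lunatest or { }

function contests.lunatest.loadSuite(fileName)
    -- load the test suite file so its tests become available
    local ok, err = pcall(dofile, fileName)
    if not ok then
        texio.write_nl("contests > could not load suite " .. fileName
            .. ": " .. tostring(err))
    end
end
```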

The contents of t-contests.xml is:
--------------------------------------

<?xml version="1.0" encoding="UTF-8"?>

<cd:interface xmlns:cd="http://www.pragma-ade.com/commands"
              name="context" language="en" version="2010.06.21">

  <cd:command name="loadSuite" file="t-contests.mkiv">
    <cd:arguments>
      <cd:content/>
    </cd:arguments>
  </cd:command>

</cd:interface>

--------------------------------------

(The archive stripped the XML markup from this listing; the above is a
reconstruction from the surviving attributes, so the exact elements
inside cd:command may differ from my actual file.)

Any and all assistance would be greatly appreciated.

Regards,
Stephen Gaito