On Sat, 6 Oct 2018, Alan Braslau wrote:

> On Sat, 6 Oct 2018 13:06:18 -0400 (EDT)
> Aditya Mahajan <adit...@umich.edu> wrote:

>> In my opinion, a better long-term option is to write a jupyter client in
>> lua that can be called by context. Then we can easily interface with all
>> languages that provide a jupyter kernel
>> (https://github.com/jupyter/jupyter/wiki/Jupyter-kernels).
>>
>> The interface of a jupyter client is documented here:
>> https://jupyter-client.readthedocs.io/en/stable/index.html. It seems
>> relatively straightforward (send a JSON message and receive a JSON
>> message). Translating the JSON messages to ConTeXt should also be easy.
>> Is there anyone who wants to play around trying to implement this?

> jupyter runs python code.

> Have you ever tried doing any really heavy data analysis using jupyter? My experience is that it chokes on large data sets... So why write lua code to call a jupyter kernel running python?

That's why I want to write a jupyter client in lua (so that there is no python code involved).

> Would it not make more sense to develop the code directly in lua in this case?

Yes, but let me try to explain. When creating homework assignments for a
course I teach, I often have documents like the following:

    \starttext
    Consider an LTI system with the transfer function
    \placeformula[eq:sys]
    \startformula
      H(s) = \frac{1}{s^2 + 2s + 2}
    \stopformula
    The step response of the system is shown in Figure \in[fig:plot]. Note
    that the step response settles to a final value of $0.5$.
    \startplacefigure
      [reference=fig:plot,
       title={Step response of the LTI system described in \in[eq:sys]}]
      \externalfigure[step-response.pdf]
    \stopplacefigure
    \stoptext

What I want to do is to be able to change the transfer function (given in the
formula) and regenerate the plot. Something like the following:

    \defineLTIsystem[example][num={1}, den={1,2,2}]

    \starttext
    Consider an LTI system with the transfer function
    \placeformula[eq:sys]
    \startformula
      H(s) = \TF[example]
    \stopformula
    The step response of the system is shown in Figure \in[fig:plot]. Note
    that the step response settles to a final value of
    $\calculate{lim(TF[example], s, 0)}$.
    \startplacefigure
      [reference=fig:plot,
       title={Step response of the LTI system described in \in[eq:sys]}]
      \STEP[example]
    \stopplacefigure
    \stoptext

Now, it is possible to write the code to generate the step response in
Lua/MetaPost; a sketch of what that looks like is below. But it quickly gets
tiring, and one essentially ends up creating a domain-specific computational
library in Lua.
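
For concreteness, here is a minimal, hand-rolled sketch of that route for the
example above: a crude Euler integration of the ODE behind
H(s) = 1/(s^2 + 2s + 2), emitting the samples as a MetaPost path. The function
name is made up, and everything is hard-coded to this one transfer function.

    -- step response of H(s) = 1/(s^2 + 2s + 2), i.e. of the ODE
    -- y'' + 2y' + 2y = u(t) with a unit step input, starting at rest
    local function stepresponse(tmax, dt)
      local t, y, v = 0, 0, 0                  -- time, y, y'
      local points  = { "(0,0)" }
      while t < tmax do
        local a = 1 - 2*v - 2*y                -- y'' = u - 2y' - 2y with u = 1
        v = v + a*dt
        y = y + v*dt
        t = t + dt
        points[#points+1] = string.format("(%.4f,%.4f)", t, y)
      end
      return table.concat(points, "--")        -- a MetaPost path literal
    end

    -- e.g. splice stepresponse(10, 0.01) into a MetaPost figure and draw it

Even this toy version hard-codes the coefficients of a single transfer
function; supporting arbitrary num/den pairs, Bode plots, discretization, and
so on is exactly where the domain-specific-library creep sets in.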

An alternative approach is to use an existing library written in some other
programming language (say Matlab or R or Julia or whatever). It is possible to
do so using the `filter` module (plus some Lua code). In this case, the user
simply calls "context filename" and ConTeXt macros take care of calling an
external program (say Matlab) to generate the plot and do the algebraic
calculations.
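
Mechanically, the round trip made for every snippet boils down to something
like the following sketch (this is just the shape of the mechanism, not the
module's actual code; the helper and file names are made up):

    -- write the snippet to a file, run the external program on it,
    -- and read back whatever it produced
    local function runthrough(program, code)
      local input, output = "filter-input.tmp", "filter-output.tmp"
      local f = assert(io.open(input, "w"))
      f:write(code)
      f:close()
      os.execute(program .. " " .. input .. " > " .. output)
      f = assert(io.open(output, "r"))
      local result = f:read("*a")
      f:close()
      return result
    end

    -- every call pays the full startup cost of the external program:
    -- runthrough("Rscript", "...") restarts R for each snippet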

Another approach, taken by programs like Sweave and Knitr, is to first run the
document through R (or some other programming language). These are typically
written for LaTeX. So code between \begin{Rcode} .. \end{Rcode} and inside
\Rexp{...} (or something similar; I haven't used R in a decade) is treated as
R code and everything else is treated as comments. The evaluated file can then
be run through `latex` or `context` or any other typesetting program. The
drawback of this approach is that not all programming languages have such a
preprocessor.
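
The evaluate-and-splice step itself is easy to mimic. Here is a toy Lua
version, with Lua itself standing in for R so that the sketch stays
self-contained (the \Rexp pattern is only as robust as this example needs):

    -- replace every \Rexp{...} with the value of the expression inside it
    local function preprocess(source)
      return (source:gsub("\\Rexp%{(.-)%}", function(expr)
        local chunk = assert(load("return " .. expr))
        return tostring(chunk())
      end))
    end

    print(preprocess([[The answer is \Rexp{6*7}.]]))
    -- prints: The answer is 42.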

Now, what I want to do (at some stage) is to extend the functionality of the
filter module to call jupyter kernels. So, instead of passing messages between
ConTeXt and the external program through text files, the messages can be
passed as JSON objects (over sockets, I believe). The advantage is that you
avoid repeatedly restarting the external program (which is what the filter
module currently does). A rough sketch of one side of such an exchange is
below.
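
To make that concrete, here is a minimal, untested sketch of sending an
execute_request to a kernel's shell socket. It assumes the lzmq and dkjson
libraries (neither ships with ConTeXt), a made-up port number, and a kernel
started with an empty signature key ("key": "" in its connection file) so
that the HMAC frame can be left empty. A real client would also sign messages
and read the actual output (stream, display_data, ...) from the separate
IOPub socket.

    local zmq  = require "lzmq"
    local json = require "dkjson"

    local ctx   = zmq.context()
    local shell = ctx:socket(zmq.DEALER)
    shell:connect("tcp://127.0.0.1:54321")   -- shell_port from the connection file

    -- dkjson encodes an empty table as []; the protocol wants {}
    local function emptyobject()
      return setmetatable({}, { __jsontype = "object" })
    end

    local function message(msg_type, content)
      return {
        header = {
          msg_id   = string.format("%d-%d", os.time(), math.random(999999)),
          username = "context",
          session  = "context-session",
          msg_type = msg_type,
          version  = "5.3",
        },
        parent_header = emptyobject(),
        metadata      = emptyobject(),
        content       = content,
      }
    end

    local function send(msg)
      shell:send_all {
        "<IDS|MSG>",                         -- frame delimiter
        "",                                  -- HMAC signature (empty key assumed)
        json.encode(msg.header),
        json.encode(msg.parent_header),
        json.encode(msg.metadata),
        json.encode(msg.content),
      }
    end

    send(message("execute_request", {
      code             = "1 + 1",
      silent           = false,
      store_history    = false,
      user_expressions = emptyobject(),
      allow_stdin      = false,
    }))

    local frames = shell:recv_all()          -- the execute_reply, frame by frame
    local reply  = json.decode(frames[#frames])
    print(reply.status)                      -- "ok" on success

Because the kernel stays alive between requests, a document with fifty
snippets pays the interpreter's startup cost once instead of fifty times,
which is the whole point of switching away from the file-based round trip.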

> The one thing that python (and jupyter) brings, or R for that matter, is libraries of calculation routines. These can be quite sophisticated, some efficient and some not so efficient. My approach has always been to write my own routines or to adapt algorithms; at least then I know what the calculation is actually doing. Of course, this means that I spend time redoing what might have been done elsewhere, but the variety of routines that I actually use is rather small.

If you have the time (and the expertise) then this is a good strategy. For me,
this is not always the case.

Aditya