Yes, rank 0 is a valid assumption. Sort of:

J used to have two derivative words: d. (rank 0) and D. (Jacobian).

As near as I can tell, the library only replaces d. and does not yet
provide a D. implementation.
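
For example (in J versions where d. was still built in):

   ((^&2) d. 1) 3   NB. first derivative of square, evaluated at 3
6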

That said... I would not think of this as a Mathematica competitor --
more like an early version of Macsyma.

-- 
Raul

On Fri, Sep 30, 2022 at 12:38 AM Elijah Stone <elro...@elronnd.net> wrote:
>
> I have no control over the calculus addon, but I note the equivalent treatment
> of &. and &.:, and would therefore like to ask a clarifying question.  Is it
> assumed that all verbs to be derived are applied monadically at rank 0?  I can
> imagine an implementation of multivariate calculus would be interesting, but
> that may be out of scope.  (But then, you mention neural networks, so maybe
> not?  I am very ignorant about those, but I understand gradient descent is
> usually done in a high number of dimensions.)
>
> The simplification PR is also interesting.  It hints that what is actually
> wanted is a J computer algebra system: the deriver and integrator should
> generate expressions that are as messy as they would like to be, and then a
> simplifier should make the results look nice.  And all three of those should
> just be components of a larger system.
>
> I do not think there are resources to build anything approaching the level of
> Mathematica, but it is something to think about.  A good start would be to
> change the representation to something more amenable to machine computation
> than J gerunds, generating J verbs only for final display to the user.  I
> suggest the following representation, which I have had great success with thus
> far in my J compiler: represent an expression with a dataflow graph, in two
> parts.  The first is a pair of integer vectors, 'from' and 'to', indicating
> the graph structure, such that, for any i, there is an edge going from node
> i{from to node i{to.  The second is a collection of lists of node attributes.
> Ordering is expressed implicitly in the order of the edges; for instance, in
> x+y, the edge going from + to x must appear earlier in from and to than the
> edge going from + to y.  To begin with, a node can have only a name as an
> attribute (or a noun value, if it is known to be constant).
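>
> As a minimal sketch, x+y under this representation (the names here are
> illustrative only, not from any existing code):
>
>    NB. nodes: 0 is +, 1 is x, 2 is y
>    names =: ;: '+ x y'   NB. one attribute list: node names
>    from  =: 0 0          NB. edge i runs from node i{from ...
>    to    =: 1 2          NB. ... to node i{to
>    NB. edge order encodes argument order: the edge from + to x
>    NB. precedes the edge from + to y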
>
> This representation is well-suited to concise, efficient, and parallel
> traversal using /. .  Transformations on the graph can be represented as
> deltas returned by the key function.  I may write more on this in the future.
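>
> For instance (a sketch of the idea, not existing code), key can group
> the edges by source node, yielding every node's children at once:
>
>    from </. to   NB. boxed child lists, one per distinct source node
> +---+
> |1 2|
> +---+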
>
> Bringing the system to the point where people can write patterns (such as
> x ←→ 0+x, or (f+g) deriv n ←→ (f deriv n)+(g deriv n)) would make a huge
> difference, as it would open up the floor for contributions from anybody.
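>
> One possible encoding of such patterns (purely hypothetical): a table
> of LHS;RHS pairs of J sentences, each to be parsed into the graph form
> above before matching:
>
>    rules =: ('0+x' ; 'x') , ('x*1' ; 'x') ,: ('x-x' ; '0')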
>
> I would also suggest the use of e-graphs (https://egraphs-good.github.io/), as
> it seems to me that they are uniquely suited to computer algebra systems, and
> in particular that they reduce the work required to build a decent one.  But I
> have not yet given any thought to what a good representation of e-graphs
> would look like in J.
>
> I don't mean to impose any work on anybody.  These are just my thoughts on
> what it would take to make the calculus addon really amazing, which I do not
> have the time or energy to execute on myself right now; but if you (or anybody
> else) are interested in taking it over and exploring further, I am happy to
> discuss more.
>
>   -E
>
> On Thu, 29 Sep 2022, Jan-Pieter Jacobs wrote:
>
> > Hi folks,
> >
> > Mostly as a heads-up: I just submitted a pull request for the
> > math/calculus addon which adds support for hooks, under, and NVV
> > forks, and fixes some bugs.
> >
> > You can find it here: https://github.com/jsoftware/math_calculus/pull/9 .
> >
> > Any comments are highly appreciated!
> > My main motivation for messing around with this addon is to:
> > a) Get to a point where I understand the code (mission accomplished, I
> > think);
> > b) Get the addon to a point where one can apply (p)deriv on anything
> > that makes sense and get a result out of it without falling back to
> > secant approximation;
> > c) In the long term, it would be great if we could get it to do
> > automatic differentiation
> > (https://en.wikipedia.org/wiki/Automatic_differentiation).  This would
> > be great for implementing neural networks that do backpropagation,
> > where you'd just have to write a trivial line of J and automatically
> > get gradients out by applying an adverb/conjunction (a rough
> > illustration follows after this list);
> > d) In the even longer term, something like symbolic J, where you can
> > simplify, expand and reduce J verbs as one does with maths in a
> > computer algebra system.  Having the machinery to do such manipulations
> > could also let the user convert J code to run on a GPU, via the
> > ArrayFire interface, for instance.
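> >
> > To give a flavour of what forward-mode AD means (a minimal
> > dual-number sketch, not the addon's actual mechanism): carry
> > (value, derivative) pairs through the computation.
> >
> >    NB. dual-number arithmetic on (a,a') pairs
> >    dplus =: +   NB. sum rule: values and derivatives add
> >    dmul  =: 4 : '(({.x) * {.y) , (({.x) * {:y) + ({:x) * {.y'  NB. product rule
> >    NB. derivative of x*x at x=3: seed with the dual 3 1
> >    (3 1) dmul 3 1
> > 9 6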
> >
> > But this is too far out for me to do in a reasonable amount of time.
> > All help is welcome, especially from people who know what they're
> > doing :).
> >
> > PS: Raul's pull request, which simplifies overly complicated
> > derivatives, is also still pending at:
> > https://github.com/jsoftware/math_calculus/pull/5/files
> >
> > Best regards,
> > Jan-Pieter
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
