Non-commercial licenses aren't open source by the OSI open source
definition https://opensource.org/osd-annotated (see points 5 and 6).
I think it's important that we don't use the term "open source" for a
license unless it fits that definition, and, ideally, is OSI approved.
There are a lot of
The license they chose is open source, but it just isn't readily compatible
with OSI approved licenses.
I was recently surprised to find out that CC-BY isn't even compatible:
FWIW the license they chose (CC-BY-NC) isn't actually open source. But
at least the code is there if you want to run it.
Aaron Meurer
On Thu, Apr 16, 2020 at 3:50 AM S.Y. Lee wrote:
They have opened the source code and the dataset
https://github.com/facebookresearch/SymbolicMathematics
On Saturday, January 11, 2020 at 2:25:40 AM UTC+9, Aaron Meurer wrote:
For those who didn't see, the final paper was posted with many updates
https://arxiv.org/abs/1912.01412. The newest version addresses some of
the things that were discussed here, and makes more use of SymPy,
including demonstrating some integrals that SymPy cannot solve, as
well as making it
(This mail is copied from my response on the Maxima mailing list.)
My opinion on this paper:
First, their dataset (section 4.1) could be greatly improved using
existing integration theory. The Risch algorithm shows that integrating
any elementary function reduces to 3 cases: transcendental (only
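SymPy ships a partial implementation of this algorithm, which makes the point concrete: for the transcendental case it either produces an elementary antiderivative or proves none exists. A minimal sketch (the `risch_integrate` and `NonElementaryIntegral` names are real SymPy API; the example integrands are my own):

```python
# Sketch of the case analysis the Risch algorithm performs, using
# SymPy's partial implementation in sympy.integrals.risch.
from sympy import symbols, exp

from sympy.integrals.risch import risch_integrate, NonElementaryIntegral

x = symbols('x')

# A purely transcendental integrand with an elementary antiderivative:
# the algorithm finds it.
elem = risch_integrate(exp(x) / (1 + exp(x)), x)
print(elem)

# exp(x**2) provably has no elementary antiderivative; rather than
# failing silently, risch_integrate returns an unevaluated
# NonElementaryIntegral, i.e. a proof of non-integrability in
# elementary terms.
result = risch_integrate(exp(x**2), x)
print(isinstance(result, NonElementaryIntegral))
```

This decision procedure (answer or proof of impossibility) is exactly what a purely statistical model cannot give you.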
On 28/09/2019 14:27, Oscar Benjamin wrote:
Neural nets are trained for a particular statistical distribution of
inputs and in the paper they describe their method for generating a
particular ensemble of possibilities. There might be something
inherent about the examples they give that means they are all solved
using a particular approach.
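The "particular ensemble" concern is easy to see in the paper's backward generation trick: sample a random expression F, differentiate it, and record (F', F) as an (integrand, antiderivative) pair. A toy sketch of the idea (the sampling scheme below is my own stand-in, not the paper's generator, which is precisely why the resulting input distribution is arbitrary):

```python
# Toy version of "backward" dataset generation: differentiation is
# cheap, so random antiderivatives yield integration training pairs.
# The choice of ATOMS, OPS, and depth fixes the statistical
# distribution the net will be trained on.
import random
from sympy import symbols, sin, cos, exp, simplify

x = symbols('x')

ATOMS = [x, x**2, sin(x), cos(x), exp(x)]
OPS = [lambda a, b: a + b, lambda a, b: a * b]

def random_expr(depth=2):
    """Build a small random expression tree over ATOMS and OPS."""
    if depth == 0:
        return random.choice(ATOMS)
    op = random.choice(OPS)
    return op(random_expr(depth - 1), random_expr(depth - 1))

def backward_pair():
    """Return (integrand, antiderivative) by differentiating."""
    F = random_expr()
    return simplify(F.diff(x)), F

random.seed(0)
integrand, antiderivative = backward_pair()
print(integrand, "->", antiderivative)
```

Any bias in `random_expr` (here: only sums and products of five atoms) becomes a bias in what the trained model can integrate.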
Their paper appears to be an attempt at applying the transformer model
used for language translation to symbolic math.
There is a Jupyter notebook with an example on how to create a translator
from Portuguese to English using the transformer model:
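The bridge between the two settings is that the paper serializes expressions as token sequences in prefix (Polish) notation, so an expression tree becomes a "sentence" the transformer can consume. A minimal encoder over SymPy's expression tree (the function name `to_prefix` is mine, not the paper's code):

```python
# Flatten a SymPy expression tree into a prefix token list, the same
# kind of serialization the paper feeds to its seq2seq model.
from sympy import symbols, sin, exp

x = symbols('x')

def to_prefix(expr):
    """Recursively emit operator-first tokens for a SymPy expression."""
    if not expr.args:              # leaf: a Symbol or a number
        return [str(expr)]
    tokens = [type(expr).__name__]  # operator token first ...
    for arg in expr.args:
        tokens += to_prefix(arg)    # ... then each operand in turn
    return tokens

print(to_prefix(sin(x) + exp(x) * x))
```

Prefix notation needs no parentheses, so the token sequence determines the tree uniquely, which is what makes decoding the model's output back into an expression straightforward.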
On Fri, Sep 27, 2019 at 11:56 PM Ondřej Čertík wrote:
On Fri, Sep 27, 2019, at 12:48 PM, Aaron Meurer wrote:
> There's a review paper for ICLR 2020 on training a neural network to
> do symbolic integration. They claim that it outperforms Mathematica by
> a large margin. Machine learning papers can sometimes make overzealous
> claims, so scepticism is
Integration by parts definitely needs ML.
From: Aaron Meurer
Sent: 28 September 2019 00:18
To: sympy
Subject: [sympy] Symbolic integrator using a neural network
There's a review paper for ICLR 2020 on training a neural network to
do symbolic integration. They claim that it outperforms Mathematica by
a large margin. Machine learning papers can sometimes make overzealous
claims, so scepticism is in order.
https://openreview.net/pdf?id=S1eZYeHFDS
They don't