Re: AWS contributing ONNX-MXNet

2017-11-20 Thread Steffen Rochel
We modified the blog post and acknowledged Zhi's contributions.
It now reads: "Special thanks to the dmlc/nnvm community and Zhi Zhang,
whose ONNX code was used as a reference for this implementation."

Regards,
Steffen

On Fri, Nov 17, 2017 at 10:36 AM Tianqi Chen 
wrote:

> I have been watching this issue for around two days. Here are my two cents.
>
> First of all, there is no legal constraint that forces you to do anything,
> but as you said (and I fully agree), we need to assume others have the best
> intentions and extend goodwill.
>
> - It is great to reuse code; that is what open source is about.
>
> - It is inarguably true that Zhi created and maintained most of the
> original code, with only minor contributions from other contributors. I
> think Zhi should at least be personally acknowledged (he deserves more
> than that).
>   - As an analogy, you are not the only one who created the onnx-mxnet
> repo, but nevertheless you are listed as the author, instead of it simply
> being credited to AWS.
>
> - I would recommend you start with the nnvm files as your first commit,
> then apply your changes on top.
>  - This takes around five minutes: copy the files from nnvm, commit,
> overwrite them with your new files, commit again.
>  - It makes clear exactly what changes were made.
>  - It makes it easier to adopt new patches when there is a bug fix in
> nnvm, or vice versa.
>
>
> - Please maintain it, instead of leaving the job to the community. Every
> great prize comes with great responsibility: it is great that you pushed
> out the repo and take the credit for doing it. The deep learning
> serializable-IR landscape is still unstable, and it demands real effort to
> maintain the code, keep up with breaking changes, and add operator
> coverage.
>
> Congrats on the release
> Tianqi
>
>
>
> On Thu, Nov 16, 2017 at 2:04 PM, Lupesko, Hagay  wrote:
>
> > Hey folks,
> >
> >
> >
> > Today AWS announced contributing ONNX-MXNet, an open source Python
> > package that imports ONNX models into MXNet. @roshrini and I (@lupesko)
> > have worked on the code, which is now publicly available [1], and
> > published a blog post demonstrating usage of the package [2]. Special
> > thanks to the dmlc/nnvm team, whose ONNX code was used as a reference
> > for this implementation.
> >
> >
> >
> > What is ONNX?
> >
> > ONNX is an open source format for encoding deep learning models. ONNX
> > defines a format for storing a neural network's computational graph, as
> > well as a storage format for the operators used within such a graph. For
> > more details, check out onnx.ai [3].
> >
> >
> >
> > Why I think ONNX is important for MXNet
> >
> > ONNX is an emerging standard that holds a lot of potential for deep
> > learning practitioners. With ONNX, people can create and train a network
> > with framework A, and deploy it for inference with framework B. The blog
> > post we published demonstrates taking a Super Res (super-resolution)
> > model trained with PyTorch and importing it into the MXNet Symbolic API
> > for inference. I strongly believe that adopting ONNX early on adds value
> > for deep learning practitioners, and thus supporting it adds value for
> > MXNet as well.
> >
> >
> >
> > As for next steps, porting the functionality and code into MXNet itself
> > seems the logical one.
> >
> > Would love to get the community's feedback and contributions!
> >
> >
> >
> > [1] https://github.com/onnx/onnx-mxnet
> >
> > [2] https://aws.amazon.com/blogs/ai/announcing-onnx-support-for-apache-mxnet/
> >
> > [3] https://onnx.ai
> >
> >
>


Re: AWS contributing ONNX-MXNet

2017-11-17 Thread Tianqi Chen
I have been watching this issue for around two days. Here are my two cents.

First of all, there is no legal constraint that forces you to do anything,
but as you said (and I fully agree), we need to assume others have the best
intentions and extend goodwill.

- It is great to reuse code; that is what open source is about.

- It is inarguably true that Zhi created and maintained most of the
original code, with only minor contributions from other contributors. I
think Zhi should at least be personally acknowledged (he deserves more
than that).
  - As an analogy, you are not the only one who created the onnx-mxnet
repo, but nevertheless you are listed as the author, instead of it simply
being credited to AWS.

- I would recommend you start with the nnvm files as your first commit,
then apply your changes on top (see the sketch after these bullet points).
 - This takes around five minutes: copy the files from nnvm, commit,
overwrite them with your new files, commit again.
 - It makes clear exactly what changes were made.
 - It makes it easier to adopt new patches when there is a bug fix in
nnvm, or vice versa.


- Please maintain it, instead of leaving the job to the community. Every
great prize comes with great responsibility: it is great that you pushed
out the repo and take the credit for doing it. The deep learning
serializable-IR landscape is still unstable, and it demands real effort to
maintain the code, keep up with breaking changes, and add operator
coverage.
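
A minimal sketch of that two-commit workflow (the paths and commit messages
here are illustrative, not the actual history):

    # Commit 1: the unmodified nnvm baseline
    cp /path/to/nnvm/python/nnvm/frontend/onnx.py onnx_mxnet/import_onnx.py
    git add onnx_mxnet/import_onnx.py
    git commit -m "Import ONNX frontend from dmlc/nnvm as baseline"

    # Commit 2: overwrite with the modified version
    cp /path/to/modified/import_onnx.py onnx_mxnet/import_onnx.py
    git add onnx_mxnet/import_onnx.py
    git commit -m "Apply onnx-mxnet changes on top of the nnvm baseline"

The diff of the second commit then shows exactly what changed relative to
nnvm.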

Congrats on the release
Tianqi



On Thu, Nov 16, 2017 at 2:04 PM, Lupesko, Hagay  wrote:

> Hey folks,
>
>
>
> Today AWS announced contributing ONNX-MXNet, an open source Python package
> that imports ONNX models into MXNet. @roshrini and I (@lupesko) have worked
> on the code, which is now publicly available [1], and published a blog post
> demonstrating usage of the package [2]. Special thanks to the dmlc/nnvm
> team, whose ONNX code was used as a reference for this implementation.
>
>
>
> What is ONNX?
>
> ONNX is an open source format for encoding deep learning models. ONNX
> defines a format for storing a neural network's computational graph, as
> well as a storage format for the operators used within such a graph. For
> more details, check out onnx.ai [3].
>
>
>
> Why I think ONNX is important for MXNet
>
> ONNX is an emerging standard that holds a lot of potential for deep
> learning practitioners. With ONNX, people can create and train a network
> with framework A, and deploy it for inference with framework B. The blog
> post we published demonstrates taking a Super Res (super-resolution) model
> trained with PyTorch and importing it into the MXNet Symbolic API for
> inference. I strongly believe that adopting ONNX early on adds value for
> deep learning practitioners, and thus supporting it adds value for MXNet
> as well.
>
>
>
> As for next steps, porting the functionality and code into MXNet itself
> seems the logical one.
>
> Would love to get the community's feedback and contributions!
>
>
>
> [1] https://github.com/onnx/onnx-mxnet
>
> [2] https://aws.amazon.com/blogs/ai/announcing-onnx-support-for-apache-mxnet/
>
> [3] https://onnx.ai
>
>


Re: AWS contributing ONNX-MXNet

2017-11-17 Thread Hen
It was contributed to 'ONNX', which is a joint Facebook/Microsoft project
of some kind (http://onnx.ai/).

Hen

On Fri, Nov 17, 2017 at 12:35 AM, Isabel Drost-Fromm 
wrote:

>
>
> On 16 November 2017 at 23:04:05 CET, "Lupesko, Hagay" <
> lupe...@gmail.com> wrote:
> >Today AWS announced contributing ONNX-MXNet,
>
>
> Just for clarification: where is this package going to be / intended to
> be contributed? Or do you mean "published under a free and open source
> license"?
>
> Isabel
>
> --
> This message was sent from my Android device with K-9 Mail.
>


Re: AWS contributing ONNX-MXNet

2017-11-17 Thread Isabel Drost-Fromm


On 16 November 2017 at 23:04:05 CET, "Lupesko, Hagay"  wrote:
>Today AWS announced contributing ONNX-MXNet,


Just for clarification: where is this package going to be / intended to be
contributed? Or do you mean "published under a free and open source license"?

Isabel

-- 
This message was sent from my Android device with K-9 Mail.


Re: AWS contributing ONNX-MXNet

2017-11-16 Thread Lupesko, Hagay
Thanks for your notes, Mu.

The tutorials in both nnvm and onnx-mxnet are referenced from the ONNX
PyTorch tutorial, and both code repos acknowledge that, as appropriate,
without including any specific names.
When I looked at the nnvm ONNX code, the commit history showed more than
just Tianqi and Zhi, which made it seem more appropriate to credit the
entire dmlc/nnvm community.

Hagay 

On 11/16/17, 19:49, "Mu Li"  
wrote:

Thanks for pointing it out, Hagay. I actually missed the "Special thanks to
dmlc/nnvm team" sentence. I was looking for Zhi and Tianqi's names, because
the ONNX converter was mainly done by the two of them. However, I do not
entirely agree that I "had no comments". Two days ago I actually pointed out
that the initial draft was similar to the nnvm tutorial. Hagay made a few
updates, but unfortunately, I didn't have a chance to look at the updated
version before the announcement. Otherwise, I would have left a comment
asking to put individual contributors' names on it.

On Thu, Nov 16, 2017 at 7:03 PM, Lupesko, Hagay  wrote:

> Chiming in as well.
>
> First and foremost, I agree wholeheartedly that acknowledgments are due
> when deserved. In fact, we took care to add acknowledgments in the code,
> and in the blog post for that precise reason!
> I also personally talked with Mu, to make sure these were in order and
> appropriate, and he had no comments.
> Have we missed acknowledgments? Maybe (more on that below). But why assume
> this was done intentionally?
>
> Addressing specific points (I won’t repeat Henri’s points):
> - I’m happy to take another look and see whether more files need to have
> the “ack” statement. But looking into it again, import_onnx.py [1] is the
> only one that seems to have been missed, and the ack has already been
> added. Sheng – I’ll grab some time with you Monday to discuss in detail.
> - The tutorial itself was actually referenced from PyTorch, not nnvm. This
> is acknowledged by the onnx-mxnet code, as well as the nnvm code.
> - We intentionally acknowledged an open source community (dmlc/nnvm) and
> not individuals. More people than Tianqi and Zhi worked on nnvm and ONNX;
> it is a whole community that we are thanking.
> - “I was wondering why your below email didn't include such
> acknowledgement?” – as noted by Hen, the email did include the ack.
>
> One last thing, quoting Sheng: “In general, to have a healthy community, I
> believe the right things to do would be…”
> I would stress that in order to have a healthy community, we should
> always assume others have the best intentions – this will make us a
> stronger community, one that works together, and one that is fun to be
> part of.
>
> Hagay
>
> [1] https://github.com/onnx/onnx-mxnet/blob/master/onnx_mxnet/import_onnx.py
>
> On 11/16/17, 18:06, "Hen"  wrote:
>
> On Thu, Nov 16, 2017 at 4:32 PM, Sheng Zha  wrote:
>
> > Hi Hagay,
> >
> > (cc'd Zhi, Tianqi to make sure real authors are aware)
> >
> >
> >
> > At first glance the code in the repo you shared (i.e.
> > https://github.com/onnx/onnx-mxnet) looks very familiar, so I did some
> > searching. It looks like *almost all* the code is adapted from the
> > *nnvm onnx* frontend, but the main contributor (*Zhi Zhang*, committer
> > of mxnet, and intern at AWS) from this same community was not given his
> > due credit in your email. To elaborate on why I think almost all the
> > onnx-mxnet code is from the nnvm onnx frontend:
> >
> >
> >
> > The following is the content of this repo:
> >
> > ├── LICENSE.txt
> >
> > ├── README.md
> >
> > ├── onnx_mxnet
> >
> > │   ├── __init__.py
> >
> > │   ├── common.py
> >
> > │   ├── import_helper.py
> >
> > │   ├── import_onnx.py
> >
> > │   └── tests
> >
> > │   ├── test_models.py
> >
> > │   └── test_super_resolution.py
> >
> > ├── setup.py
> >
> > ├── super_res_input.jpg
> >
> > └── super_res_output.jpg
> >
> > (Also attached a screenshot of the commit history of onnx_mxnet at the
> > moment, as well as a copy of the git package, in case a commit hash
> > mismatch happens)
> >
> >
> >
> >- Out of the 6 files under the onnx_mxnet package:
> >   - the following two files are marked as being derived from nnvm:
> > 

Re: AWS contributing ONNX-MXNet

2017-11-16 Thread Zha, Sheng
Hi Hagay,

> But why assume this was done intentionally?
Given that you mentioned you talked to Mu about this, would it be right to
assume that you have already paid sufficient attention and been extra careful
about acknowledgements?

> import_onnx.py [1] is the only one that seems to have been missed… Monday to
> discuss in detail.
Sure, happy to talk to you then. I’d like to see the reasons why the other 
three files I listed don’t deserve acknowledgement.

Given the similarity in the code, I can’t help but conclude that the code was
copied from another source, and in order to do that you must already have been
aware of its origin. So how can files be missed unintentionally? Still, I’ll
do my best to assume the best intentions behind your actions. Thanks.

Best regards,
-sz

On 11/16/17, 7:04 PM, "Lupesko, Hagay"  wrote:

Chiming in as well.

First and foremost, I agree wholeheartedly that acknowledgments are due 
when deserved. In fact, we took care to add acknowledgments in the code, and in 
the blog post for that precise reason!
I also personally talked with Mu, to make sure these were in order and
appropriate, and he had no comments.
Have we missed acknowledgments? Maybe (more on that below). But why assume
this was done intentionally?

Addressing specific points (I won’t repeat Henri’s points):
- I’m happy to take another look and see whether more files need to have
the “ack” statement. But looking into it again, import_onnx.py [1] is the only
one that seems to have been missed, and the ack has already been added. Sheng –
I’ll grab some time with you Monday to discuss in detail.
- The tutorial itself was actually referenced from PyTorch, not nnvm. This
is acknowledged by the onnx-mxnet code, as well as the nnvm code.
- We intentionally acknowledged an open source community (dmlc/nnvm) and not
individuals. More people than Tianqi and Zhi worked on nnvm and ONNX; it is
a whole community that we are thanking.
- “I was wondering why your below email didn't include such 
acknowledgement?” – as noted by Hen, the email did include the ack.

One last thing, quoting Sheng: “In general, to have a healthy community, I 
believe the right things to do would be…”
I would stress that in order to have a healthy community, we should
always assume others have the best intentions – this will make us a stronger
community, one that works together, and one that is fun to be part of.

Hagay

[1] https://github.com/onnx/onnx-mxnet/blob/master/onnx_mxnet/import_onnx.py

On 11/16/17, 18:06, "Hen"  wrote:

On Thu, Nov 16, 2017 at 4:32 PM, Sheng Zha  wrote:

> Hi Hagay,
>
> (cc'd Zhi, Tianqi to make sure real authors are aware)
>
>
>
> At first glance the code in the repo you shared (i.e.
> https://github.com/onnx/onnx-mxnet) looks very familiar, so I did some
> searching. It looks like *almost all* the code is adapted from the
> *nnvm onnx* frontend, but the main contributor (*Zhi Zhang*, committer of
> mxnet, and intern at AWS) from this same community was not given his due
> credit in your email. To elaborate on why I think almost all the
> onnx-mxnet code is from the nnvm onnx frontend:
>
>
>
> The following is the content of this repo:
>
> ├── LICENSE.txt
>
> ├── README.md
>
> ├── onnx_mxnet
>
> │   ├── __init__.py
>
> │   ├── common.py
>
> │   ├── import_helper.py
>
> │   ├── import_onnx.py
>
> │   └── tests
>
> │   ├── test_models.py
>
> │   └── test_super_resolution.py
>
> ├── setup.py
>
> ├── super_res_input.jpg
>
> └── super_res_output.jpg
>
> (Also attached a screenshot of the commit history of onnx_mxnet at the
> moment, as well as a copy of the git package, in case a commit hash
> mismatch happens)
>
>
>
>- Out of the 6 files under the onnx_mxnet package:
>   - the following two files are marked as being derived from nnvm:
>  - common.py <…>
>  - import_helper.py <…>
>   - the remaining four files that are not marked as being derived from
>   nnvm:
>  - __init__.py <…>:
>  looks 

Re: AWS contributing ONNX-MXNet

2017-11-16 Thread Mu Li
Thanks for pointing it out, Hagay. I actually missed the "Special thanks to
dmlc/nnvm team" sentence. I was looking for Zhi and Tianqi's names, because
the ONNX converter was mainly done by the two of them. However, I do not
entirely agree that I "had no comments". Two days ago I actually pointed out
that the initial draft was similar to the nnvm tutorial. Hagay made a few
updates, but unfortunately, I didn't have a chance to look at the updated
version before the announcement. Otherwise, I would have left a comment
asking to put individual contributors' names on it.

On Thu, Nov 16, 2017 at 7:03 PM, Lupesko, Hagay  wrote:

> Chiming in as well.
>
> First and foremost, I agree wholeheartedly that acknowledgments are due
> when deserved. In fact, we took care to add acknowledgments in the code,
> and in the blog post for that precise reason!
> I also personally talked with Mu, to make sure these were in order and
> appropriate, and he had no comments.
> Have we missed acknowledgments? Maybe (more on that below). But why assume
> this was done intentionally?
>
> Addressing specific points (I won’t repeat Henri’s points):
> - I’m happy to take another look and see whether more files need to have
> the “ack” statement. But looking into it again, import_onnx.py [1] is the
> only one that seems to have been missed, and the ack has already been
> added. Sheng – I’ll grab some time with you Monday to discuss in detail.
> - The tutorial itself was actually referenced from PyTorch, not nnvm. This
> is acknowledged by the onnx-mxnet code, as well as the nnvm code.
> - We intentionally acknowledged an open source community (dmlc/nnvm) and
> not individuals. More people than Tianqi and Zhi worked on nnvm and ONNX;
> it is a whole community that we are thanking.
> - “I was wondering why your below email didn't include such
> acknowledgement?” – as noted by Hen, the email did include the ack.
>
> One last thing, quoting Sheng: “In general, to have a healthy community, I
> believe the right things to do would be…”
> I would stress that in order to have a healthy community, we should
> always assume others have the best intentions – this will make us a
> stronger community, one that works together, and one that is fun to be
> part of.
>
> Hagay
>
> [1] https://github.com/onnx/onnx-mxnet/blob/master/onnx_mxnet/import_onnx.py
>
> On 11/16/17, 18:06, "Hen"  wrote:
>
> On Thu, Nov 16, 2017 at 4:32 PM, Sheng Zha  wrote:
>
> > Hi Hagay,
> >
> > (cc'd Zhi, Tianqi to make sure real authors are aware)
> >
> >
> >
> > At first glance the code in the repo you shared (i.e.
> > https://github.com/onnx/onnx-mxnet) looks very familiar, so I did some
> > searching. It looks like *almost all* the code is adapted from the
> > *nnvm onnx* frontend, but the main contributor (*Zhi Zhang*, committer
> > of mxnet, and intern at AWS) from this same community was not given his
> > due credit in your email. To elaborate on why I think almost all the
> > onnx-mxnet code is from the nnvm onnx frontend:
> >
> >
> >
> > The following is the content of this repo:
> >
> > ├── LICENSE.txt
> >
> > ├── README.md
> >
> > ├── onnx_mxnet
> >
> > │   ├── __init__.py
> >
> > │   ├── common.py
> >
> > │   ├── import_helper.py
> >
> > │   ├── import_onnx.py
> >
> > │   └── tests
> >
> > │   ├── test_models.py
> >
> > │   └── test_super_resolution.py
> >
> > ├── setup.py
> >
> > ├── super_res_input.jpg
> >
> > └── super_res_output.jpg
> >
> > (Also attached a screenshot of the commit history of onnx_mxnet at the
> > moment, as well as a copy of the git package, in case a commit hash
> > mismatch happens)
> >
> >
> >
> >- Out of the 6 files under the onnx_mxnet package:
> >   - the following two files are marked as being derived from nnvm:
> >  - common.py <…common.py#L12>
> >  - import_helper.py <…import_helper.py#L12>
> >   - the remaining four files that are not marked as being derived
> >   from nnvm:
> >  - __init__.py
> >  <…4ebc02e8f1cc523049f0928b6dbc566a93dd2f47/onnx_mxnet/__init__.py#L15>:
> >  looks like a copy from nnvm/frontend/onnx.py
> >  <…3da53e46db57c438b05fbebe8aa332ee8c5994d1/python/nnvm/frontend/onnx.py#L392>
> >  - import_onnx.py
> >  <…4ebc02e8f1cc523049f0928b6dbc566a93dd2f47/onnx_mxnet/import_onnx.py#L18>
> >  looks like a copy from 

Re: AWS contributing ONNX-MXNet

2017-11-16 Thread Mu Li
Thanks for letting the mxnet community know. Given that this new converter
is derived from the nnvm-onnx converter developed by Zhi and Tianqi (see
this link and that link), and that the nnvm-onnx converter has already been
announced on tvmlang.org, also with this tutorial, I think it is worth
*acknowledging* the contributions from our community members in the
announcement, to maintain a healthy connection.

On Thu, Nov 16, 2017 at 2:04 PM, Lupesko, Hagay  wrote:

> Hey folks,
>
>
>
> Today AWS announced contributing ONNX-MXNet, an open source Python package
> that imports ONNX models into MXNet. @roshrini and I (@lupesko) have worked
> on the code, which is now publicly available [1], and published a blog post
> demonstrating usage of the package [2]. Special thanks to the dmlc/nnvm
> team, whose ONNX code was used as a reference for this implementation.
>
>
>
> What is ONNX?
>
> ONNX is an open source format for encoding deep learning models. ONNX
> defines a format for storing a neural network's computational graph, as
> well as a storage format for the operators used within such a graph. For
> more details, check out onnx.ai [3].
>
>
>
> Why I think ONNX is important for MXNet
>
> ONNX is an emerging standard that holds a lot of potential for deep
> learning practitioners. With ONNX, people can create and train a network
> with framework A, and deploy it for inference with framework B. The blog
> post we published demonstrates taking a Super Res (super-resolution) model
> trained with PyTorch and importing it into the MXNet Symbolic API for
> inference. I strongly believe that adopting ONNX early on adds value for
> deep learning practitioners, and thus supporting it adds value for MXNet
> as well.
>
>
>
> As for next steps, porting the functionality and code into MXNet itself
> seems the logical one.
>
> Would love to get the community's feedback and contributions!
>
>
>
> [1] https://github.com/onnx/onnx-mxnet
>
> [2] https://aws.amazon.com/blogs/ai/announcing-onnx-support-for-apache-mxnet/
>
> [3] https://onnx.ai
>
>


AWS contributing ONNX-MXNet

2017-11-16 Thread Lupesko, Hagay
Hey folks,

 

Today AWS announced contributing ONNX-MXNet, an open source Python package
that imports ONNX models into MXNet. @roshrini and I (@lupesko) have worked on
the code, which is now publicly available [1], and published a blog post
demonstrating usage of the package [2]. Special thanks to the dmlc/nnvm team,
whose ONNX code was used as a reference for this implementation.

 

What is ONNX?

ONNX is an open source format for encoding deep learning models. ONNX defines
a format for storing a neural network's computational graph, as well as a
storage format for the operators used within such a graph. For more details,
check out onnx.ai [3].
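
To make the "graph plus operators" structure concrete, here is a minimal
sketch using the onnx Python package (the model file name is illustrative):

    import onnx

    # Load the serialized (protobuf-based) model and validate it.
    model = onnx.load("super_resolution.onnx")
    onnx.checker.check_model(model)

    # The computational graph is stored as a list of operator nodes.
    for node in model.graph.node:
        print(node.op_type, list(node.input), "->", list(node.output))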

 

Why I think ONNX is important for MXNet

ONNX is an emerging standard that holds a lot of potential for deep learning
practitioners. With ONNX, people can create and train a network with framework
A, and deploy it for inference with framework B. The blog post we published
demonstrates taking a Super Res (super-resolution) model trained with PyTorch
and importing it into the MXNet Symbolic API for inference. I strongly believe
that adopting ONNX early on adds value for deep learning practitioners, and
thus supporting it adds value for MXNet as well.
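
As a rough sketch of that import flow on the MXNet side (this assumes an
import_model-style entry point as shown in the blog post; the input name and
shape are illustrative):

    import mxnet as mx
    import onnx_mxnet

    # Import the ONNX model into an MXNet symbol plus a parameter dict.
    sym, params = onnx_mxnet.import_model("super_resolution.onnx")

    # Bind the symbol with the Module API and run a forward pass.
    mod = mx.mod.Module(symbol=sym, data_names=["input_0"], label_names=None)
    mod.bind(for_training=False, data_shapes=[("input_0", (1, 1, 224, 224))])
    mod.set_params(arg_params=params, aux_params=None, allow_missing=True)
    mod.forward(mx.io.DataBatch([mx.nd.ones((1, 1, 224, 224))]))
    output = mod.get_outputs()[0]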

 

As for next steps, porting the functionality and code into MXNet itself seems
the logical one.

Would love to get the community's feedback and contributions!

 

[1] https://github.com/onnx/onnx-mxnet

[2] https://aws.amazon.com/blogs/ai/announcing-onnx-support-for-apache-mxnet/

[3] https://onnx.ai