Hi Matthew & Juan,
Thank you for the kind words and for the feedback! I totally agree
with what you are suggesting, so I will just provide a bit of context
on why it was presented that way.
First of all, when talking about type arithmetic I suggest referring
to the latest presentation from the summit rather than the one from
the Tensor Typing Meetings
(https://docs.google.com/presentation/d/1NZzZfbUCXi78LDDe8C3hAaK9on8ZiX3Z2k4BlZXwsJ8/edit).
There I intentionally use the +, -, *, // syntax rather than the
Add[] one. Moreover, I also write 1 + A rather than Literal[1] + A,
assuming that we could write a PEP to enable this behavior if needed;
as Matthew said, it seems feasible based on what was said at the
summit.
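To make the contrast concrete, here is a rough sketch of the two
styles (purely illustrative: no type checker accepts the infix form
today, and Tensor, Height, and Width are made-up names):

    # Prefix style, as in the Tensor Typing meeting (Pyre prototype):
    x: Tensor[Add[Height, Literal[2]], Width]

    # Infix style, as in the summit slides (would need a PEP, both for
    # the operators and for writing 2 instead of Literal[2]):
    x: Tensor[Height + 2, Width]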
So yes, we totally agree that it is important to make the syntax as
simple as possible, especially because expressions can get quite
complicated (e.g. convolutions; see the sketch after this paragraph).
Nonetheless, I will also answer the question of "Why, in the
presentation at the Tensor Typing meeting, did you use the prefix
operators (Add) and not support Subtract directly?"
The answer is that the goal was to show what was technically possible
to achieve with type arithmetic, and syntax does not matter in that
regard. Since we knew that the syntax could change at any point later
on, we just went for the prefix one after a quick 30-minute
discussion in Pyre. However, the implementation for doing A + B
instead of Add[A, B] would have been just as complex, and can be
changed with a few lines of code.
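To give a sense of how hairy these expressions get, here is a sketch
of the output length of a 1-D convolution in both styles (Tensor, N,
K, and S are illustrative names, and FloorDiv is a hypothetical
prefix operator, named by analogy with Add and Multiply):

    # Infix: length of a 1-D convolution with kernel size K, stride S:
    Tensor[(N - K) // S + 1]

    # The same expression in prefix style:
    Tensor[Add[FloorDiv[Subtract[N, K], S], Literal[1]]]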
About Subtract: it was, once again, because we did not need a custom
Subtract operator to show what type arithmetic can do. However,
supporting it would have been a matter of a few lines in Pyre; I even
had a PR ready, but we decided to make as few changes as possible,
since it would not add any new functionality. After all, the idea was
to show how to bring a big feature like type arithmetic to the
language with as few changes as possible.
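For reference, the workaround Juan alludes to below spells
subtraction through Add and Multiply, which is exactly why a
dedicated Subtract reads so much better (Tensor, A, and B are
illustrative names):

    # Without a dedicated Subtract operator:
    x: Tensor[Add[A, Multiply[Literal[-1], B]]]

    # With the few-line Subtract operator:
    x: Tensor[Subtract[A, B]]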
Best,
Alfonso.
Quoting Matthew Rahtz <mra...@google.com>:
Hi Juan
Thanks for the kind words!
why isn’t there a Subtract operator?
I /think/ this is just because the implementation in Pyre is
still an early prototype. Alfonso?
is there a prospect for using actual operators and actual literals here?
That's a really good question. We're optimistic that actual
literals should be possible - we assumed there was some subtle
reason we'd have to use Literal, but when we briefly talked about it
at PyCon, Guido was like "Nope, you should just write a PEP for
that". As for actual operators - I can't think of any
/technical/ reason this shouldn't be possible - when it comes time
to write a PEP for type arithmetic it's definitely something we
should propose. Alfonso, any thoughts?
Best,
Matthew
On Mon, 14 Jun 2021 at 03:51, Juan Nunez-Iglesias
<j...@fastmail.com> wrote:
Thank you so much Matthew, Pradeep, and everyone else helping with
this effort!
And thanks Matthew for sharing the slides — very informative!
I’ve asked Sebastian for the recording but in the meantime I
thought I’d ask a couple of questions about the syntax:
* why isn’t there a Subtract operator? Subtract[A, B] seems much
more readable than Add[A, Multiply[Literal[-1], B]].
* is there a prospect for using actual operators and actual
literals here? I think a lot of the reticence about Python typing
comes from the annotations being so difficult to read. There have
been some great improvements recently, with
“Optional[Tuple[Union[float, int]]]” becoming “tuple[float | int] |
None”, and I hope we can leapfrog the awkward phase with this tensor
typing stuff. Again, A - B is much more readable than Subtract[A, B],
and B - 2 is much more readable than Add[B, Literal[-2]].
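Concretely, the improvement I mean is the one from PEP 585 and PEP
604, which is real, runnable Python:

    from typing import Optional, Tuple, Union

    # Before (Python 3.8 and earlier):
    def f(x: Optional[Tuple[Union[float, int]]]) -> None: ...

    # After PEP 585 / PEP 604 (Python 3.10+):
    def g(x: tuple[float | int] | None) -> None: ...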
Other than that, love the work, thank you again!
Juan.
On 11 Jun 2021, at 5:08 am, Matthew Rahtz <mra...@google.com> wrote:
Some links I promised to send around:
* https://github.com/deepmind/tensor_annotations
which has a bunch of experimental shape-aware stubs for TensorFlow
and JAX using 'semantic' axis labels
* https://github.com/pradeep90/pytorch_examples
ditto for TensorFlow and PyTorch using value-based axis labels (see
the sketch after this list for the difference between the two styles)
* https://arxiv.org/pdf/2102.13254.pdf Similar work
in TensorFlow for Swift
* Slides[1] for our presentation
* Tensor typing meeting details[2]
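To illustrate the difference between the two labeling styles above,
here is a rough sketch (the names are illustrative, not either
library's exact API):

    # 'Semantic' axis labels: axes are named types.
    def reduce_time(x: Tensor2[Batch, Time]) -> Tensor1[Batch]: ...

    # Value-based axis labels: axes are literal sizes.
    def reduce_time(
        x: Tensor2[Literal[32], Literal[1024]]
    ) -> Tensor1[Literal[32]]: ...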
Thanks again, everyone!
On Wed, 9 Jun 2021 at 22:28, Sebastian Berg
<sebast...@sipsolutions.net> wrote:
On Tue, 2021-06-08 at 11:40 -0700, Stefan van der Walt wrote:
On Tue, Jun 8, 2021, at 03:23, Juan Nunez-Iglesias wrote:
> Hello! Any chance this could be recorded for those of us in useless
> time zones? 😂
I added a note to the agenda; we'll ask the speakers and record if
they don't mind.
We have made a recording, but I did not plan on making it available
publicly. Please contact me directly if you are interested.
Cheers,
Sebastian
Links:
------
[1] https://docs.google.com/presentation/d/1kLS-bz1ZmJIFizZExkm8N9jqAj2oOJzTmUaL8INGefc/edit?usp=sharing&resourcekey=0-eFDyGvcEeeHzF62x6Cwl6Q
[2] https://docs.google.com/document/d/1oaG0V2ZE5BRDjd9N-Tr1N0IKGwZQcraIlZ0N8ayqVg8/edit
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion