thomelane commented on a change in pull request #15343: Revise Symbol tutorial
URL: https://github.com/apache/incubator-mxnet/pull/15343#discussion_r297301068
##########
File path: docs/tutorials/basic/symbol.md
##########
@@ -429,18 +356,93 @@ print({'input':arg, 'output':out})
### Variable Sharing
-To share the contents between several symbols,
-we can bind these symbols with the same array as follows:
+To share the contents between several symbols, we can bind these symbols with
+the same array as follows:
```python
a = mx.sym.Variable('a')
b = mx.sym.Variable('b')
b = a + a * a
-data = mx.nd.ones((2,3))*2
+data = mx.nd.ones((2,3)) * 2
ex = b.bind(ctx=mx.cpu(), args={'a':data, 'b':data})
ex.forward()
ex.outputs[0].asnumpy()
```
+### Weight tying
+
+You can use the same principle to tie the weights of different layers: the two
+`FullyConnected` layers in the full example below share the same weights and
+biases but process different data.
Review comment:
Could you explain a little more about where to look or what's going on? It's
quite a long code block.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services