Modified: incubator/singa/site/trunk/en/docs/layer.html URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/docs/layer.html?rev=1862313&r1=1862312&r2=1862313&view=diff ============================================================================== --- incubator/singa/site/trunk/en/docs/layer.html (original) +++ incubator/singa/site/trunk/en/docs/layer.html Sat Jun 29 14:42:24 2019 @@ -18,21 +18,15 @@ - <script type="text/javascript" src="../_static/js/modernizr.min.js"></script> + + - <script type="text/javascript" id="documentation_options" data-url_root="../" src="../_static/documentation_options.js"></script> - <script type="text/javascript" src="../_static/jquery.js"></script> - <script type="text/javascript" src="../_static/underscore.js"></script> - <script type="text/javascript" src="../_static/doctools.js"></script> - <script type="text/javascript" src="../_static/language_data.js"></script> - - <script type="text/javascript" src="../_static/js/theme.js"></script> - + - <link rel="stylesheet" href="../_static/css/theme.css" type="text/css" /> + <link rel="stylesheet" href="../_static/css/theme.css" type="text/css" /> <link rel="stylesheet" href="../_static/pygments.css" type="text/css" /> <link rel="index" title="Index" href="../genindex.html" /> <link rel="search" title="Search" href="../search.html" /> @@ -50,16 +44,21 @@ } </style> + + + <script src="../_static/js/modernizr.min.js"></script> + </head> <body class="wy-body-for-nav"> <div class="wy-grid-for-nav"> + <nav data-toggle="wy-nav-shift" class="wy-nav-side"> <div class="wy-side-scroll"> - <div class="wy-side-nav-search" > + <div class="wy-side-nav-search"> @@ -104,7 +103,6 @@ <li class="toctree-l1 current"><a class="reference internal" href="index.html">Documentation</a><ul class="current"> <li class="toctree-l2"><a class="reference internal" href="installation.html">Installation</a></li> <li class="toctree-l2"><a class="reference internal" href="software_stack.html">Software 
Stack</a></li> -<li class="toctree-l2"><a class="reference internal" href="benchmark.html">Benchmark for Distributed training</a></li> <li class="toctree-l2"><a class="reference internal" href="device.html">Device</a></li> <li class="toctree-l2"><a class="reference internal" href="tensor.html">Tensor</a></li> <li class="toctree-l2 current"><a class="current reference internal" href="#">Layer</a><ul> @@ -263,33 +261,40 @@ layer using the engine attribute.</p> <em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Layer</code><span class="sig-paren">(</span><em>name</em>, <em>conf=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Layer" title="Permalink to this definition">¶</a></dt> <dd><p>Bases: <code class="xref py py-class docutils literal notranslate"><span class="pre">object</span></code></p> <p>Base Python layer class.</p> -<dl class="simple"> -<dt>Typically, the life cycle of a layer instance includes:</dt><dd><ol class="arabic simple"> -<li><p>construct layer without input_sample_shapes, goto 2; -construct layer with input_sample_shapes, goto 3;</p></li> -<li><p>call setup to create the parameters and setup other meta fields</p></li> -<li><p>call forward or access layer members</p></li> -<li><p>call backward and get parameters for update</p></li> +<dl class="docutils"> +<dt>Typically, the life cycle of a layer instance includes:</dt> +<dd><ol class="first last arabic simple"> +<li>construct layer without input_sample_shapes, goto 2; +construct layer with input_sample_shapes, goto 3;</li> +<li>call setup to create the parameters and setup other meta fields</li> +<li>call forward or access layer members</li> +<li>call backward and get parameters for update</li> </ol> </dd> </dl> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>name</strong> (<em>str</em>) – layer name</p> -</dd> -</dl> +<table class="docutils 
field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>name</strong> (<em>str</em>) – layer name</td> +</tr> +</tbody> +</table> <dl class="method"> <dt id="singa.layer.Layer.setup"> <code class="descname">setup</code><span class="sig-paren">(</span><em>in_shapes</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Layer.setup" title="Permalink to this definition">¶</a></dt> <dd><p>Call the C++ setup function to create params and set some meta data.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is a single tuple specifying the input Tensor shape; if the layer accepts multiple input Tensors (e.g., the concatenation layer), -in_shapes is a tuple of tuples, each for one input Tensor</p> -</dd> -</dl> +in_shapes is a tuple of tuples, each for one input Tensor</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> @@ -302,22 +307,28 @@ in_shapes is a tuple of tuples, each for <dt id="singa.layer.Layer.get_output_sample_shape"> <code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Layer.get_output_sample_shape" title="Permalink to this definition">¶</a></dt> <dd><p>Called after setup to get the shape of the output sample(s).</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a tuple for 
a single output Tensor or a list of tuples if this layer -has multiple outputs</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer +has multiple outputs</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt id="singa.layer.Layer.param_names"> <code class="descname">param_names</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Layer.param_names" title="Permalink to this definition">¶</a></dt> -<dd><dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a list of strings, one for the name of one parameter Tensor</p> -</dd> -</dl> +<dd><table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a list of strings, one for the name of one parameter Tensor</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> @@ -327,62 +338,76 @@ has multiple outputs</p> <p>Parameter tensors are not stored as layer members because cpp Tensor could be moved onto different devices due to the change of layer device, which would result in inconsistency.</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a list of tensors, one for each parameter</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a list of tensors, one for each parameter</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt 
id="singa.layer.Layer.forward"> <code class="descname">forward</code><span class="sig-paren">(</span><em>flag</em>, <em>x</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Layer.forward" title="Permalink to this definition">¶</a></dt> <dd><p>Forward propagate through this layer.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>flag</strong> – True (kTrain) for training; False (kEval) for evaluating; -other values for future use.</p></li> -<li><p><strong>x</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – an input tensor if the layer is +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple"> +<li><strong>flag</strong> – True (kTrain) for training; False (kEval) for evaluating; +other values for future use.</li> +<li><strong>x</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – an input tensor if the layer is connected from a single layer; a list of tensors if the layer -is connected from multiple layers.</p></li> +is connected from multiple layers.</li> </ul> -</dd> -<dt class="field-even">Returns</dt> -<dd class="field-even"><p>a tensor if the layer is connected to a single layer; a list of +</td> +</tr> +<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last">a tensor if the layer is connected to a single layer; a list of tensors if the layer is connected to multiple layers;</p> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt 
id="singa.layer.Layer.backward"> <code class="descname">backward</code><span class="sig-paren">(</span><em>flag</em>, <em>dy</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Layer.backward" title="Permalink to this definition">¶</a></dt> <dd><p>Backward propagate gradients through this layer.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>flag</strong> (<em>int</em>) – for future use.</p></li> -<li><p><strong>dy</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – the gradient tensor(s) of y w.r.t. the -objective loss</p></li> -</ul> -</dd> -<dt class="field-even">Returns</dt> -<dd class="field-even"><p><dx, <dp1, dp2..>>, dx is a (set of) tensor(s) for the gradient of x +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple"> +<li><strong>flag</strong> (<em>int</em>) – for future use.</li> +<li><strong>dy</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – the gradient tensor(s) of y w.r.t. the +objective loss</li> +</ul> +</td> +</tr> +<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last"><dx, <dp1, dp2..>>, dx is a (set of) tensor(s) for the gradient of x, dpi is the gradient of the i-th parameter</p> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt id="singa.layer.Layer.to_device"> <code class="descname">to_device</code><span class="sig-paren">(</span><em>device</em><span class="sig-paren">)</span><a class="headerlink" 
href="#singa.layer.Layer.to_device" title="Permalink to this definition">¶</a></dt> <dd><p>Move layer state tensors onto the given device.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>device</strong> – SWIG-converted device, created using singa.device</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>device</strong> – SWIG-converted device, created using singa.device</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> @@ -402,26 +427,32 @@ objective loss</p></li> <dt id="singa.layer.Dummy.get_output_sample_shape"> <code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Dummy.get_output_sample_shape" title="Permalink to this definition">¶</a></dt> <dd><p>Called after setup to get the shape of the output sample(s).</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer -has multiple outputs</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer +has multiple outputs</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt id="singa.layer.Dummy.setup"> <code class="descname">setup</code><span class="sig-paren">(</span><em>input_sample_shape</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Dummy.setup" title="Permalink to this definition">¶</a></dt> <dd><p>Call the C++ setup function to 
create params and set some meta data.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is a single tuple specifying the input Tensor shape; if the layer accepts multiple input Tensors (e.g., the concatenation layer), -in_shapes is a tuple of tuples, each for one input Tensor</p> -</dd> -</dl> +in_shapes is a tuple of tuples, each for one input Tensor</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> @@ -443,24 +474,26 @@ in_shapes is a tuple of tuples, each for <em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Conv2D</code><span class="sig-paren">(</span><em>name</em>, <em>nb_kernels</em>, <em>kernel=3</em>, <em>stride=1</em>, <em>border_mode='same'</em>, <em>cudnn_prefer='fastest'</em>, <em>workspace_byte_limit=1024</em>, <em>data_format='NCHW'</em>, <em>use_bias=True</em>, <em>W_specs=None</em>, <em>b_specs=None</em>, <em>pad=None</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Conv2D" title="Permalink to this definition">¶</a></dt> <dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p> <p>Construct a layer for 2D convolution.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>nb_kernels</strong> (<em>int</em>) – number of the channels (kernels) of the input 
Tensor</p></li> -<li><p><strong>kernel</strong> – an integer or a pair of integers for kernel height and width</p></li> -<li><p><strong>stride</strong> – an integer or a pair of integers for stride height and width</p></li> -<li><p><strong>border_mode</strong> (<em>string</em>) – padding mode, case-insensitive, +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple"> +<li><strong>nb_kernels</strong> (<em>int</em>) – number of the channels (kernels) of the input Tensor</li> +<li><strong>kernel</strong> – an integer or a pair of integers for kernel height and width</li> +<li><strong>stride</strong> – an integer or a pair of integers for stride height and width</li> +<li><strong>border_mode</strong> (<em>string</em>) – padding mode, case-insensitive, ‘valid’ -> padding is 0 for height and width ‘same’ -> padding is half of the kernel (floor), the kernel size must be -an odd number.</p></li> -<li><p><strong>cudnn_prefer</strong> (<em>string</em>) – the preferred algorithm for cudnn convolution which could be ‘fastest’, ‘autotune’, ‘limited_workspace’ and -‘no_workspace’</p></li> -<li><p><strong>workspace_byte_limit</strong> (<em>int</em>) – max workspace size in MB (default is 512MB)</p></li> -<li><p><strong>data_format</strong> (<em>string</em>) – either ‘NCHW’ or ‘NHWC’</p></li> -<li><p><strong>use_bias</strong> (<em>bool</em>) – True or False</p></li> -<li><p><strong>pad</strong> – an integer or a pair of integers for padding height and width</p></li> -<li><p><strong>W_specs</strong> (<em>dict</em>) – used to specify the weight matrix specs, fields include, ‘name’ for parameter name ‘lr_mult’ for learning rate multiplier @@ -469,25 +502,30 @@ include, ‘xavier’ and ‘’ ‘std’, ‘mean’, ‘high’, ‘low’ for corresponding init methods TODO(wangwei) ‘clamp’ for gradient constraint, value is scalar -‘regularizer’ for regularization, currently supports ‘l2’</p></li> -<li><p><strong>b_specs</strong> (<em>dict</em>) – hyper-parameters for bias vector, similar to W_specs</p></li> -<li><p><strong>name</strong> (<em>string</em>) – layer name.</p></li> -<li><p><strong>input_sample_shape</strong> – 3d tuple for the shape of the input Tensor without the batch size, e.g., (channel, height, width) or -(height, width, channel)</p></li> +an odd number.</li> +<li><strong>cudnn_prefer</strong> (<em>string</em>) – the preferred algorithm for cudnn convolution which could be ‘fastest’, ‘autotune’, ‘limited_workspace’ and +‘no_workspace’</li> +<li><strong>workspace_byte_limit</strong> (<em>int</em>) – max workspace size in MB 
(default is 512MB)</li> +<li><strong>data_format</strong> (<em>string</em>) – either ‘NCHW’ or ‘NHWC’</li> +<li><strong>use_bias</strong> (<em>bool</em>) – True or False</li> +<li><strong>pad</strong> – an integer or a pair of integers for padding height and width</li> +<li><strong>W_specs</strong> (<em>dict</em>) – used to specify the weight matrix specs, fields include, ‘name’ for parameter name ‘lr_mult’ for learning rate multiplier ‘std’, ‘mean’, ‘high’, ‘low’ for corresponding init methods TODO(wangwei) ‘clamp’ for gradient constraint, value is scalar +‘regularizer’ for regularization, currently supports ‘l2’</li> +<li><strong>b_specs</strong> (<em>dict</em>) – hyper-parameters for bias vector, similar to W_specs</li> +<li><strong>name</strong> (<em>string</em>) – layer name.</li> +<li><strong>input_sample_shape</strong> – 3d tuple for the shape of the input Tensor without the batch size, e.g., (channel, height, width) or +(height, width, channel)</li> </ul> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> <dl class="method"> <dt id="singa.layer.Conv2D.setup"> <code class="descname">setup</code><span class="sig-paren">(</span><em>in_shape</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Conv2D.setup" title="Permalink to this definition">¶</a></dt> <dd><p>Set up the kernel, stride and padding; then call the C++ setup function to create params and set some meta data.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>in_shapes</strong> (<em>tuple of int</em>) – the input sample shape</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>in_shapes</strong> (<em>tuple of int</em>) – the input sample shape</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -505,12 +543,15 @@ length</p> <dt id="singa.layer.Conv1D.get_output_sample_shape"> <code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Conv1D.get_output_sample_shape" title="Permalink to this definition">¶</a></dt> <dd><p>Called after setup to get the shape of the output sample(s).</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer -has multiple outputs</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer +has multiple outputs</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -521,22 +562,28 @@ has multiple outputs</p> <dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p> <p>2D pooling layer providing max/avg pooling.</p> <p>All args are the same as those for Conv2D, except the following one</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>mode</strong> – pooling type, model_pb2.PoolingConf.MAX or -model_pb2.PoolingConf.AVE</p> -</dd> -</dl> 
+<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>mode</strong> – pooling type, model_pb2.PoolingConf.MAX or +model_pb2.PoolingConf.AVE</td> +</tr> +</tbody> +</table> <dl class="method"> <dt id="singa.layer.Pooling2D.setup"> <code class="descname">setup</code><span class="sig-paren">(</span><em>in_shape</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Pooling2D.setup" title="Permalink to this definition">¶</a></dt> <dd><p>Set up the kernel, stride and padding; then call the C++ setup function to create params and set some meta data.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><p><strong>in_shapes</strong> (<em>tuple of int</em>) – the input sample shape</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>in_shapes</strong> (<em>tuple of int</em>) – the input sample shape</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -561,12 +608,15 @@ function to create params and set some m <dt id="singa.layer.MaxPooling1D.get_output_sample_shape"> <code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.MaxPooling1D.get_output_sample_shape" title="Permalink to this definition">¶</a></dt> <dd><p>Called after setup to get the shape of the output sample(s).</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer -has multiple outputs</p> -</dd> -</dl> +<table class="docutils 
field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer +has multiple outputs</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -579,12 +629,15 @@ has multiple outputs</p> <dt id="singa.layer.AvgPooling1D.get_output_sample_shape"> <code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.AvgPooling1D.get_output_sample_shape" title="Permalink to this definition">¶</a></dt> <dd><p>Called after setup to get the shape of the output sample(s).</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer -has multiple outputs</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer +has multiple outputs</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -594,11 +647,13 @@ has multiple outputs</p> <em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">BatchNormalization</code><span class="sig-paren">(</span><em>name</em>, <em>momentum=0.9</em>, <em>beta_specs=None</em>, <em>gamma_specs=None</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.BatchNormalization" title="Permalink to this definition">¶</a></dt> <dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span 
class="pre">singa.layer.Layer</span></code></a></p> <p>Batch-normalization.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>momentum</strong> (<em>float</em>) – for running average mean and variance.</p></li> -<li><p><strong>beta_specs</strong> (<em>dict</em>) – dictionary includes the fields for the beta +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple"> +<li><strong>momentum</strong> (<em>float</em>) – for running average mean and variance.</li> +<li><strong>beta_specs</strong> (<em>dict</em>) – dictionary includes the fields for the beta param: ‘name’ for parameter name ‘lr_mult’ for learning rate multiplier @@ -607,13 +662,15 @@ param: ‘xavier’ and ‘’ ‘std’, ‘mean’, ‘high’, ‘low’ for corresponding init methods ‘clamp’ for gradient constraint, value is scalar -‘regularizer’ for regularization, currently supports ‘l2’</p></li> -<li><p><strong>gamma_specs</strong> (<em>dict</em>) – similar to beta_specs, but for the gamma param.</p></li> -<li><p><strong>name</strong> (<em>string</em>) – layer name</p></li> -<li><p><strong>input_sample_shape</strong> (<em>tuple</em>) – with at least one integer</p></li> -</ul> -</dd> -</dl> +‘regularizer’ for regularization, currently supports ‘l2’</li> +<li><strong>gamma_specs</strong> (<em>dict</em>) – similar to beta_specs, but for the gamma param.</li> +<li><strong>name</strong> (<em>string</em>) – layer name</li> +<li><strong>input_sample_shape</strong> (<em>tuple</em>) – with at least one integer</li> +</ul> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="class"> @@ -625,52 +682,63 @@ param: <dt id="singa.layer.L2Norm.get_output_sample_shape"> <code class="descname">get_output_sample_shape</code><span 
class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.L2Norm.get_output_sample_shape" title="Permalink to this definition">¶</a></dt> <dd><p>Called after setup to get the shape of the output sample(s).</p> -<dl class="field-list simple"> -<dt class="field-odd">Returns</dt> -<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer -has multiple outputs</p> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer +has multiple outputs</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt id="singa.layer.L2Norm.forward"> <code class="descname">forward</code><span class="sig-paren">(</span><em>is_train</em>, <em>x</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.L2Norm.forward" title="Permalink to this definition">¶</a></dt> <dd><p>Forward propagate through this layer.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>flag</strong> – True (kTrain) for training; False (kEval) for evaluating; -other values for future use.</p></li> -<li><p><strong>x</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – an input tensor if the layer is +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple"> +<li><strong>flag</strong> – True (kTrain) for training; False (kEval) for evaluating; +other values for future use.</li> 
+<li><strong>x</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – an input tensor if the layer is connected from a single layer; a list of tensors if the layer -is connected from multiple layers.</p></li> </ul> -</dd> -<dt class="field-even">Returns</dt> -<dd class="field-even"><p>a tensor if the layer is connected to a single layer; a list of +</td> +</tr> +<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last">a tensor if the layer is connected to a single layer; a list of tensors if the layer is connected to multiple layers;</p> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt id="singa.layer.L2Norm.backward"> <code class="descname">backward</code><span class="sig-paren">(</span><em>is_train</em>, <em>dy</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.L2Norm.backward" title="Permalink to this definition">¶</a></dt> <dd><p>Backward propagate gradients through this layer.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>flag</strong> (<em>int</em>) – for future use.</p></li> -<li><p><strong>dy</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – the gradient tensor(s) of y w.r.t. the -objective loss</p></li> -</ul> -</dd> -<dt class="field-even">Returns</dt> -<dd class="field-even"><p><dx, <dp1, dp2..>>, dx is a (set of) tensor(s) for the gradient of x +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple"> 
+<li><strong>flag</strong> (<em>int</em>) – for future use.</li> +<li><strong>dy</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a><em> or </em><em>list<Tensor></em>) – the gradient tensor(s) of y w.r.t. the +objective loss</li> +</ul> +</td> +</tr> +<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last"><dx, <dp1, dp2..>>, dx is a (set of) tensor(s) for the gradient of x, dpi is the gradient of the i-th parameter</p> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -680,16 +748,20 @@ objective loss</p></li> <em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">LRN</code><span class="sig-paren">(</span><em>name</em>, <em>size=5</em>, <em>alpha=1</em>, <em>beta=0.75</em>, <em>mode='cross_channel'</em>, <em>k=1</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.LRN" title="Permalink to this definition">¶</a></dt> <dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p> <p>Local response normalization.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>size</strong> (<em>int</em>) – # of channels to be crossed for -normalization.</p></li> -<li><p><strong>mode</strong> (<em>string</em>) – ‘cross_channel’</p></li> -<li><p><strong>input_sample_shape</strong> (<em>tuple</em>) – 3d tuple, (channel, height, width)</p></li> -</ul> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first 
last simple"> +<li><strong>size</strong> (<em>int</em>) â # of channels to be crossed +normalization.</li> +<li><strong>mode</strong> (<em>string</em>) â âcross_channelâ</li> +<li><strong>input_sample_shape</strong> (<em>tuple</em>) â 3d tuple, (channel, height, width)</li> +</ul> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="class"> @@ -698,12 +770,14 @@ normalization.</p></li> <dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p> <p>Apply linear/affine transformation, also called inner-product or fully connected layer.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>num_output</strong> (<em>int</em>) â output feature length.</p></li> -<li><p><strong>use_bias</strong> (<em>bool</em>) â add a bias vector or not to the transformed feature</p></li> -<li><p><strong>W_specs</strong> (<em>dict</em>) â specs for the weight matrix +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple"> +<li><strong>num_output</strong> (<em>int</em>) â output feature length.</li> +<li><strong>use_bias</strong> (<em>bool</em>) â add a bias vector or not to the transformed feature</li> +<li><strong>W_specs</strong> (<em>dict</em>) â specs for the weight matrix ânameâ for parameter name âlr_multâ for learning rate multiplier âdecay_multâ for weight decay multiplier @@ -711,13 +785,15 @@ fully connected layer.</p> âxavierâ and ââ âstdâ, âmeanâ, âhighâ, âlowâ for corresponding init methods âclampâ for gradient constraint, value is scalar -âregularizerâ for regularization, currently support âl2â</p></li> -<li><p><strong>b_specs</strong> 
(<em>dict</em>) â specs for the bias vector, same fields as W_specs.</p></li> -<li><p><strong>W_transpose</strong> (<em>bool</em>) â if true, output=x*W.T+b;</p></li> -<li><p><strong>input_sample_shape</strong> (<em>tuple</em>) â input feature length</p></li> -</ul> -</dd> -</dl> +âregularizerâ for regularization, currently support âl2â</li> +<li><strong>b_specs</strong> (<em>dict</em>) â specs for the bias vector, same fields as W_specs.</li> +<li><strong>W_transpose</strong> (<em>bool</em>) â if true, output=x*W.T+b;</li> +<li><strong>input_sample_shape</strong> (<em>tuple</em>) â input feature length</li> +</ul> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="class"> @@ -725,14 +801,18 @@ fully connected layer.</p> <em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Dropout</code><span class="sig-paren">(</span><em>name</em>, <em>p=0.5</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Dropout" title="Permalink to this definition">¶</a></dt> <dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p> <p>Droput layer.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>p</strong> (<em>float</em>) â probability for dropping out the element, i.e., set to 0</p></li> -<li><p><strong>name</strong> (<em>string</em>) â layer name</p></li> -</ul> -</dd> -</dl> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple"> +<li><strong>p</strong> (<em>float</em>) â probability for dropping out the element, i.e., set to 0</li> 
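The dropout behaviour documented above can be sketched in a few lines of plain Python. This is a hypothetical standalone illustration of the common inverted-dropout convention (zero an element with probability p, scale survivors by 1/(1-p) so the expected activation is unchanged), not SINGA's actual Tensor-based implementation:

```python
import random

def dropout_forward(x, p, rng=None):
    """Inverted-dropout sketch: zero each element with probability p,
    scale the survivors by 1/(1-p)."""
    rng = rng or random.Random(0)
    mask = [0.0 if rng.random() < p else 1.0 / (1.0 - p) for _ in x]
    y = [xi * mi for xi, mi in zip(x, mask)]
    return y, mask

def dropout_backward(dy, mask):
    # Gradient flows only through the surviving elements, with the same scale.
    return [di * mi for di, mi in zip(dy, mask)]

y, mask = dropout_forward([1.0, 2.0, 3.0, 4.0], p=0.5)
dx = dropout_backward([1.0, 1.0, 1.0, 1.0], mask)
```

At evaluation time (flag=kEval) the layer is the identity, which is why the scaling is applied during training only.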
+<li><strong>name</strong> (<em>string</em>) – layer name</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="class">
@@ -740,15 +820,19 @@ fully connected layer.</p>
<em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Activation</code><span class="sig-paren">(</span><em>name</em>, <em>mode='relu'</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Activation" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Activation layers.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>name</strong> (<em>string</em>) – layer name</p></li>
-<li><p><strong>mode</strong> (<em>string</em>) – ‘relu’, ‘sigmoid’, or ‘tanh’</p></li>
-<li><p><strong>input_sample_shape</strong> (<em>tuple</em>) – shape of a single sample</p></li>
-</ul>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>name</strong> (<em>string</em>) – layer name</li>
+<li><strong>mode</strong> (<em>string</em>) – ‘relu’, ‘sigmoid’, or ‘tanh’</li>
+<li><strong>input_sample_shape</strong> (<em>tuple</em>) – shape of a single sample</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="class">
@@ -756,15 +840,19 @@ fully connected layer.</p>
<em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Softmax</code><span class="sig-paren">(</span><em>name</em>, <em>axis=1</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Softmax" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Apply softmax.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>axis</strong> (<em>int</em>) – reshape the input into a matrix, with dimensions
-[0, axis) as the rows and [axis, -1) as the columns.</p></li>
-<li><p><strong>input_sample_shape</strong> (<em>tuple</em>) – shape of a single sample</p></li>
-</ul>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>axis</strong> (<em>int</em>) – reshape the input into a matrix, with dimensions
+[0, axis) as the rows and [axis, -1) as the columns.</li>
+<li><strong>input_sample_shape</strong> (<em>tuple</em>) – shape of a single sample</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="class">
@@ -772,15 +860,19 @@ fully connected layer.</p>
<em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Flatten</code><span class="sig-paren">(</span><em>name</em>, <em>axis=1</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Flatten" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Reshape the input tensor into a matrix.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>axis</strong> (<em>int</em>) – reshape the input into a matrix, with dimensions
-[0, axis) as the rows and [axis, -1) as the columns.</p></li>
-<li><p><strong>input_sample_shape</strong> (<em>tuple</em>) – shape for a single sample</p></li>
-</ul>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>axis</strong> (<em>int</em>) – reshape the input into a matrix, with dimensions
+[0, axis) as the rows and [axis, -1) as the columns.</li>
+<li><strong>input_sample_shape</strong> (<em>tuple</em>) – shape for a single sample</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="class">
@@ -788,36 +880,45 @@ fully connected layer.</p>
<em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Merge</code><span class="sig-paren">(</span><em>name</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Merge" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Sum all input tensors.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><p><strong>input_sample_shape</strong> – sample shape of the input.
The sample shape of all
-inputs should be the same.</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>input_sample_shape</strong> – sample shape of the input. The sample shape of all
+inputs should be the same.</td>
+</tr>
+</tbody>
+</table>
<dl class="method">
<dt id="singa.layer.Merge.setup">
<code class="descname">setup</code><span class="sig-paren">(</span><em>in_shape</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Merge.setup" title="Permalink to this definition">¶</a></dt>
<dd><p>Call the C++ setup function to create params and set some meta data.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><p><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is
a single tuple specifying the input Tensor shape; if the layer
accepts multiple input Tensors (e.g., the concatenation layer),
-in_shapes is a tuple of tuples, each for one input Tensor</p>
-</dd>
-</dl>
+in_shapes is a tuple of tuples, each for one input Tensor</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Merge.get_output_sample_shape">
<code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Merge.get_output_sample_shape" title="Permalink to this definition">¶</a></dt>
<dd><p>Called after setup to get the shape of the output sample(s).</p>
-<dl class="field-list simple">
-<dt class="field-odd">Returns</dt>
-<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer
-has multiple outputs</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer
+has multiple outputs</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
@@ -828,25 +929,30 @@ has multiple outputs</p>
:param flag: not used.
:param inputs: a list of tensors
:type inputs: list</p>
-<dl class="field-list simple">
-<dt class="field-odd">Returns</dt>
-<dd class="field-odd"><p>A single tensor as the sum of all input tensors</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">A single tensor as the sum of all input tensors</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Merge.backward">
<code class="descname">backward</code><span class="sig-paren">(</span><em>flag</em>, <em>grad</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Merge.backward" title="Permalink to this definition">¶</a></dt>
<dd><p>Replicate the grad for each input source layer.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><p><strong>grad</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – </p>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p>A list of replicated grad, one per source layer</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>grad</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – </td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body">A list of replicated grad, one per source layer</td>
+</tr>
+</tbody>
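Merge and its mirror image Split have simple semantics that can be sketched in plain Python (a hypothetical illustration on flat lists, not SINGA's Tensor-based implementation): Merge sums its inputs element-wise and replicates the incoming gradient to every source layer, while Split replicates its input and sums the incoming gradients.

```python
def merge_forward(inputs):
    """Merge: element-wise sum of all input tensors (here: flat lists)."""
    return [sum(vals) for vals in zip(*inputs)]

def merge_backward(grad, num_inputs):
    """Merge: the gradient of a sum is replicated to every source layer."""
    return [list(grad) for _ in range(num_inputs)]

def split_forward(x, num_output):
    """Split: replicate the input into num_output copies."""
    return [list(x) for _ in range(num_output)]

def split_backward(grads):
    """Split: sum all incoming gradient tensors into one."""
    return [sum(vals) for vals in zip(*grads)]

y = merge_forward([[1.0, 2.0], [3.0, 4.0]])    # element-wise sum of two inputs
dxs = merge_backward([1.0, 1.0], num_inputs=2) # one gradient copy per source
```

Note the symmetry: Split's backward is Merge's forward and vice versa, which is why the two layers are often used together to fan a tensor out and back in.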
+</table>
</dd></dl>
</dd></dl>
@@ -856,70 +962,86 @@ has multiple outputs</p>
<em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">Split</code><span class="sig-paren">(</span><em>name</em>, <em>num_output</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Split" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Replicate the input tensor.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>num_output</strong> (<em>int</em>) – number of output tensors to generate.</p></li>
-<li><p><strong>input_sample_shape</strong> – includes a single integer for the input sample
-feature size.</p></li>
-</ul>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>num_output</strong> (<em>int</em>) – number of output tensors to generate.</li>
+<li><strong>input_sample_shape</strong> – includes a single integer for the input sample
+feature size.</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
<dl class="method">
<dt id="singa.layer.Split.setup">
<code class="descname">setup</code><span class="sig-paren">(</span><em>in_shape</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Split.setup" title="Permalink to this definition">¶</a></dt>
<dd><p>Call the C++ setup function to create params and set some meta data.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><p><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>in_shapes</strong> – if the layer accepts a single input Tensor, in_shapes is
a single tuple specifying the input Tensor shape; if the layer
accepts multiple input Tensors (e.g., the concatenation layer),
-in_shapes is a tuple of tuples, each for one input Tensor</p>
-</dd>
-</dl>
+in_shapes is a tuple of tuples, each for one input Tensor</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Split.get_output_sample_shape">
<code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Split.get_output_sample_shape" title="Permalink to this definition">¶</a></dt>
<dd><p>Called after setup to get the shape of the output sample(s).</p>
-<dl class="field-list simple">
-<dt class="field-odd">Returns</dt>
-<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer
-has multiple outputs</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer
+has multiple outputs</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Split.forward">
<code class="descname">forward</code><span class="sig-paren">(</span><em>flag</em>, <em>input</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Split.forward" title="Permalink to this definition">¶</a></dt>
<dd><p>Replicate the input tensor into multiple tensors.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>flag</strong> – not used</p></li>
-<li><p><strong>input</strong> – a single input tensor</p></li>
-</ul>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p>a list of output tensors (each one is a copy of the input)</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<li><strong>flag</strong> – not used</li>
+<li><strong>input</strong> – a single input tensor</li>
+</ul>
+</td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last">a list of output tensors (each one is a copy of the input)</p>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Split.backward">
<code class="descname">backward</code><span class="sig-paren">(</span><em>flag</em>, <em>grads</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Split.backward" title="Permalink to this definition">¶</a></dt>
<dd><p>Sum all grad tensors to generate a single output tensor.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><p><strong>grads</strong> (<em>list of Tensor</em>) – </p>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p>a single tensor as the sum of all grads</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>grads</strong> (<em>list of Tensor</em>) – </td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body">a single tensor as the sum of all grads</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
</dd></dl>
@@ -930,50 +1052,63 @@ has multiple outputs</p>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Concatenate tensors vertically (axis = 0) or horizontally (axis = 1).</p>
<p>Currently, only tensors with 2 dimensions are supported.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>axis</strong> (<em>int</em>) – 0 for concatenating rows; 1 for concatenating columns;</p></li>
-<li><p><strong>input_sample_shapes</strong> – a list of sample shape tuples, one per input tensor</p></li>
-</ul>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>axis</strong> (<em>int</em>) – 0 for concatenating rows; 1 for concatenating columns;</li>
+<li><strong>input_sample_shapes</strong> – a list of sample shape tuples, one per input tensor</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
<dl class="method">
<dt id="singa.layer.Concat.forward">
<code class="descname">forward</code><span class="sig-paren">(</span><em>flag</em>, <em>inputs</em><span
class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Concat.forward" title="Permalink to this definition">¶</a></dt>
<dd><p>Concatenate all input tensors.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>flag</strong> – same as Layer::forward()</p></li>
-<li><p><strong>input</strong> – a list of tensors</p></li>
-</ul>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p>a single concatenated tensor</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<li><strong>flag</strong> – same as Layer::forward()</li>
+<li><strong>input</strong> – a list of tensors</li>
+</ul>
+</td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last">a single concatenated tensor</p>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Concat.backward">
<code class="descname">backward</code><span class="sig-paren">(</span><em>flag</em>, <em>dy</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Concat.backward" title="Permalink to this definition">¶</a></dt>
<dd><p>Backward propagate gradients through this layer.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>flag</strong> – same as Layer::backward()</p></li>
-<li><p><strong>dy</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – the gradient tensors of y w.r.t. the objective loss</p></li>
-</ul>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p><dl class="simple">
-<dt><dx, []>, dx is a list of tensors for the gradient of the inputs; []</dt><dd><p>is an empty list.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<li><strong>flag</strong> – same as Layer::backward()</li>
+<li><strong>dy</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – the gradient tensors of y w.r.t. the objective loss</li>
+</ul>
+</td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last"><dl class="docutils">
+<dt><dx, []>, dx is a list of tensors for the gradient of the inputs; []</dt>
+<dd><p class="first last">is an empty list.</p>
</dd>
</dl>
</p>
-</dd>
-</dl>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
</dd></dl>
@@ -984,64 +1119,80 @@ has multiple outputs</p>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Slice the input tensor into multiple sub-tensors vertically (axis=0)
or horizontally (axis=1).</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>axis</strong> (<em>int</em>) – 0 for slicing rows; 1 for slicing columns;</p></li>
-<li><p><strong>slice_point</strong> (<em>list</em>) – positions along the axis at which to slice; there are n-1
-points for n sub-tensors;</p></li>
-<li><p><strong>input_sample_shape</strong> – input tensor sample shape</p></li>
-</ul>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul
class="first last simple">
+<li><strong>axis</strong> (<em>int</em>) – 0 for slicing rows; 1 for slicing columns;</li>
+<li><strong>slice_point</strong> (<em>list</em>) – positions along the axis at which to slice; there are n-1
+points for n sub-tensors;</li>
+<li><strong>input_sample_shape</strong> – input tensor sample shape</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
<dl class="method">
<dt id="singa.layer.Slice.get_output_sample_shape">
<code class="descname">get_output_sample_shape</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Slice.get_output_sample_shape" title="Permalink to this definition">¶</a></dt>
<dd><p>Called after setup to get the shape of the output sample(s).</p>
-<dl class="field-list simple">
-<dt class="field-odd">Returns</dt>
-<dd class="field-odd"><p>a tuple for a single output Tensor or a list of tuples if this layer
-has multiple outputs</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">a tuple for a single output Tensor or a list of tuples if this layer
+has multiple outputs</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Slice.forward">
<code class="descname">forward</code><span class="sig-paren">(</span><em>flag</em>, <em>x</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Slice.forward" title="Permalink to this definition">¶</a></dt>
<dd><p>Slice the input tensor on the given axis.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>flag</strong> – same as Layer::forward()</p></li>
-<li><p><strong>x</strong> – a single input tensor</p></li>
-</ul>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p>a list of output tensors</p>
-</dd>
-</dl>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<li><strong>flag</strong> – same as Layer::forward()</li>
+<li><strong>x</strong> – a single input tensor</li>
+</ul>
+</td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last">a list of output tensors</p>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
<dl class="method">
<dt id="singa.layer.Slice.backward">
<code class="descname">backward</code><span class="sig-paren">(</span><em>flag</em>, <em>grads</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.Slice.backward" title="Permalink to this definition">¶</a></dt>
<dd><p>Concatenate all grad tensors to generate a single output tensor.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>flag</strong> – same as Layer::backward()</p></li>
-<li><p><strong>grads</strong> – a list of tensors, one for the gradient of one sliced tensor</p></li>
-</ul>
-</dd>
-<dt class="field-even">Returns</dt>
-<dd class="field-even"><p><dl class="simple">
-<dt>a single tensor for the gradient of the original input, and an empty</dt><dd><p>list.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<li><strong>flag</strong> – same as Layer::backward()</li>
+<li><strong>grads</strong> – a list of tensors, one for the gradient of one sliced tensor</li>
+</ul>
+</td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last"><dl class="docutils">
+<dt>a single tensor for the gradient of the original input, and an empty</dt>
+<dd><p class="first last">list.</p>
</dd>
</dl>
</p>
-</dd>
-</dl>
+</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
</dd></dl>
@@ -1051,85 +1202,99 @@ has multiple outputs</p>
<em class="property">class </em><code class="descclassname">singa.layer.</code><code class="descname">RNN</code><span class="sig-paren">(</span><em>name</em>, <em>hidden_size</em>, <em>rnn_mode='lstm'</em>, <em>dropout=0.0</em>, <em>num_stacks=1</em>, <em>input_mode='linear'</em>, <em>bidirectional=False</em>, <em>param_specs=None</em>, <em>input_sample_shape=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.RNN" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.layer.Layer" title="singa.layer.Layer"><code class="xref py py-class docutils literal notranslate"><span class="pre">singa.layer.Layer</span></code></a></p>
<p>Recurrent layer with 4 types of units, namely lstm, gru, tanh and relu.</p>
-<dl class="field-list simple">
-<dt class="field-odd">Parameters</dt>
-<dd class="field-odd"><ul class="simple">
-<li><p><strong>hidden_size</strong> – hidden feature size, the same for all stacks of layers.</p></li>
-<li><p><strong>rnn_mode</strong> – decides the rnn unit, which could be one of ‘lstm’, ‘gru’,
-‘tanh’ and ‘relu’; refer to the cudnn manual for each mode.</p></li>
-<li><p><strong>num_stacks</strong> – number of stacks of rnn layers.
It is different to the -unrolling seqence length.</p></li> -<li><p><strong>input_mode</strong> â âlinearâ convert the input feature x by by a linear +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple"> +<li><strong>hidden_size</strong> â hidden feature size, the same for all stacks of layers.</li> +<li><strong>rnn_mode</strong> â decides the rnn unit, which could be one of âlstmâ, âgruâ, +âtanhâ and âreluâ, refer to cudnn manual for each mode.</li> +<li><strong>num_stacks</strong> â num of stacks of rnn layers. It is different to the +unrolling seqence length.</li> +<li><strong>input_mode</strong> â âlinearâ convert the input feature x by by a linear transformation to get a feature vector of size hidden_size; âskipâ does nothing but requires the input feature size equals -hidden_size</p></li> -<li><p><strong>bidirection</strong> â True for bidirectional RNN</p></li> -<li><p><strong>param_specs</strong> â config for initializing the RNN parameters.</p></li> -<li><p><strong>input_sample_shape</strong> â includes a single integer for the input sample -feature size.</p></li> -</ul> -</dd> -</dl> +hidden_size</li> +<li><strong>bidirection</strong> â True for bidirectional RNN</li> +<li><strong>param_specs</strong> â config for initializing the RNN parameters.</li> +<li><strong>input_sample_shape</strong> â includes a single integer for the input sample +feature size.</li> +</ul> +</td> +</tr> +</tbody> +</table> <dl class="method"> <dt id="singa.layer.RNN.forward"> <code class="descname">forward</code><span class="sig-paren">(</span><em>flag</em>, <em>inputs</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.RNN.forward" title="Permalink to this definition">¶</a></dt> <dd><p>Forward inputs through the RNN.</p> -<dl class="field-list 
simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>flag</strong> – True(kTrain) for training; False(kEval) for evaluation; -others values for future use.</p></li> -<li><p><strong><x1</strong><strong>, </strong><strong>x2</strong><strong>,</strong><strong>..xn</strong><strong>, </strong><strong>hx</strong><strong>, </strong><strong>cx></strong><strong>, </strong><strong>where xi is the input tensor for the</strong> (<em>inputs</em><em>,</em>) – i-th position, its shape is (batch_size, input_feature_length); +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple"> +<li><strong>flag</strong> – True(kTrain) for training; False(kEval) for evaluation; +other values for future use.</li> +<li><strong><x1</strong><strong>, </strong><strong>x2</strong><strong>,</strong><strong>..xn</strong><strong>, </strong><strong>hx</strong><strong>, </strong><strong>cx></strong><strong>, </strong><strong>where xi is the input tensor for the</strong> (<em>inputs</em><em>,</em>) – i-th position, its shape is (batch_size, input_feature_length); the batch_size of xi must >= that of xi+1; hx is the initial hidden state of shape (num_stacks * bidirection?2:1, batch_size, hidden_size). cx is the initial cell state tensor of the same shape as hx. cx is valid only for lstm. For other RNNs there is no cx. 
Both hx and cx could be dummy tensors without shape and -data.</p></li> +data.</li> </ul> -</dd> -<dt class="field-even">Returns</dt> -<dd class="field-even"><p><dl class="simple"> -<dt><y1, y2, … yn, hy, cy>, where yi is the output tensor for the i-th</dt><dd><p>position, its shape is (batch_size, +</td> +</tr> +<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last"><dl class="docutils"> +<dt><y1, y2, … yn, hy, cy>, where yi is the output tensor for the i-th</dt> +<dd><p class="first last">position, its shape is (batch_size, hidden_size * bidirection?2:1). hy is the final hidden state tensor. cy is the final cell state tensor. cy is only used for lstm.</p> </dd> </dl> </p> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> </dd></dl> <dl class="method"> <dt id="singa.layer.RNN.backward"> <code class="descname">backward</code><span class="sig-paren">(</span><em>flag</em>, <em>grad</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.layer.RNN.backward" title="Permalink to this definition">¶</a></dt> <dd><p>Backward gradients through the RNN.</p> -<dl class="field-list simple"> -<dt class="field-odd">Parameters</dt> -<dd class="field-odd"><ul class="simple"> -<li><p><strong>for future use.</strong> (<em>flag</em><em>,</em>) – </p></li> -<li><p><strong><dy1</strong><strong>, </strong><strong>dy2</strong><strong>,</strong><strong>..dyn</strong><strong>, </strong><strong>dhy</strong><strong>, </strong><strong>dcy></strong><strong>, </strong><strong>where dyi is the gradient for the</strong> (<em>grad</em><em>,</em>) – </p></li> -<li><p><strong>output</strong><strong>, </strong><strong>its shape is</strong><strong> (</strong><strong>batch_size</strong><strong>, </strong><strong>hidden_size*bidirection?2</strong> (<em>i-th</em>) – 1); dhy is the gradient for the final hidden state, its shape is (num_stacks * bidirection?2:1, batch_size, hidden_size). dcy is the gradient for the final cell state. cx is valid only for lstm. For other RNNs there is no cx. Both dhy and dcy could be dummy tensors without shape and -data.</p></li> +<table class="docutils field-list" frame="void" rules="none"> +<col class="field-name" /> +<col class="field-body" /> +<tbody valign="top"> +<tr class="field-odd 
field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple"> +<li><strong>for future use.</strong> (<em>flag</em><em>,</em>) – </li> +<li><strong><dy1</strong><strong>, </strong><strong>dy2</strong><strong>,</strong><strong>..dyn</strong><strong>, </strong><strong>dhy</strong><strong>, </strong><strong>dcy></strong><strong>, </strong><strong>where dyi is the gradient for the</strong> (<em>grad</em><em>,</em>) – </li> +<li><strong>output</strong><strong>, </strong><strong>its shape is</strong><strong> (</strong><strong>batch_size</strong><strong>, </strong><strong>hidden_size*bidirection?2</strong> (<em>i-th</em>) – 1); dhy is the gradient for the final hidden state, its shape is (num_stacks * bidirection?2:1, batch_size, hidden_size). dcy is the gradient for the final cell state. dcy is valid only for lstm. For other RNNs there is no dcy. Both dhy and dcy could be dummy tensors without shape and -data.</li> +</ul> -</dd> -<dt class="field-even">Returns</dt> -<dd class="field-even"><p><dl class="simple"> -<dt><dx1, dx2, … dxn, dhx, dcx>, where dxi is the gradient tensor for</dt><dd><p>the i-th input, its shape is (batch_size, +</td> +</tr> +<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last"><dl class="docutils"> +<dt><dx1, dx2, … dxn, dhx, dcx>, where dxi is the gradient tensor for</dt> +<dd><p class="first last">the i-th input, its shape is (batch_size, input_feature_length). dhx is the gradient for the initial hidden state. 
dcx is the gradient for the initial cell state, which is valid only for lstm.</p> </dd> </dl> </p> -</dd> -</dl> +</td> +</tr> +</tbody> +</table> </dd></dl> </dd></dl> @@ -1170,7 +1335,7 @@ supported layers</p> <a href="net.html" class="btn btn-neutral float-right" title="FeedForward Net" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right"></span></a> - <a href="tensor.html" class="btn btn-neutral float-left" title="Tensor" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a> + <a href="tensor.html" class="btn btn-neutral" title="Tensor" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a> </div> @@ -1179,7 +1344,7 @@ supported layers</p> <div role="contentinfo"> <p> - © Copyright 2019 The Apache Software Foundation. All rights reserved. Apache SINGA, Apache, the Apache feather logo, and the Apache SINGA project logos are trademarks of The Apache Software Foundation. All other marks mentioned may be trademarks or registered trademarks of their respective owners. + © Copyright 2019 The Apache Software Foundation. All rights reserved. Apache SINGA, Apache, the Apache feather logo, and the Apache SINGA project logos are trademarks of The Apache Software Foundation. All other marks mentioned may be trademarks or registered trademarks of their respective owners. 
</p> </div> @@ -1196,17 +1361,36 @@ supported layers</p> + + + <script type="text/javascript"> + var DOCUMENTATION_OPTIONS = { + URL_ROOT:'../', + VERSION:'1.1.0', + LANGUAGE:'None', + COLLAPSE_INDEX:false, + FILE_SUFFIX:'.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt' + }; + </script> + <script type="text/javascript" src="../_static/jquery.js"></script> + <script type="text/javascript" src="../_static/underscore.js"></script> + <script type="text/javascript" src="../_static/doctools.js"></script> + + + + + + <script type="text/javascript" src="../_static/js/theme.js"></script> + + <script type="text/javascript"> jQuery(function () { SphinxRtdTheme.Navigation.enable(true); }); </script> - - - - - <div class="rst-versions" data-toggle="rst-versions" role="note" aria-label="versions"> <span class="rst-current-version" data-toggle="rst-current-version"> <span class="fa fa-book"> incubator-singa </span>
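Editor's note on the RNN docstrings patched above: the `bidirection?2:1` shorthand for the shapes of `hx`/`cx` (and their gradients `dhx`/`dcx`) and of each output `yi` can be spelled out in plain Python. The helper names below are hypothetical, chosen for illustration only; they are not part of the SINGA API.

```python
def rnn_state_shape(num_stacks, batch_size, hidden_size, bidirectional=False):
    """Shape of hx/cx (and dhx/dcx) per the singa.layer.RNN docstrings:
    (num_stacks * (2 if bidirectional else 1), batch_size, hidden_size)."""
    num_directions = 2 if bidirectional else 1  # the "bidirection?2:1" factor
    return (num_stacks * num_directions, batch_size, hidden_size)


def rnn_output_shape(batch_size, hidden_size, bidirectional=False):
    """Shape of each yi returned by RNN.forward:
    (batch_size, hidden_size * (2 if bidirectional else 1))."""
    num_directions = 2 if bidirectional else 1
    return (batch_size, hidden_size * num_directions)
```

For example, a 2-stack bidirectional RNN with batch size 8 and hidden size 16 expects an initial state of shape `(4, 8, 16)` and emits outputs of shape `(8, 32)` per position.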
