[ https://issues.apache.org/jira/browse/SINGA-180?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15313795#comment-15313795 ]

ASF subversion and git services commented on SINGA-180:
-------------------------------------------------------

Commit 3e2507b7af8c4fe3746f3156f29eba99a30e546f in incubator-singa's branch 
refs/heads/dev from jixin
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=3e2507b ]

SINGA-180 Add Activation layer and Softmax layer

Add CPU and cuDNN implementations for the activation and softmax layers.

Note: the activation layer currently supports the sigmoid and tanh functions, and 
forward computation for relu.

Remove the tensor softmax function. Instead, use the element-wise tensor op (*) and 
the Sum function to implement softmax.
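
A standalone sketch of that composition (plain C++ over a flat buffer, not SINGA's 
Tensor API): row-wise softmax reduces to an element-wise exp, a per-row Sum, and a 
division of each row by its sum. The max-subtraction below is a standard 
numeric-stability step, not something stated in the commit.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Row-wise softmax over a rows x cols matrix stored row-major.
    std::vector<float> Softmax(const std::vector<float>& x, int rows, int cols) {
      std::vector<float> y(x.size());
      for (int r = 0; r < rows; ++r) {
        const float* row = &x[r * cols];
        // Shift by the row max so exp() cannot overflow.
        float mx = *std::max_element(row, row + cols);
        float sum = 0.0f;  // per-row Sum of the exponentials
        for (int c = 0; c < cols; ++c) {
          y[r * cols + c] = std::exp(row[c] - mx);  // element-wise exp
          sum += y[r * cols + c];
        }
        for (int c = 0; c < cols; ++c)
          y[r * cols + c] /= sum;  // divide each row by its sum
      }
      return y;
    }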

Add test files for the activation and softmax layers.

Add element-wise implementations for the activation functions (relu/tanh/sigmoid).
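
A minimal sketch of such element-wise implementations, assuming a flat float buffer; 
the helper names (Relu, Sigmoid, Tanh, Map) are illustrative, not SINGA's actual 
identifiers.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    inline float Relu(float v)    { return v > 0.0f ? v : 0.0f; }
    inline float Sigmoid(float v) { return 1.0f / (1.0f + std::exp(-v)); }
    inline float Tanh(float v)    { return std::tanh(v); }

    // Apply one scalar function to every element, e.g. Map(x, Relu).
    std::vector<float> Map(const std::vector<float>& x, float (*f)(float)) {
      std::vector<float> y(x.size());
      std::transform(x.begin(), x.end(), y.begin(), f);
      return y;
    }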

Add tensor scalar comparison functions (<, <=, >, >=), i.e., to compare a tensor 
with a constant.
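
For illustration, such a comparison usually produces a 0/1 mask of the same shape; 
a sketch under that assumption, with a hypothetical function name:

    #include <vector>

    // Compare every element with a constant t; 1.0f where x[i] > t, else 0.0f.
    std::vector<float> GreaterThan(const std::vector<float>& x, float t) {
      std::vector<float> mask(x.size());
      for (size_t i = 0; i < x.size(); ++i)
        mask[i] = x[i] > t ? 1.0f : 0.0f;
      return mask;
    }

Masks like this let, for example, a relu gradient be written as dy * (x > 0), which 
is presumably one use the new operators are meant for.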

Add implementations for the tensor math functions (exp, log, pow).
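
These are again simple element-wise maps; a sketch of two of them (reading pow as 
taking a scalar exponent is an assumption, since it could also be defined between 
two tensors):

    #include <cmath>
    #include <vector>

    // Element-wise natural logarithm.
    std::vector<float> Log(const std::vector<float>& x) {
      std::vector<float> y(x.size());
      for (size_t i = 0; i < x.size(); ++i) y[i] = std::log(x[i]);
      return y;
    }

    // Element-wise power with a scalar exponent.
    std::vector<float> Pow(const std::vector<float>& x, float e) {
      std::vector<float> y(x.size());
      for (size_t i = 0; i < x.size(); ++i) y[i] = std::pow(x[i], e);
      return y;
    }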

Add functions for matrix op vector, where op is multiply or divide.
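
A sketch of one plausible broadcast for "matrix op vector" (which axes SINGA 
broadcasts over is not stated here, so the layout is an assumption): multiply each 
row of a row-major matrix element-wise by a length-cols vector, with the divide 
variant analogous.

    #include <vector>

    // Multiply each row of a rows x cols row-major matrix by v (length cols).
    void MultRow(std::vector<float>* m, const std::vector<float>& v,
                 int rows, int cols) {
      for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c)
          (*m)[r * cols + c] *= v[c];
    }

Ops like these let the division step of the softmax composition above be expressed 
as one broadcast call instead of an explicit loop.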

All tests pass.


> Add Activation layer and Softmax layer
> --------------------------------------
>
>                 Key: SINGA-180
>                 URL: https://issues.apache.org/jira/browse/SINGA-180
>             Project: Singa
>          Issue Type: New Feature
>            Reporter: Xin Ji
>
> Activation and Softmax layers are implemented using Tensor math functions.
> CudnnActivation is implemented using both cuDNN 4 and cuDNN 5.
> The CudnnSoftmax layer is implemented using cuDNN 5; its APIs are the same as in cuDNN 4.
> Test files are added for testing the correctness of the above four layers.
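
The description above notes that CudnnActivation must work with both cuDNN 4 and 
cuDNN 5. The two releases pass the activation mode differently (cuDNN 5 introduced 
activation descriptors), so one common pattern is a compile-time version guard; a 
hedged sketch, not SINGA's actual code, with tensor-descriptor setup and error 
checking omitted:

    #include <cudnn.h>

    void ActivationForward(cudnnHandle_t h, cudnnTensorDescriptor_t desc,
                           const float* x, float* y) {
      const float alpha = 1.0f, beta = 0.0f;
    #if CUDNN_VERSION >= 5000
      // cuDNN 5: the mode travels inside an activation descriptor.
      cudnnActivationDescriptor_t act;
      cudnnCreateActivationDescriptor(&act);
      cudnnSetActivationDescriptor(act, CUDNN_ACTIVATION_RELU,
                                   CUDNN_PROPAGATE_NAN, 0.0);
      cudnnActivationForward(h, act, &alpha, desc, x, &beta, desc, y);
      cudnnDestroyActivationDescriptor(act);
    #else
      // cuDNN 4: the mode enum is passed directly.
      cudnnActivationForward(h, CUDNN_ACTIVATION_RELU, &alpha, desc, x,
                             &beta, desc, y);
    #endif
    }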



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
