SINGA-349 Create layer operations for autograd
1. Modified the newly designed API and fixed bugs; it now works on my machine.
2. Relocated some code:
- Integrated operations into autograd.py
- Deleted extra files
- Deleted operations from tensor.py
SINGA-349 Create layer operations for autograd
A CNN example for the newly designed API; it works well on my machine.
Project: http://git-wip-us.apache.org/repos/asf/incubator-singa/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-singa/commit/8146852c
SINGA-349 Create layer operations for autograd
1. Fixed bugs in the newly designed API.
2. Added flags for the training and evaluation processes.
3. Added a configurable initialization method.
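The training/evaluation flag and the configurable initializer mentioned above can be sketched as follows. This is an illustrative NumPy sketch, not SINGA's actual API; the class and function names are assumptions.

```python
import numpy as np

# Hypothetical sketch (not the SINGA API): a layer whose forward pass takes
# a `training` flag, mirroring the flag added for the training/evaluation
# process, plus a pluggable initializer selected by name.
class Dropout:
    def __init__(self, ratio=0.5):
        self.ratio = ratio

    def forward(self, x, training=True):
        if not training:
            # Evaluation: identity, no units are dropped.
            return x
        # Training: drop units and rescale (inverted dropout).
        self.mask = (np.random.rand(*x.shape) >= self.ratio) / (1.0 - self.ratio)
        return x * self.mask

def init_param(shape, initializer="gaussian"):
    # Configurable initialization method, chosen by name.
    if initializer == "gaussian":
        return np.random.normal(0.0, 0.1, shape)
    if initializer == "uniform":
        return np.random.uniform(-0.1, 0.1, shape)
    raise ValueError("unknown initializer: " + initializer)
```

Layers like dropout and batch normalization are the reason such a flag is needed: their behavior differs between training and inference.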
SINGA-349 Create layer operations for autograd
1. Changed the API of the Conv2d operation to PyTorch style; the next step is to
confirm that the new design works.
2. Added flags to the Conv2d forward function.
3. Deleted an extra file.
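"PyTorch style" here refers to configuring the layer by channel counts and kernel geometry in the constructor. A minimal NumPy sketch of such a signature, with a naive direct convolution (illustrative only; names and defaults are assumptions, not SINGA's implementation):

```python
import numpy as np

# Hypothetical PyTorch-style constructor: Conv2d(in_channels, out_channels,
# kernel_size, stride, padding), with a training flag on forward.
class Conv2d:
    def __init__(self, in_channels, out_channels, kernel_size,
                 stride=1, padding=0, bias=True):
        self.stride, self.padding = stride, padding
        k = kernel_size
        self.W = np.random.randn(out_channels, in_channels, k, k) * 0.1
        self.b = np.zeros(out_channels) if bias else None

    def forward(self, x, training=True):
        # Naive direct convolution over an NCHW input.
        n, c, h, w = x.shape
        oc, _, k, _ = self.W.shape
        p, s = self.padding, self.stride
        xp = np.pad(x, ((0, 0), (0, 0), (p, p), (p, p)))
        oh = (h + 2 * p - k) // s + 1
        ow = (w + 2 * p - k) // s + 1
        out = np.zeros((n, oc, oh, ow))
        for i in range(oh):
            for j in range(ow):
                patch = xp[:, :, i * s:i * s + k, j * s:j * s + k]  # (n, c, k, k)
                out[:, :, i, j] = np.tensordot(patch, self.W,
                                               axes=([1, 2, 3], [1, 2, 3]))
        if self.b is not None:
            out += self.b[None, :, None, None]
        return out
```

In the real layer the `training` flag would switch behavior where needed (e.g. for fused normalization); the convolution itself is the same in both modes.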
SINGA-349 Create layer operations for autograd
1. Implemented a simple convolutional network based on autograd for testing.
2. The code runs on my machine, training clearly converges, and the network
parameters are interpretable.
SINGA-349 Create layer operations for autograd
Modified the design of the convolution operation to make it trainable.
Commit: http://git-wip-us.apache.org/repos/asf/incubator-singa/commit/51c242b5
SINGA-349 Create layer operations for autograd
1. Cascaded the newly created layer operations with some existing operations,
such as matmul and softmax, to test compatibility.
2. Tested the autograd engine on the newly developed operations to confirm that
they work.
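A standard way to confirm that cascaded operations and their backward passes work is a numerical gradient check. The sketch below chains matmul and softmax into a cross-entropy loss and compares the analytic gradient with a finite-difference estimate (plain NumPy, not SINGA code):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def loss(x, w, t):
    # Cross-entropy of softmax(x @ w) against one-hot targets t.
    p = softmax(x @ w)
    return -np.sum(t * np.log(p)) / x.shape[0]

def grad_w(x, w, t):
    # Analytic gradient through the matmul -> softmax -> cross-entropy chain:
    # dL/dz = (p - t) / N, so dL/dw = x^T (p - t) / N.
    p = softmax(x @ w)
    return x.T @ (p - t) / x.shape[0]

def numeric_grad_w(x, w, t, eps=1e-5):
    # Central finite differences, one weight at a time.
    g = np.zeros_like(w)
    for idx in np.ndindex(*w.shape):
        w1, w2 = w.copy(), w.copy()
        w1[idx] += eps
        w2[idx] -= eps
        g[idx] = (loss(x, w1, t) - loss(x, w2, t)) / (2 * eps)
    return g
```

If the two gradients agree to within numerical tolerance, the backward passes of the cascaded operations are consistent with their forward passes.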
SINGA-349 Create layer operations for autograd
1. Implemented some layer operations by wrapping PyLayer; these layers run
correctly.
2. Moved layer_ops to python/singa.
SINGA-349 Create layer operations for autograd
1. Added the Xavier initialization method.
2. Packaged the matmul and add_bias operations to form a dense function.
3. Modified examples.
4. Removed an unfriendly API [0].
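The two pieces this commit mentions can be sketched in NumPy (an illustration under assumed names, not SINGA's implementation): Xavier/Glorot initialization, and a dense function built by composing matmul with add_bias.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out):
    # Glorot & Bengio (2010): U(-a, a) with a = sqrt(6 / (fan_in + fan_out)),
    # which keeps activation variance roughly constant across layers.
    a = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-a, a, size=(fan_in, fan_out))

def matmul(x, w):
    return x @ w

def add_bias(x, b):
    return x + b

def dense(x, w, b):
    # dense is simply the composition: add_bias(matmul(x, w), b).
    return add_bias(matmul(x, w), b)
```

Building dense from the two existing autograd operations means its backward pass comes for free from the operations' own gradients.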
SINGA-349 Create layer operations for autograd
1. Rewrote the Linear operation.
2. Avoided absolute paths.
3. Modified the mnist_cnn example.
4. Deleted unnecessary code.
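A Linear operation in the autograd style carries an explicit forward and backward. A minimal NumPy sketch (class and attribute names are assumptions, not SINGA's actual code):

```python
import numpy as np

class Linear:
    def __init__(self, in_features, out_features):
        self.W = np.random.randn(in_features, out_features) * 0.1
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, dy):
        # Chain rule: gradients w.r.t. the parameters and the input.
        self.dW = self.x.T @ dy
        self.db = dy.sum(axis=0)
        return dy @ self.W.T            # gradient flowing back to the input
```

Caching the input in forward is what lets backward compute the parameter gradients without recomputation; an autograd engine stores the same information on the graph.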
SINGA-349 Create layer operations for autograd
1. Modified the Linear operation.
2. Reorganized code: moved Operation and Dummy from tensor.py to autograd.py.
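To give a rough idea of the Operation/Dummy split this commit refers to, here is a toy sketch of that pattern (illustrative only; the real classes live in SINGA's autograd.py and differ in detail):

```python
# Base class: subclasses implement forward/backward; __call__ records the
# inputs so a graph walker can later propagate gradients through the chain.
class Operation:
    def __call__(self, *inputs):
        self.inputs = inputs
        return self.forward(*inputs)

    def forward(self, *inputs):
        raise NotImplementedError

    def backward(self, dy):
        raise NotImplementedError

# A placeholder source node (e.g. wrapping raw input data): identity
# forward, gradient passed through unchanged.
class Dummy(Operation):
    def forward(self, x):
        return x

    def backward(self, dy):
        return dy

# Example concrete operation.
class Add(Operation):
    def forward(self, a, b):
        return a + b

    def backward(self, dy):
        return dy, dy               # one gradient per input
```

Keeping these base classes in autograd.py rather than tensor.py groups the graph machinery in one module and leaves tensor.py to the tensor type itself.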
SINGA-349 Create layer operations for autograd
Cleaned up the code and added comments.
Commit: http://git-wip-us.apache.org/repos/asf/incubator-singa/commit/6d7d629b
Repository: incubator-singa
Updated Branches:
refs/heads/master 6bcd5d0e9 -> 6d7d629bf
SINGA-349 Create layer operations for autograd
1. Layer operations work well.
2. Next steps:
- change the API to PyTorch style
- add flags to the forward and backward functions
- realize