Improve stability of the ELU grad check test. Previously, the input values were drawn from too narrow a range, so an incorrect implementation of the ELU function could have been masked by a passing gradient check. This increases the variance of X to improve the stability and utility of this gradient test case.
Project: http://git-wip-us.apache.org/repos/asf/systemml/repo
Commit: http://git-wip-us.apache.org/repos/asf/systemml/commit/2cee9bb9
Tree: http://git-wip-us.apache.org/repos/asf/systemml/tree/2cee9bb9
Diff: http://git-wip-us.apache.org/repos/asf/systemml/diff/2cee9bb9

Branch: refs/heads/master
Commit: 2cee9bb9f5d8ad43759e747397ba517b0675a7d3
Parents: 91b040d
Author: Mike Dusenberry <[email protected]>
Authored: Thu Mar 8 23:19:38 2018 -0800
Committer: Mike Dusenberry <[email protected]>
Committed: Thu Mar 8 23:19:38 2018 -0800

----------------------------------------------------------------------
 scripts/nn/test/grad_check.dml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/systemml/blob/2cee9bb9/scripts/nn/test/grad_check.dml
----------------------------------------------------------------------
diff --git a/scripts/nn/test/grad_check.dml b/scripts/nn/test/grad_check.dml
index 6150287..8fbfa76 100644
--- a/scripts/nn/test/grad_check.dml
+++ b/scripts/nn/test/grad_check.dml
@@ -2469,7 +2469,7 @@ elu = function() {
   N = 3 # num examples
   M = 10 # num neurons
 
-  X = rand(rows=N, cols=M)
+  X = rand(rows=N, cols=M, min=-5, max=5)
   y = rand(rows=N, cols=M)
 
   out = elu::forward(X, 1)
@@ -2477,7 +2477,7 @@ elu = function() {
   dX = elu::backward(dout, X, 1)
 
   # Grad check
-  h = 1e-6
+  h = 1e-5
   print(" - Grad checking X.")
   for (i in 1:nrow(X)) {
     for (j in 1:ncol(X)) {
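For illustration, the idea behind the patch can be sketched in NumPy rather than DML (the `elu_forward`/`elu_backward` helpers below are hypothetical stand-ins for the SystemML `nn` library functions, not the actual implementation): drawing X from [-5, 5] instead of the default [0, 1) exercises the negative branch of ELU, where forward is alpha*(exp(x)-1) and the gradient is alpha*exp(x), so a wrong implementation of that branch can no longer slip past the central-difference check.

```python
import numpy as np

def elu_forward(X, alpha=1.0):
    # ELU: identity for x > 0, alpha*(exp(x)-1) otherwise
    return np.where(X > 0, X, alpha * (np.exp(X) - 1))

def elu_backward(dout, X, alpha=1.0):
    # Analytic gradient: 1 for x > 0, alpha*exp(x) otherwise
    return dout * np.where(X > 0, 1.0, alpha * np.exp(X))

rng = np.random.default_rng(42)
N, M = 3, 10                               # num examples, num neurons
X = rng.uniform(-5, 5, size=(N, M))        # wide range hits both ELU branches
dout = np.ones((N, M))                     # upstream gradient of sum(out)

dX = elu_backward(dout, X)

# Central-difference numerical gradient of sum(elu_forward(X)) w.r.t. X
h = 1e-5
num_dX = np.zeros_like(X)
for i in range(N):
    for j in range(M):
        Xp = X.copy(); Xp[i, j] += h
        Xm = X.copy(); Xm[i, j] -= h
        num_dX[i, j] = (elu_forward(Xp).sum() - elu_forward(Xm).sum()) / (2 * h)

rel_err = np.abs(dX - num_dX) / np.maximum(np.abs(dX) + np.abs(num_dX), 1e-12)
print(rel_err.max())
```

With X confined to [0, 1), every element takes the identity branch, so a broken negative branch (e.g. a wrong sign on alpha) would still pass; widening the range is what gives the test its discriminating power.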
