[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-05 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r310489393
 
 

 ##
 File path: test/python/test_operation.py
 ##
 @@ -322,6 +335,48 @@ def test_LeakyRelu(self):
 np.testing.assert_array_almost_equal(tensor.to_numpy(result), XT)
 self.check_shape(dx.shape(), (3, 2))
 
+def test_Mean_gpu(self):
+x0 = np.array([-0.9, -0.3, -0.1, 0.1, 0.5, 0.9]).reshape(3, 2).astype(np.float32)
+x1 = np.array([0, -0.3, 0, 0.1, 0, 0.9]).reshape(3, 2).astype(np.float32)
+y = (x0+x1)/2
+lossf = lambda x, y: np.sum((x+y)/2)
+grad = eval_numerical_gradient(lossf, x0, x1)
+grad1 = eval_numerical_gradient(lossf, x1, x0)
 
 Review comment:
   I guess you can use this function:
  ```
  def eval_numerical_gradient_b(f, x, y, reverse=False):
      h = 0.1
      t = y if reverse else x  # the input we perturb
      grad = np.zeros(t.shape)  # note: t.shape, not x.shape, so reverse=True works even if shapes differ
      fx = f(x, y)
      it = np.nditer(t, flags=['multi_index'], op_flags=['readwrite'])
      while not it.finished:
          _it = it.multi_index
          old_value = t[_it]
          t[_it] = old_value + h  # increment by h
          fth = f(x, y)  # evaluate f after the perturbation
          t[_it] = old_value  # restore to previous value (very important!)
          grad[_it] = (fth - fx) / h  # the slope
          it.iternext()  # step to the next element
      return grad
  ```
  If `reverse` is False, this function computes the gradient of x based on f(x, y); if `reverse` is True, it computes the gradient of y, still based on f(x, y).
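  For instance (a sketch, reusing the arrays and the loss from the diff above together with the helper suggested here):
  ```
  import numpy as np

  x0 = np.array([-0.9, -0.3, -0.1, 0.1, 0.5, 0.9]).reshape(3, 2).astype(np.float32)
  x1 = np.array([0, -0.3, 0, 0.1, 0, 0.9]).reshape(3, 2).astype(np.float32)
  lossf = lambda x, y: np.sum((x + y) / 2)

  grad_x0 = eval_numerical_gradient_b(lossf, x0, x1)                # d lossf / d x0
  grad_x1 = eval_numerical_gradient_b(lossf, x0, x1, reverse=True)  # d lossf / d x1
  # for the element-wise mean both gradients are 0.5 everywhere
  ```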


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-05 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r310467053
 
 

 ##
 File path: test/python/test_operation.py
 ##
 @@ -322,6 +335,48 @@ def test_LeakyRelu(self):
 np.testing.assert_array_almost_equal(tensor.to_numpy(result), XT)
 self.check_shape(dx.shape(), (3, 2))
 
+def test_Mean_gpu(self):
+x0 = np.array([-0.9, -0.3, -0.1, 0.1, 0.5, 0.9]).reshape(3, 2).astype(np.float32)
+x1 = np.array([0, -0.3, 0, 0.1, 0, 0.9]).reshape(3, 2).astype(np.float32)
+y = (x0+x1)/2
+lossf = lambda x, y: np.sum((x+y)/2)
+grad = eval_numerical_gradient(lossf, x0, x1)
+grad1 = eval_numerical_gradient(lossf, x1, x0)
 
 Review comment:
  For example, if the function is ```(X + X')/2```: on the first call you calculate the gradient of x0 based on (x0+x1)/2, and on the second call you calculate the gradient of x1 based on (x1+x0)/2. I think the two are still the same.
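  A quick numerical check of that symmetry (a standalone sketch, not part of the PR):
  ```
  import numpy as np

  x0 = np.array([-0.9, -0.3, -0.1, 0.1, 0.5, 0.9]).reshape(3, 2)
  x1 = np.array([0, -0.3, 0, 0.1, 0, 0.9]).reshape(3, 2)
  f = lambda a, b: np.sum((a + b) / 2)

  h = 1e-4
  # gradient of the first argument, called both ways round
  g_first = (f(x0 + h, x1) - f(x0, x1)) / (h * x0.size)
  g_swap = (f(x1 + h, x0) - f(x1, x0)) / (h * x1.size)
  print(g_first, g_swap)  # both 0.5: the mean is symmetric in its inputs
  ```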


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-05 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r310463919
 
 

 ##
 File path: test/python/test_operation.py
 ##
 @@ -322,6 +335,48 @@ def test_LeakyRelu(self):
 np.testing.assert_array_almost_equal(tensor.to_numpy(result), XT)
 self.check_shape(dx.shape(), (3, 2))
 
+def test_Mean_gpu(self):
+x0 = np.array([-0.9, -0.3, -0.1, 0.1, 0.5, 0.9]).reshape(3, 2).astype(np.float32)
+x1 = np.array([0, -0.3, 0, 0.1, 0, 0.9]).reshape(3, 2).astype(np.float32)
+y = (x0+x1)/2
+lossf = lambda x, y: np.sum((x+y)/2)
+grad = eval_numerical_gradient(lossf, x0, x1)
+grad1 = eval_numerical_gradient(lossf, x1, x0)
 
 Review comment:
  You can write it this way here because, for the ADD function, it doesn't matter if you change the order of x0 and x1. But when you write the DIV function, you cannot just switch them, because 'x0/x1' is different from 'x1/x0'.
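  A small sketch of why DIV is different (analytic gradients, standalone numpy):
  ```
  import numpy as np

  x0 = np.array([1.0, 2.0, 4.0])
  x1 = np.array([2.0, 2.0, 2.0])

  # for f = sum(x0 / x1) the two gradients are not interchangeable:
  grad_x0 = 1.0 / x1         # d f / d x0  ->  [ 0.5 ,  0.5,  0.5]
  grad_x1 = -x0 / (x1 ** 2)  # d f / d x1  ->  [-0.25, -0.5, -1.0]
  ```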


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-01 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r309588478
 
 

 ##
 File path: python/singa/autograd.py
 ##
 @@ -354,6 +354,40 @@ def backward(self, dy):
 return singa.__mul__(dy, dx)
 
 
+def relu(x):
+return ReLU()(x)[0]
+
+class Mean(Operation):
 
 Review comment:
   do you need some comments to indicate this function is element-wise mean?
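  One possible wording (a sketch; the docstring text is my suggestion, not from the PR):
  ```
  class Mean(Operation):
      """Element-wise mean of two input tensors: forward(a, b) returns (a + b) / 2."""
  ```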


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-01 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r309588039
 
 

 ##
 File path: test/python/test_operation.py
 ##
 @@ -322,6 +335,46 @@ def test_LeakyRelu(self):
 np.testing.assert_array_almost_equal(tensor.to_numpy(result), XT)
 self.check_shape(dx.shape(), (3, 2))
 
+def test_Mean_gpu(self):
+x0 = np.array([-0.9, -0.3, -0.1, 0.1, 0.5, 0.9]).reshape(3, 2).astype(np.float32)
+x1 = np.array([0, -0.3, 0, 0.1, 0, 0.9]).reshape(3, 2).astype(np.float32)
+y = (x0+x1)/2
+lossf = lambda x, y: np.sum((x+y)/2)
+grad = eval_numerical_gradient(lossf, x0, x1)
+x0 = tensor.from_numpy(x0)
+x1 = tensor.from_numpy(x1)
+x0.to_device(gpu_dev)
+x1.to_device(gpu_dev)
+
+result = autograd.mean(x0, x1)
+dy = tensor.from_numpy(np.ones((3, 2)).astype(np.float32))
+dy.to_device(gpu_dev)
+dx0, dx1 = result.creator.backward(dy.data)
+
+np.testing.assert_array_almost_equal(tensor.to_numpy(result), y, decimal=5)
+np.testing.assert_array_almost_equal(tensor.to_numpy(tensor.from_raw_tensor(dx0)), grad, decimal=2)
+np.testing.assert_array_almost_equal(tensor.to_numpy(tensor.from_raw_tensor(dx1)), grad, decimal=2)
 
 Review comment:
   This grad is the gradient at x0, not x1.
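  A sketch of the fix, using the same helper (the swapped-argument call mirrors what the test's later revision does; `grad1` is a hypothetical name):
  ```
  # gradient of the loss w.r.t. x1; swapping the arguments only works here
  # because the element-wise mean is symmetric in its inputs
  grad1 = eval_numerical_gradient(lossf, x1, x0)
  np.testing.assert_array_almost_equal(
      tensor.to_numpy(tensor.from_raw_tensor(dx1)), grad1, decimal=2)
  ```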


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-01 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r309588325
 
 

 ##
 File path: test/python/test_operation.py
 ##
 @@ -37,7 +37,20 @@
 def _tuple_to_string(t):
 lt = [str(x) for x in t]
 return '(' + ', '.join(lt) + ')'
-
+def eval_numerical_gradient(f, x, y):
 
 Review comment:
   This function calculates the gradient of f at x, not y. Please be careful 
when you use it.
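  A short example of the call convention (a sketch, with names from the test above):
  ```
  # eval_numerical_gradient perturbs only its second argument, so this
  # returns d lossf / d x0, not d lossf / d x1
  grad_x0 = eval_numerical_gradient(lossf, x0, x1)
  ```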


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-singa] joddiy commented on a change in pull request #496: SINGA-474 Mean operator

2019-08-01 Thread GitBox
joddiy commented on a change in pull request #496: SINGA-474 Mean operator
URL: https://github.com/apache/incubator-singa/pull/496#discussion_r309570624
 
 

 ##
 File path: python/singa/autograd.py
 ##
 @@ -354,6 +354,40 @@ def backward(self, dy):
 return singa.__mul__(dy, dx)
 
 
+def relu(x):
+return ReLU()(x)[0]
+
+class Mean(Operation):
+def __init__(self):
 
 Review comment:
   do you need some comments to indicate this function is element-wise mean?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

