Hi!

Toying around with Gradient Descent, I ran into problems using
pderiv_jcalculus_ for second derivatives of a loss function.
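
(For reference: the snippets below assume the calculus addon is loaded. I
believe the load line is the one shown here, but adjust it if your
installation differs.)

   load 'math/calculus'   NB. assumed to be the addon that defines the jcalculus locale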

It seems the problem lies in the case where the verb cannot be
differentiated symbolically, for instance:
   Ddot =: pderiv_jcalculus_

   *: Ddot 2
2"0 NB. fine

   ! Ddot 2
0&(! (2 : 0) 2)
:
NB. Get function to use: u. or higher-order secant
if. n = 1 do. func =. u.@] else. func =. u. derivsecant (n-1) end.
NB. x must be an atom or conform to shape of y
if. 0=#@$x do. x =. ($y)$x end. NB. replicate atom
assert. x -:&$ y NB. shapes must agree
x =. x + 1e_7 * 0 = x NB. replace 0 by epsilon
newy =. y +"(#@$y) (,~$x) $ (#~ 1 j. #) ,x NB. array of moved points, each an array with 1 nonzero
f0 =. x func y NB. the function at the initial point
((x func"(#@$y) newy) -"(#@$f0) f0) % x NB. evaluate function at moved points, calc slope. x used only for higher orders
)

   ! Ddot 2] 3
|value error: derivsecant
|       func=.u. derivsecant(n-1)

The first derivative does work (presumably because the n = 1 branch uses
u.@] and never reaches derivsecant):

   ! Ddot 1]3
7.53671


It appears that adding _jcalculus_ to derivsecant (line 37 of the addon)
also makes ! Ddot 2 ]3 work, but I don't know whether that has any
unpredictable side effects.
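
Concretely, this is the one-line change I tried, quoting the func line from
the display above (a sketch of my local edit, not the shipped addon; my
working assumption is that the derived verb runs outside the jcalculus
locale, so the bare name derivsecant is not found there):

NB. the line as displayed above (line 37 in my copy of the addon)
if. n = 1 do. func =. u.@] else. func =. u. derivsecant (n-1) end.
NB. the change I tried: qualify the name with the addon's locale
if. n = 1 do. func =. u.@] else. func =. u. derivsecant_jcalculus_ (n-1) end.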

Best regards,

Jan-Pieter