Have you seen www.juliadiff.org?

Best,

Tamas

On Tue, Apr 12 2016, Jameson Quinn wrote:

> Tamas is right, automatic differentiation is far better than symbolic 
> differentiation for this problem. But that doesn't really change anything 
> about my question. Both AD and SD require passing number-like objects 
> instead of numbers to an expression, in this case a density function. In 
> the case of AD, it's dual (or hyper) numbers; in the case of SD, it's 
> symbolic variables. So my original question stands: what's the Julian way 
> to have a common declaration for something that may be either a 
> distribution (using numbers) or an AD-able distribution (using dual 
> numbers)? 
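[The "number-like objects" point above can be made concrete with a hand-rolled dual number. This is a minimal sketch only; real code would use ForwardDiff.jl, and all names here are illustrative:]

```julia
# Minimal dual-number sketch: a number-like type carrying a derivative.
struct Dual <: Number
    val::Float64   # primal value
    der::Float64   # derivative with respect to the seeded input
end

# Arithmetic rules propagate derivatives alongside values.
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:-(a::Dual, b::Dual) = Dual(a.val - b.val, a.der - b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.:/(a::Dual, b::Dual) = Dual(a.val / b.val, (a.der * b.val - a.val * b.der) / b.val^2)
Base.log(a::Dual) = Dual(log(a.val), a.der / a.val)
Base.convert(::Type{Dual}, x::Real) = Dual(float(x), 0.0)
Base.promote_rule(::Type{Dual}, ::Type{<:Real}) = Dual

# A Normal log-density written for generic numbers works unchanged on Duals.
normal_logpdf(x, mu, sigma) = -log(sigma) - 0.5*log(2pi) - (x - mu)^2 / (2 * sigma^2)

x = Dual(1.0, 1.0)                 # seed d/dx = 1
d = normal_logpdf(x, 0.0, 1.0)
# d.val is the log-density at x = 1; d.der is the score -(x - mu)/sigma^2 there
```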
>
> On Monday, 11 April 2016 16:06:26 UTC-4, Tamas Papp wrote:
>>
>> Not sure this helps, but AFAICT all practical programs that need 
>> derivatives of the likelihood (or the posterior) use automatic 
>> differentiation (eg Stan). Symbolic calculations get unmanageably 
>> complex very quickly. 
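[As a concrete illustration of the point above, using ForwardDiff.jl (assumed installed; the data and model here are purely illustrative, not anything Stan-specific), the derivative of a generically-written log-likelihood comes out with no symbolic work at all:]

```julia
using ForwardDiff   # assumed available

# iid Normal(mu, 1) log-likelihood, written for generic number types
data = [0.5, -1.2, 0.3]
loglik(mu) = sum(-0.5*log(2pi) - 0.5*(x - mu)^2 for x in data)

# ForwardDiff pushes dual numbers through loglik to get the derivative;
# for this model it equals sum(data .- mu), i.e. sum(data) at mu = 0
g = ForwardDiff.derivative(loglik, 0.0)
```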
>>
>> Best, 
>>
>> Tamas 
>>
>> On Mon, Apr 11 2016, Jameson Quinn wrote: 
>>
>> > I applied to GSoC with an idea about allowing MLE/MPD estimation using 
>> > Mamba models. In order to do this, it would be useful to be able to do 
>> > symbolic calculus on log-pdf values (so as to get the "score", "observed 
>> > information", etc.) I'm thinking about what the right Julian way to do 
>> > this would be. 
>> > 
>> > Say I have a Mamba model which includes the following statement: 
>> > 
>> > 
>> >   alpha = Stochastic(1, 
>> >     (mu_alpha, s2_alpha) -> Normal(mu_alpha, sqrt(s2_alpha)), 
>> >     false 
>> >   ) 
>> > 
>> > I'd like to be able to pass a SymPy.Sym symbolic variable to the lambda 
>> > on the second line, and get a SymbolicNormal object sn; then do 
>> > logpdf(sn, alpha::SymPy.Sym) and get an expression which I could then 
>> > differentiate over any of its three variables (mu_alpha, s2_alpha, or 
>> > alpha). 
>> > 
>> > As far as I can tell, abusing the type hierarchy so that a call to 
>> > Normal(a::SymPy.Sym, b::SymPy.Sym) returns a SymbolicNormal (which is 
>> > not a subtype of Normal but rather a sibling type which inherits from 
>> > AbstractNormal) is probably a no-no. And making Mamba users use some 
>> > GiveMeANormalHere() function instead of just the eponymous Normal() 
>> > constructor is annoying. The next option that comes to mind is to use 
>> > metaprogramming to decompile the lambda and replace the Normal() call 
>> > with a SymbolicNormal() call... Note that this silly decompiling, 
>> > replacing, calculus, etc. could all happen just once, and the actual 
>> > optimization could still be a fast loop. 
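[One more option, at least for the AD route, avoids both the sibling-type trick and the decompiling: if the distribution type and its logpdf are written generically over Real, dual numbers flow through the untouched lambda, since AD duals subtype Real. A sketch — MyNormal and mylogpdf are illustrative stand-ins, not Distributions.jl's API:]

```julia
# A distribution type generic over any Real; AD dual numbers subtype Real,
# so they pass through constructor and logpdf without any code rewriting.
struct MyNormal{T<:Real}
    mu::T
    sigma::T
end
MyNormal(mu::Real, sigma::Real) = MyNormal(promote(mu, sigma)...)

mylogpdf(d::MyNormal, x::Real) =
    -log(d.sigma) - 0.5*log(2pi) - (x - d.mu)^2 / (2*d.sigma^2)

# The same lambda as in the model works unchanged for plain numbers:
f = (mu_alpha, s2_alpha) -> MyNormal(mu_alpha, sqrt(s2_alpha))
lp = mylogpdf(f(0.0, 1.0), 1.0)

# ...and would work for ForwardDiff duals too, e.g. (not run here):
# ForwardDiff.derivative(a -> mylogpdf(f(0.0, 1.0), a), 1.0)
```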
>> > 
>> > Am I going way too crazy with this last idea? If not, how do you get the 
>> > (lowered?) AST for a "stabby lambda"? "code_lowered" and friends do not 
>> > seem to work, at least not in the ways I'd expect them to. 
>> > 
>> > Thanks, 
>> > Jameson Quinn 
>>
>>
