Thanks Eric, I failed to mention that I am one of the main developers of 
Lora (so I basically wrote this README note :) ). I was just trying to 
figure out why Lora fails with Julia 0.5, and I can now see that it is 
almost certainly due to a recent change in ForwardDiff: that package now 
also has a "data" function, which clashes with Lora's own.
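
For anyone hitting the same failure, the symptom is the usual clash between 
two exported names. Here is a minimal sketch of what happens; the modules 
and function bodies below are made up, not the actual Lora/ForwardDiff code:

    module A
    export data
    data(x) = x        # stand-in for one package's "data"
    end

    module B
    export data
    data(x) = 2x       # stand-in for the other package's "data"
    end

    using .A, .B       # plain `using A, B` on Julia 0.4/0.5
    # data(1)          # WARNING: both B and A export "data"; uses must be qualified
    A.data(1)          # qualifying with the module name resolves the ambiguity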

On Sunday, 3 January 2016 00:18:38 UTC, Eric Forgy wrote:
>
> Hi,
>
> I don't use Lora.jl, but opening up the repo on GitHub, I see the tests on 
> master are failing. It's probably a good idea to stick with the latest 
> tagged release.
>
> The README also has this warning:
>
> "Lora has undergone a major upgrade. Some of its recent changes include:
>
> Models are represented internally by graphs.
>
> Memory allocation and garbage collection have been reduced by using 
> mutating functions associated with targets.
>
> Output can be stored either in memory or in a file, selectable at runtime.
>
> Automatic differentiation is available, allowing the user to choose between 
> forward mode and reverse mode (the latter relying on source transformation).
>
> To run the current version of Lora, it is necessary to Pkg.checkout() both 
> Lora and ReverseDiffSource. Both packages will be pushed to METADATA very 
> soon.
>
> Some of the old functionality has not been fully ported. The full porting, 
> as well as further developments, will be completed shortly. 
> Progress is being tracked systematically via issues and milestones.
>
> The documentation is out of date, but will be brought up to date fairly 
> soon. In the meantime, this README file provides a few examples of the new 
> interface, explaining how to get up to speed with the new face of Lora."
>
> Hope this helps.
>
>
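
For anyone else following the thread, the checkout step mentioned in the 
README above amounts to the following (pre-1.0 Pkg API, as shipped with 
Julia 0.4/0.5):

    Pkg.checkout("Lora")              # track Lora's master branch
    Pkg.checkout("ReverseDiffSource") # likewise for ReverseDiffSource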
