On Tue, 18 May 2010, Nachiket Gokhale wrote:

> What's a "sensitivity gradient"? How does it differ from a
> finite-difference gradient? Is it an automatic-differentiation-based
> gradient?



Sorry,

I meant the gradient of the objective function computed using the
sensitivity (the derivative of the solution with respect to one of your
model parameters). The sensitivity approach is an intermediate step
between the FD and adjoint approaches, arising from chain-rule
differentiation of the objective function.

i.e., for an objective function/QOI f(u) that has some nonlinear
dependence g(u) on the solution u = u(m),

   f(u) = \int g(u) dx

the derivative of f with respect to a model parameter m is

   df/dm = \int (dg/du)(du/dm) dx

where du/dm is the sensitivity and dg/du is the derivative of the
nonlinear dependence with respect to the solution.



Do you call it by another name?






>
> -Nachiket
>
> On Tue, May 18, 2010 at 10:46 AM, David Fuentes <[email protected]> wrote:
>
>> On Tue, 18 May 2010, Roy Stogner wrote:
>>
>>>
>>> On Mon, 17 May 2010, David Fuentes wrote:
>>>
>>>> Can the
>>>>
>>>>  src/systems/implicit_system.C
>>>>    ImplicitSystem::qoi_parameter_hessian_vector_product
>>>>
>>>> routine be used
>>>
>>> Unfortunately the answer is probably "no" right here - the forward and
>>> adjoint sensitivity methods are passing verification benchmark tests
>>> right now, but the hessian is failing and the hessian_vector_product
>>> hasn't been tested yet.
>>
>>
>> Thanks Roy,
>>
>> I've always wondered: where are the libMesh regression tests stored?
>> Are they available to the public?
>>
>> For verification benchmarks of my adjoint hessian, I am currently
>>
>>   1) comparing adjoint gradient against finite difference gradient
>>   2) comparing sensitivity gradient against adjoint gradient
>>   3) and finally comparing adjoint hessian calculations against
>>      finite difference of the gradient to compute the hessian
>>
>> Are there any other verification benchmarks you are currently finding
>> useful?
>>
>>
>>
>>
>>>
>>>> w/ a time dependent system where you need to store the
>>>> entire solution history for the state variable, 1st adjoint problem, and
>>>> the linear combination of the "weighted" sensitivity to form the hessian
>>>> vector product?
>>>
>>> And this answer is definitely "no".  Right now adjoint_solve() assumes
>>> you're solving a steady system.
>>>
>>>> In what data structures would you store the solution history?
>>>
>>> Not sure yet (which is part of the reason for that second "no").  For
>>> large systems with implicit solvers the best thing to do is to write
>>> and read (asynchronously) to disk.  For smaller implicit systems we'd
>>> want to keep the whole thing in memory.  We're not planning on any
>>> (efficient) support for explicit systems.
>>
>>
>> What did you have in mind for this?  For smaller implicit systems, I
>> have my own (hacked) version of your TransientSystem class, in which I
>> added a (class-member) std::vector of NumericVectors to store the data.
>> Solving the heat equation, I run out of memory pretty quickly for
>> larger numbers of time steps.
>>
>>
>>
>>
>> df
>>
>>
>>
>>
>>> ---
>>> Roy
>>>
>>
>>
>> ------------------------------------------------------------------------------
>>
>> _______________________________________________
>> Libmesh-users mailing list
>> [email protected]
>> https://lists.sourceforge.net/lists/listinfo/libmesh-users
>>
>
