Andrew,

   Send your entire code. It will be easier and faster than talking past each other.
   Barry

> On Mar 20, 2015, at 5:00 PM, Andrew Spott <[email protected]> wrote:
>
> I'm sorry, I'm not trying to be difficult, but I'm not following.
>
> The manual states (for my special case):
>
>     • u' = A(t)u. Use
>
>         TSSetProblemType(ts,TS_LINEAR);
>         TSSetRHSFunction(ts,NULL,TSComputeRHSFunctionLinear,NULL);
>         TSSetRHSJacobian(ts,A,A,YourComputeRHSJacobian,&appctx);
>
>       where YourComputeRHSJacobian() is a function you provide that computes A as a
>       function of time. Or use ...
>
> My `func` does this. It is 7 lines:
>
>     context* c = static_cast<context*>( G_u );
>     PetscScalar e = c->E( t_ );
>     MatCopy( c->D, A, SAME_NONZERO_PATTERN );
>     MatShift( A, e );
>     MatDiagonalSet( A, c->H0, INSERT_VALUES );
>     MatShift( A, std::complex<double>( 0, -1 ) );
>     return 0;
>
> SHOULD `func` touch U? If so, what should `func` do to U? I thought that
> the RHSJacobian function was only meant to create A, since dG/du = A(t) (for
> this special case).
>
> -Andrew
>
>
> On Fri, Mar 20, 2015 at 3:26 PM, Matthew Knepley <[email protected]> wrote:
> On Fri, Mar 20, 2015 at 3:09 PM, Andrew Spott <[email protected]> wrote:
> So, it doesn't seem that zeroing the given vector in the function passed to
> TSSetRHSJacobian is the problem. When I do that, it just zeroes out the
> solution.
>
> I would think you would zero the residual vector (if you add to it to
> construct the residual, as in FEM methods), not the solution.
>
> The function that is passed to TSSetRHSJacobian has only one responsibility:
> to create the Jacobian. Correct? In my case this is A(t). The solution
> vector is given for when you are solving nonlinear problems (A(t) also
> depends on U(t)). In my case, I don't even look at the solution vector
> (because my A(t) doesn't depend on it).
>
> Are you initializing the Jacobian to 0 first?
>
>    Thanks,
>
>       Matt
>
> Is this the case? Or is there some other responsibility of said function?
>
> -Andrew
>
> > Ah ha!
> >
> > The function passed to TSSetRHSJacobian needs to zero the solution vector?
> >
> > As a point, this isn't mentioned in any documentation that I can find.
> >
> > -Andrew
>
> On Friday, Mar 20, 2015 at 2:17 PM, Matthew Knepley <[email protected]> wrote:
> This sounds like a problem in your calculation function where a Vec or Mat
> does not get reset to 0, but it does in your by-hand code.
>
>    Matt
>
> On Mar 20, 2015 2:52 PM, "Andrew Spott" <[email protected]> wrote:
> I have a fairly simple problem that I'm trying to timestep:
>
>     u' = A(t) u
>
> I'm using the Crank-Nicolson method, which I understand (for this problem)
> to be:
>
>     u(t + h) = u(t) + h/2 [A(t+h)*u(t+h) + A(t)*u(t)]
>
> or
>
>     [1 - h/2 * A(t+h)] u(t+h) = [1 + h/2 * A(t)] u(t)
>
> When I attempt to timestep using PETSc, the norm of `u` blows up. When I do
> it directly (using the above), the norm of `u` doesn't blow up.
>
> It is important to note that the solution generated after the first step is
> identical for both, but the second step for PETSc has a norm of ~2, while for
> the directly calculated version it is ~1. The third step for PETSc has a
> norm of ~4, while for the directly calculated version it is still ~1.
>
> I'm not sure what I'm doing wrong.
>
> The PETSc code is taken out of the manual and is pretty simple:
>
>     TSCreate( comm, &ts );
>     TSSetProblemType( ts, TS_LINEAR );
>     TSSetType( ts, TSCN );
>     TSSetInitialTimeStep( ts, 0, 0.01 );
>     TSSetDuration( ts, 5, 0.03 );
>     TSSetFromOptions( ts );
>     TSSetRHSFunction( ts, NULL, TSComputeRHSFunctionLinear, NULL );
>     TSSetRHSJacobian( ts, A, A, func, &cntx );
>     TSSolve( ts, psi0 );
>
> `func` just constructs A(t) at the time given. The same code for calculating
> A(t) is used in both calculations, along with the same initial vector psi0,
> and the same time steps.
>
> Let me know what other information is needed. I'm not sure what could be the
> problem. `func` doesn't touch U at all (should it?).
>
> -Andrew
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener
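
For reference, a minimal skeleton of the RHSJacobian callback discussed in the quoted thread might look like the sketch below. It assumes the callback signature TSSetRHSJacobian expected at the time of this thread (TS, PetscReal, Vec, Mat, Mat, void*); `AppCtx` and the body comments are placeholders for illustration, not Andrew's actual code. The point it shows is the one made above: the callback only assembles A(t), and it neither reads nor writes the solution vector u.

    #include <petscts.h>

    // Hypothetical user context; stands in for whatever `cntx` holds.
    struct AppCtx { /* problem data needed to build A(t) */ };

    PetscErrorCode YourComputeRHSJacobian(TS ts, PetscReal t, Vec u, Mat A, Mat B, void *ctx)
    {
      AppCtx *appctx = static_cast<AppCtx*>(ctx);
      (void)appctx; (void)u;              // u is unused: the problem is linear in u

      // Fill A with the entries of A(t) here, overwriting (or zeroing) anything
      // left over from the previous call, e.g.
      //   MatZeroEntries(A);
      //   MatSetValues(A, ...);
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
      if (B != A) {                       // here A and B are the same Mat, but keep it general
        MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);
        MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);
      }
      return 0;
    }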

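For comparison, the "direct" Crank-Nicolson update Andrew describes, [I - h/2 A(t+h)] u(t+h) = [I + h/2 A(t)] u(t), could be written with PETSc calls roughly as follows. `buildA`, the KSP, and the work matrices are assumptions for illustration only, not code from the thread.

    #include <petscksp.h>

    // One hand-rolled Crank-Nicolson step:
    //   [I - h/2 A(t+h)] u(t+h) = [I + h/2 A(t)] u(t)
    // `buildA` is a hypothetical routine that assembles A at a given time.
    PetscErrorCode CNStep(Mat A, KSP ksp, Vec u, Vec rhs, PetscReal t, PetscReal h,
                          PetscErrorCode (*buildA)(Mat, PetscReal))
    {
      Mat L, R;

      buildA(A, t);                          // explicit side: R = I + h/2 A(t)
      MatDuplicate(A, MAT_COPY_VALUES, &R);
      MatScale(R, 0.5 * h);
      MatShift(R, 1.0);
      MatMult(R, u, rhs);                    // rhs = [I + h/2 A(t)] u(t)

      buildA(A, t + h);                      // implicit side: L = I - h/2 A(t+h)
      MatDuplicate(A, MAT_COPY_VALUES, &L);
      MatScale(L, -0.5 * h);
      MatShift(L, 1.0);

      KSPSetOperators(ksp, L, L);
      KSPSolve(ksp, rhs, u);                 // u now holds u(t+h)

      MatDestroy(&L);
      MatDestroy(&R);
      return 0;
    }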