Re: [deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-25 Thread Alexander Gary Zimmerman
It looks like I might have left a loose end here. JP indeed answered my question :) Thanks for all of the help. On Friday, April 7, 2017 at 9:52:21 PM UTC+2, Timo Heister wrote: > Just post a question/request and hope for the best ;-) I'm pretty sure that Timo will read this at some

Re: [deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-07 Thread Timo Heister
> Just post a question/request and hope for the best ;-) I'm pretty sure that Timo will read this at some point (won't you Timo?). Wait, what? :-) I am skimming most threads of course. Sorry I didn't reply yet. Are any questions left after JP's reply? > Yes, I agree that the fact that there's

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-07 Thread Jean-Paul Pelteret
Hi Alex, > Yes I began this project by reviewing step-8 and the Handling vector valued problems

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-06 Thread Jean-Paul Pelteret
Ok great, I'm glad that you're getting somewhere. So let's try to get to the bottom of this indexing issue (as a side note, and in case anyone else is interested, in this issue we've started discussing how to document the index-related functions in

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-06 Thread Alex Zimmerman
For the record, this makes much more sense written as a tensor contraction, rather than grouping $(w \cdot \nabla)$. The paper I'm referencing just writes that grouping and the equivalent summation, but skipped over writing it as a tensor contraction, which would have helped me a lot. I wrote
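Spelled out under the indexing convention used elsewhere in this thread, $(\nabla v)_{ij} = \partial v_i / \partial x_j$, the grouping Alex mentions and its contraction form are (a sketch of the identity being described, not a quote from the referenced paper):

```latex
\left[ (w \cdot \nabla)\, v \right]_i
  = \sum_j w_j \, \frac{\partial v_i}{\partial x_j}
  = \sum_j (\nabla v)_{ij} \, w_j
  = \left( (\nabla v)\, w \right)_i
```

That is, $(w \cdot \nabla) v$ is just the velocity gradient contracted with $w$ over its second (derivative) index, which is why the `gradv[i][j] * w[j]` ordering matters below.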

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-06 Thread Alex Zimmerman
Correction! When I tested your idea in my code, I didn't swap the positions of v and w, so I had w on the left and v on the right :) Now I see that you're correct, and I'm sure that writing down the contraction by hand would confirm it. So my question about the indexing of gradv is still open,

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-06 Thread Alex Zimmerman
Thanks for following up. I don't think that double sum = v*gradz*w is equivalent to double sum = 0.; for (unsigned int i = 0; i < dim; ++i) { for (unsigned int j = 0; j < dim; ++j) { sum += _w[j]*_gradz[i][j]*_v[i]; } } Most importantly, making the change definitely

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-05 Thread Jean-Paul Pelteret
Hi Alex, > Thanks for the clarification, Jean-Paul. I'm sorry to say that you shouldn't be too quick in this instance to thank me... > I'm glad to hear that my syntax is correct. > I am indeed using an extractor, e.g. here's a line from my code: > const Tensor<2, dim> gradv

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-05 Thread Alex Zimmerman
Thanks for the clarification, Jean-Paul. I'm glad to hear that my syntax is correct. I am indeed using an extractor, e.g. here's a line from my code: const Tensor<2, dim> gradv = fe_values[velocity].gradient(j, quad); I'm having some trouble finding an example of the S[i][j] syntax in either

[deal.II] Re: Indexing a Tensor<2,dim> representing the gradient of a Tensor<1,dim>

2017-04-05 Thread Jean-Paul Pelteret
Dear Alex, What you've stated is correct. The tensor component $S_{ij}$ is exactly component S[i][j] of a Tensor<2,dim> S if "S" represents $S_{ij} = \partial v_{i} / \partial x_{j}$. As for an example, both steps 18