Sort of. With one dof per variable per DoF ...
--
On Thu, 29 Apr 2010, Lorenzo Botti wrote:
>
> Roy, what's going on here? I thought you said that this option would do
> nodal blocking (aka field interlacing).
>
It does, but in dG dofs belong to elements, not to nodes. There is no way to
obtain nodal blocking...
Lorenzo
--
On Thu, 29 Apr 2010 15:46:54 +0200, Lorenzo Botti wrote:
> Ok, I'm starting to understand.
> So the command line options won't handle the cases where you have more than
> one dof per node or (as in my dG code) per element.
> Even if I use --node_major_dofs I'm getting something like
> u0,u1,u2,u3,v0,v1,v2,v3,w0,w1,w2,w3,p0,p1,p2,p3 in each element for a first
> order approximation.
It *only* handles the case where you have one kind of node, and *all* ...
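[Editor's note: the ordering Lorenzo describes can be contrasted with the nodal interlacing the fieldsplit options expect in a small sketch. This is a hypothetical illustration, not libMesh or PETSc code; the function names are made up.]

```python
# Sketch (illustrative only): a dG element blocks all dofs of one field
# together, while -pc_fieldsplit_block_size expects the fields
# interlaced node by node.

fields = ["u", "v", "w", "p"]

def element_blocked(n_dofs_per_field):
    # dG-style: all u dofs of the element, then all v dofs, ...
    return [f"{f}{i}" for f in fields for i in range(n_dofs_per_field)]

def interlaced(n_nodes):
    # nodal blocking: all fields of node 0, then all fields of node 1, ...
    return [f"{f}{i}" for i in range(n_nodes) for f in fields]

print(element_blocked(2))  # ['u0', 'u1', 'v0', 'v1', 'w0', 'w1', 'p0', 'p1']
print(interlaced(2))       # ['u0', 'v0', 'w0', 'p0', 'u1', 'v1', 'w1', 'p1']
```

The first ordering is what Lorenzo reports seeing per element; only the second is the layout the command-line fieldsplit options assume.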
--
On Thu, 29 Apr 2010 14:18:23 +0200, Lorenzo Botti wrote:
>
> Yeah, specifying the fields with options like -pc_fieldsplit_0_fields
> 0,1,2 requires a collocated discretization where the fields at each node
> are interlaced (this gives better performance anyway). It won't work
> for mixed discretizations, or where the fields are not interlaced. In ...
>
--
On Thu, 29 Apr 2010 13:02:49 +0200, Lorenzo Botti wrote:
> Thank you for the hint.
> I also had to set -pc_type fieldsplit, just in case someone else wants to
> give it a try...
>
> Is there a way to monitor the convergence of the solvers?
> Another question:
> is bs, the block size, always equal to the total number of fields?
> Sorry, but I'm a beginner with PETSc.
> L
The most common monitors are -ksp_monitor, -ksp_monitor_true_residual ...
--
On Wed, 28 Apr 2010 12:06:47 +0200, Lorenzo Botti wrote:
> If my variables are u v w p (3D Stokes equations) and I have an equal-order
> approximation, can I just do
>
> -pc_fieldsplit_0_fields 0,1,2
> -pc_fieldsplit_1_fields 3
> -pc_fieldsplit_type schur
> -pc_fieldsplit_block_size
Yes, and y ...
--
Sorry to bother you, just to understand if I got it.
> If your variables are interlaced like
>
> u0,v0,w0,u1,v1,w1,...
>
> Then you can (a) set a block size for the matrix with MatSetBlockSize,
> (b) use BAIJ, in which case the block size is built in, or (c) use
> -pc_fieldsplit_block_size. Then t ...
--
On Thu, 22 Apr 2010 10:42:52 -0400, Boyce Griffith wrote:
> OK, then I am apparently confused about how PCFieldSplitSetIS is
> intended to work. Does one call PCFieldSplitSetIS once for each field?
Call it once per split (it creates a new split each time you call it).
Jed
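[Editor's note: a sketch of what "once per split" means for the indices. This is plain Python doing the index arithmetic only; in PETSc one would wrap each list in an IS and pass it to PCFieldSplitSetIS, one call per split. A split may group several fields, e.g. velocity = fields 0,1,2 and pressure = field 3 for an interlaced Stokes layout.]

```python
# Sketch (illustrative only): with an interlaced layout of bs fields,
# the indices belonging to a split are every occurrence of its fields
# within each block. One such list per split, not one per field.

def split_indices(split_fields, bs, n_blocks):
    """Global indices of the given fields in an interlaced vector."""
    return [b * bs + f for b in range(n_blocks) for f in split_fields]

# 3D Stokes with bs = 4 (u, v, w, p) and 2 nodes:
velocity = split_indices([0, 1, 2], bs=4, n_blocks=2)
pressure = split_indices([3], bs=4, n_blocks=2)
print(velocity)  # [0, 1, 2, 4, 5, 6]
print(pressure)  # [3, 7]
```

Two splits here, so two calls to PCFieldSplitSetIS, even though the first split contains three fields.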
--
On Thu, 22 Apr 2010 09:32:12 -0400, Boyce Griffith wrote:
> How should one set up the index set? Is it just a mapping from Vec index
> to field index? E.g., each Vec entry is assigned to field 0 or 1 (or 2
> or ...)?
If your variables are interlaced like

u0,v0,w0,u1,v1,w1,...

then you can (a) set a block size for the matrix with MatSetBlockSize,
(b) use BAIJ, in which case the block size is built in, or (c) use
-pc_fieldsplit_block_size.
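[Editor's note: the reason declaring a block size is enough is that, with this interlaced layout, the Vec-index-to-field map asked about above is implicit: field = index mod bs. A minimal sketch, plain Python and illustrative only.]

```python
# With an interlaced layout u0,v0,w0,u1,v1,w1,... and block size bs,
# the field a Vec entry belongs to is its index modulo bs, so no
# explicit index-to-field table is needed.

bs = 3  # three interlaced fields: u, v, w

def field_of(vec_index, bs):
    # field index of a Vec entry in an interlaced, blocked layout
    return vec_index % bs

layout = ["u0", "v0", "w0", "u1", "v1", "w1"]
print([field_of(i, bs) for i in range(len(layout))])  # [0, 1, 2, 0, 1, 2]
```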
--
On Tue, 20 Apr 2010 22:37:17 -0400, Boyce Griffith wrote:
Hi, Folks --

I have a system involving two variables for which I would like to use a
preconditioner which involves solvers for the two variables separately.
(E.g., block Jacobi or block Gauss-Seidel using one subdomain per
variable.) Is there a recommended way to do this other than making a ...