A question about PCASMGetSubKSP(): the documentation says that we must call
KSPSetUp() before calling PCASMGetSubKSP(). But which KSP should be set up,
the KSP that uses the PCASM, or something else?
Thanks!
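For reference, a minimal sketch of the calling order in question, assuming it is the outer KSP (the one whose PC is the PCASM) that must be set up; the function name is illustrative and KSPSetOperators() uses the four-argument signature of the petsc-dev of that time:

  #include <petscksp.h>

  /* Sketch: set up the outer KSP that owns the PCASM, then retrieve the
     subdomain solvers. */
  PetscErrorCode get_subksp_sketch(Mat A,Vec b,Vec x)
  {
    PetscErrorCode ierr;
    KSP            ksp;      /* outer solver whose PC is the PCASM */
    PC             pc;
    KSP            *subksp;  /* subdomain solvers, owned by the PC */
    PetscInt       nlocal,first;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

    /* KSPSetUp() on this outer KSP also sets up the PCASM; only after this
       do the subdomain KSPs exist. */
    ierr = KSPSetUp(ksp);CHKERRQ(ierr);
    ierr = PCASMGetSubKSP(pc,&nlocal,&first,&subksp);CHKERRQ(ierr);
    /* ... customize subksp[0..nlocal-1] here if desired ... */

    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    return 0;
  }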
>>
>> On Tue, May 15, 2012 at 10:30 AM, Hui Zhang <mike.hui.zhang at hotmail.com>
>> wrote:
>> thanks for reply!
>> On May 15, 2012, at 5:19 PM, Dmitry Karpeev wrote:
>>
>>>
>>>
>>> On Tue, May 15, 2012 at 9:55 AM, Hui Zhang <mike.hui.zhang at hotmail.com>
>>> wrote:
>>> Dmitry,
>>>
>>> thanks for the reminder. I have a new question about PCASM / PCGASM:
>>>
>>> can I get the restricted extension operators, which map an overlapping
>>> subdomain solution to the global domain?
>>>
>>> I'm not exactly sure what you mean.
>>>
>>> Are you talking about embedding the subdomain vectors back into the
>>> original vector?
>>
>> Yes, exactly.
>>
>>> If so, there is a substantial difference in how this is handled in ASM and
>>> GASM:
>>> ASM has a bunch of sequential vectors that can be scattered back into the
>>> global vector,
>>
>> Yes. Is there a method to get the scatter?
>>
>> In the ASM case it's a bunch of scatters -- one for each subdomain.
>> Currently there is no method to
>> retrieve them.
>
> this hint is very helpful. Thanks!
>
>> What requires this functionality?
>
> I am writing a modified ASM method. In the construction of the
> energy-minimizing coarse basis, I need to solve the individual subdomain
> problems and, rather than summing them, extend them separately.
> I wonder whether you guys have ever built this coarse basis.
> Is there a reference where the basis is described?
> Dmitry.
>
> Thanks,
> Hui
>
>>
>> In the ASM case you can construct the scatters yourself easily enough,
>> since you have all of the requisite information -- the array of subdomain
>> ISs and the global vector x.
>> The only piece of data you might not have is the set of outer subdomains
>> obtained by increasing the overlap of the original inner (nonoverlapping)
>> subdomains.
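For concreteness, a minimal sketch of constructing one such scatter by hand, assuming an overlapping-subdomain index set is_ov and the global vector x; the function and variable names are illustrative:

  #include <petscvec.h>

  /* Sketch: a scatter between the global vector x and a sequential vector
     holding one overlapping subdomain, given that subdomain's index set. */
  PetscErrorCode build_subdomain_scatter(Vec x,IS is_ov,Vec *xsub,VecScatter *sct)
  {
    PetscErrorCode ierr;
    PetscInt       n;

    ierr = ISGetLocalSize(is_ov,&n);CHKERRQ(ierr);
    ierr = VecCreateSeq(PETSC_COMM_SELF,n,xsub);CHKERRQ(ierr);
    /* Scatter from x (indexed by is_ov) to all entries of the subdomain
       vector, in order (a NULL index set means "all entries"). */
    ierr = VecScatterCreate(x,is_ov,*xsub,PETSC_NULL,sct);CHKERRQ(ierr);
    return 0;
  }

Restriction is then VecScatterBegin()/VecScatterEnd() with SCATTER_FORWARD; to extend a single subdomain solution back into the global vector without summing the other subdomains, the same scatter can be applied with SCATTER_REVERSE and INSERT_VALUES (ADD_VALUES would sum the overlapping contributions, as ASM itself does).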
>>
>>> because the subdomains are always local to at most one processor.
>>>
>>> In the GASM case this is rather different, since the subdomains can live on
>>> arbitrary subcommunicators
>>> and there is only one scatter, which is applied to the direct sum of all
>>> the subdomain vectors on the original communicator. I'm not sure how useful
>>> that last scatter would be for you, since the details of the structure
>>> of the direct sum vector are internal to GASM.
>>
>> I would prefer to have the scatter for each individual subdomain, before the
>> direct sum.
>> But if I can get the scatter that PCGASM has, maybe it is still useful. Could
>> you tell me how to get it?
>> There are no individual subdomain scatters, but, as in the case of ASM, you
>> can construct them
>> easily enough, except that those would have to operate on subcommunicators.
>> In GASM we pack them into a single scatter on the original communicator.
>> Currently there is no method
>> to expose this scatter. Why do you need this functionality?
>>
>> Dmitry.
>> Thanks!
>>
>>>
>>> Dmitry.
>>>
>>> Thanks!
>>>
>>> On May 15, 2012, at 3:29 PM, Dmitry Karpeev wrote:
>>>
>>>> There are some additional minor fixes that mostly have to do with
>>>> outputting the subdomain information with -pc_gasm_view_subdomains (in
>>>> PCView()) and with -pc_gasm_print_subdomains (during PCSetUp()).
>>>> You might want to pull those latest patches, but it won't interfere with
>>>> your work if you don't use subdomain output.
>>>>
>>>> Thanks.
>>>> Dmitry.
>>>>
>>>> On Tue, May 15, 2012 at 7:14 AM, Hui Zhang <mike.hui.zhang at hotmail.com>
>>>> wrote:
>>>> Dmitry,
>>>>
>>>> thanks for the reply. I re-downloaded the code and tried it again, and now
>>>> it works correctly!
>>>>
>>>> Everything seems ok.
>>>>
>>>> Thanks,
>>>> Hui
>>>>
>>>>
>>>> On May 15, 2012, at 2:01 PM, Dmitry Karpeev wrote:
>>>>
>>>>> Hui,
>>>>> I'm trying to reproduce this problem, unsuccessfully, so far.
>>>>> One thing that looks odd is that the output below claims the PC is of
>>>>> type "asm", even though you are running with -dd_type gasm. Could you
>>>>> verify that's the correct output?
>>>>>
>>>>> Here's the output I get with
>>>>> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec -np 1 ./gasm_test -n 64 -dd_type asm -dd_ksp_view
>>>>>
>>>>> PC Object:(dd_) 1 MPI processes
>>>>> type: asm
>>>>> Additive Schwarz: total subdomain blocks = 2, user-defined overlap
>>>>> Additive Schwarz: restriction/interpolation type - RESTRICT
>>>>> Local solve is same for all blocks, in the following KSP and PC
>>>>> objects:
>>>>> KSP Object: (dd_sub_) 1 MPI processes
>>>>> type: preonly
>>>>> maximum iterations=10000, initial guess is zero
>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>>>>> left preconditioning
>>>>> using NONE norm type for convergence test
>>>>> <snip>
>>>>>
>>>>> and with
>>>>> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec -np 1 ./gasm_test -n 64 -dd_type gasm -dd_ksp_view
>>>>>
>>>>> PC Object:(dd_) 1 MPI processes
>>>>> type: gasm
>>>>> Generalized additive Schwarz:
>>>>> Restriction/interpolation type: RESTRICT
>>>>> user-defined overlap
>>>>> total number of subdomains = 2
>>>>> number of local subdomains = 2
>>>>> max number of local subdomains = 2
>>>>> [0:1] number of locally-supported subdomains = 2
>>>>> Subdomain solver info is as follows:
>>>>> <snip>
>>>>>
>>>>> What convergence are you seeing with the two PC types? It should be the
>>>>> same with 1 and 2 procs for both PCASM and PCGASM.
>>>>>
>>>>> Thanks.
>>>>> Dmitry.
>>>>>
>>>>> On Tue, May 15, 2012 at 4:03 AM, Hui Zhang <mike.hui.zhang at
>>>>> hotmail.com> wrote:
>>>>> Dmitry,
>>>>>
>>>>> I got the newest petsc-dev and ran the test with
>>>>>
>>>>> mpirun -np 1 ./gasm_test -dd_type gasm -n 64 -dd_ksp_view
>>>>>
>>>>> which gives the following output
>>>>>
>>>>> PC Object:(dd_) 1 MPI processes
>>>>> type: asm
>>>>> Additive Schwarz: total subdomain blocks = 1, amount of overlap = 1
>>>>> ^^^ note the number of subdomain blocks above: it should be 2
>>>>>
>>>>> PCASM, by contrast, has no such problem.
>>>>>
>>>>> Thanks,
>>>>> Hui
>>>>>
>>>>>
>>>>>
>>>>>> Hui,
>>>>>>
>>>>>> The convergence issue should be resolved in the latest petsc-dev.
>>>>>> I'm attaching a slightly modified gasm_test.c (reflecting some upcoming
>>>>>> API changes)
>>>>>> that should verify that.
>>>>>>
>>>>>> Let me know if it works for you.
>>>>>> Thanks.
>>>>>> Dmitry.
>>>>>> On Fri, May 11, 2012 at 12:31 PM, Hui Zhang <mike.hui.zhang at
>>>>>> hotmail.com> wrote:
>>>>>> Hi Dmitry,
>>>>>>
>>>>>> thanks for useful hints. Good day!
>>>>>>
>>>>>> Hui
>>>>>>
>>>>>> On May 11, 2012, at 7:17 PM, Dmitry Karpeev wrote:
>>>>>>
>>>>>>> You can call PCSetUp(pc) on either ASM or GASM, and that will destroy
>>>>>>> and recreate the matrices (including calling
>>>>>>> your modification subroutine), but not the subdomains or the subdomain
>>>>>>> solvers.
>>>>>>> If you just want to modify the submatrices, you can call
>>>>>>> PC(G)ASMGetSubmatrices() and modify the matrices it returns
>>>>>>> (in the same order as the subdomains were set). That's a bit of a hack,
>>>>>>> since you will essentially be modifying the PC's internal data
>>>>>>> structures. As long as you are careful, you should be okay, since you
>>>>>>> already effectively have the same type of access to the submatrices
>>>>>>> through the Modify callback.
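For illustration, a minimal sketch of the first option above (re-using the PC and changing only the user context of the modification routine); MyModifySubMatrices, MyCtx and the diagonal shift are purely illustrative, and the routine follows the func(PC,PetscInt,IS*,IS*,Mat*,void*) prototype quoted further down this thread:

  #include <petscksp.h>

  /* Illustrative context and modification routine; the diagonal shift is just
     a placeholder for whatever modification is actually wanted. */
  typedef struct { PetscScalar shift; } MyCtx;

  PetscErrorCode MyModifySubMatrices(PC pc,PetscInt nsub,IS *row,IS *col,Mat *submat,void *ctx)
  {
    MyCtx          *user = (MyCtx*)ctx;
    PetscErrorCode ierr;
    PetscInt       i;

    for (i=0; i<nsub; i++) {
      ierr = MatShift(submat[i],user->shift);CHKERRQ(ierr);
    }
    return 0;
  }

  /* After the first solve, change only the context and re-set-up the PC, as
     suggested above (ksp, pc, b, x and myctx are assumed to exist already). */
  PetscErrorCode resolve_with_new_context(KSP ksp,PC pc,Vec b,Vec x,MyCtx *myctx)
  {
    PetscErrorCode ierr;

    myctx->shift = 1.0e-3;  /* illustrative new value */
    ierr = PCSetModifySubMatrices(pc,MyModifySubMatrices,myctx);CHKERRQ(ierr);
    ierr = PCSetUp(pc);CHKERRQ(ierr);  /* recreates submatrices, calls the routine again */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    return 0;
  }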
>>>>>>>
>>>>>>> Dmitry.
>>>>>>>
>>>>>>> On Fri, May 11, 2012 at 11:52 AM, Hui Zhang <mike.hui.zhang at
>>>>>>> hotmail.com> wrote:
>>>>>>> I just have a question about reuse of PCASM or PCGASM.
>>>>>>> Suppose I have set up the PCASM and the related KSP and solved once.
>>>>>>> Next, for the same linear system (matrix and RHS), I just want PCASM to
>>>>>>> modify the submatrices (PCSetModifySubmatrices) in a different way, using
>>>>>>> the same modification routine but with a different user context for it.
>>>>>>>
>>>>>>> What can I do for this task? Currently, I destroy the KSP and reconstruct
>>>>>>> it. I guess that even for PCASM I can reuse it, because the partition into
>>>>>>> subdomains remains the same.
>>>>>>>
>>>>>>> Thanks!
>>>>>>>
>>>>>>>
>>>>>>> On May 10, 2012, at 6:37 PM, Dmitry Karpeev wrote:
>>>>>>>
>>>>>>>> Hui,
>>>>>>>> There've been several changes to PCGASM ahead of the new release.
>>>>>>>> Let me go back and see if it affected the convergence problem.
>>>>>>>> Dmitry.
>>>>>>>>
>>>>>>>> On Thu, May 10, 2012 at 4:16 AM, Hui Zhang <mike.hui.zhang at
>>>>>>>> hotmail.com> wrote:
>>>>>>>> Hi Dmitry,
>>>>>>>>
>>>>>>>> is there any news about PCGASM?
>>>>>>>>
>>>>>>>> thanks,
>>>>>>>> Hui
>>>>>>>>
>>>>>>>> On Feb 20, 2012, at 6:38 PM, Dmitry Karpeev wrote:
>>>>>>>>
>>>>>>>>> Okay, thanks.
>>>>>>>>> I'll take a look.
>>>>>>>>>
>>>>>>>>> Dmitry.
>>>>>>>>>
>>>>>>>>> On Mon, Feb 20, 2012 at 11:30 AM, Hui Zhang <mike.hui.zhang at
>>>>>>>>> hotmail.com> wrote:
>>>>>>>>> For reference, my results are attached.
>>>>>>>>>
>>>>>>>>> asm1.txt for asm with 1 process,
>>>>>>>>> asm2.txt for asm with 2 processes,
>>>>>>>>> gasm1.txt for gasm with 1 process, (with the iteration numbers
>>>>>>>>> different from others)
>>>>>>>>> gasm2.txt for gasm with 2 processes
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> thank you,
>>>>>>>>> Hui
>>>>>>>>>
>>>>>>>>> On Feb 20, 2012, at 3:06 PM, Dmitry Karpeev wrote:
>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Mon, Feb 20, 2012 at 12:59 AM, Hui Zhang <mike.hui.zhang at
>>>>>>>>>> hotmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> On Feb 20, 2012, at 12:41 AM, Dmitry Karpeev wrote:
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Feb 19, 2012 at 3:08 PM, Hui Zhang <mike.hui.zhang at
>>>>>>>>>>> hotmail.com> wrote:
>>>>>>>>>>> I have a new problem: the results from ASM and GASM are different, and it
>>>>>>>>>>> seems GASM has something wrong with SetModifySubMatrices. The numerical
>>>>>>>>>>> tests are with each subdomain supported by only one processor. There are
>>>>>>>>>>> no problems when I do not modify the submatrices, but when I do, there
>>>>>>>>>>> are problems with GASM and none with ASM.
>>>>>>>>>>>
>>>>>>>>>>> For example, I use two subdomains. In the first case each subdomain is
>>>>>>>>>>> supported by one processor, and there seems to be no problem with GASM.
>>>>>>>>>>> But when I run my program with only one processor, so that it supports
>>>>>>>>>>> both subdomains, the iteration count is different from the first case
>>>>>>>>>>> and is much larger. ASM, on the other hand, has no such problem.
>>>>>>>>>>>
>>>>>>>>>>> Are the solutions the same?
>>>>>>>>>>> What problem are you solving?
>>>>>>>>>>
>>>>>>>>>> Yes, the solutions are the same. That's why ASM gives the same results
>>>>>>>>>> with one or two processors. But GASM does not.
>>>>>>>>>> Sorry, I wasn't clear: ASM and GASM produced different solutions in
>>>>>>>>>> the case of two domains per processor?
>>>>>>>>>> I'm solving the Helmholtz equation. Maybe
>>>>>>>>>> I can prepare a simpler example to show this difference.
>>>>>>>>>> That would be helpful.
>>>>>>>>>> Thanks.
>>>>>>>>>>
>>>>>>>>>> Dmitry.
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Dmitry.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Feb 15, 2012, at 6:46 PM, Dmitry Karpeev wrote:
>>>>>>>>>>>
>>>>>>>>>>>> You should be able to.
>>>>>>>>>>>> This behavior is the same as in PCASM,
>>>>>>>>>>>> except in GASM the matrices live on subcommunicators.
>>>>>>>>>>>> I am in transit right now, but I can take a closer look on Friday.
>>>>>>>>>>>>
>>>>>>>>>>>> Dmitry
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Feb 15, 2012, at 8:07, Hui Zhang <mike.hui.zhang at
>>>>>>>>>>>> hotmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> On Feb 15, 2012, at 11:19 AM, Hui Zhang wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Hi Dmitry,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks a lot! Currently, I'm not using ISColoring. Another question comes
>>>>>>>>>>>>>> up about PCGASMSetModifySubMatrices(). The user-provided function has
>>>>>>>>>>>>>> the prototype
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>   func(PC pc, PetscInt nsub, IS *row, IS *col, Mat *submat, void *ctx);
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I think the columns from the parameter 'col' are always the same as the
>>>>>>>>>>>>>> rows from the parameter 'row', because PCGASMSetLocalSubdomains() only
>>>>>>>>>>>>>> accepts index sets, not separate rows and columns. Have I misunderstood
>>>>>>>>>>>>>> something?
>>>>>>>>>>>>>
>>>>>>>>>>>>> In my tests, the row and col index sets are always the same.
>>>>>>>>>>>>>
>>>>>>>>>>>>> I have a new question: am I allowed to call SetLocalToGlobalMapping()
>>>>>>>>>>>>> on the submats in the above func()?
>>>>>>>>>>>>>
>>>>>>>>>>>>> thanks,
>>>>>>>>>>>>> Hui
>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks,
>>>>>>>>>>>>>> Hui
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Feb 11, 2012, at 3:36 PM, Dmitry Karpeev wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Yes, that's right.
>>>>>>>>>>>>>>> There is no good way to help the user assemble the subdomains
>>>>>>>>>>>>>>> at the moment beyond the 2D stuff.
>>>>>>>>>>>>>>> It is expected that they are generated from mesh subdomains.
>>>>>>>>>>>>>>> Each IS does carry the subdomain's subcomm.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> There is ISColoringToList() that is supposed to convert a
>>>>>>>>>>>>>>> "coloring" of indices to an array of ISs,
>>>>>>>>>>>>>>> each having the indices with the same color and the subcomm
>>>>>>>>>>>>>>> that supports that color. It is
>>>>>>>>>>>>>>> largely untested, though. You could try using it and give us
>>>>>>>>>>>>>>> feedback on any problems you encounter.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Dmitry.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Sat, Feb 11, 2012 at 6:06 AM, Hui Zhang <mike.hui.zhang at
>>>>>>>>>>>>>>> hotmail.com> wrote:
>>>>>>>>>>>>>>> About PCGASMSetLocalSubdomains(): in the case of one subdomain supported
>>>>>>>>>>>>>>> by multiple processors, shall I always create the arguments 'is[s]' and
>>>>>>>>>>>>>>> 'is_local[s]' on a subcommunicator consisting of the processors that
>>>>>>>>>>>>>>> support the subdomain 's'?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> The source code of PCGASMCreateSubdomains2D() seems to do so.
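For illustration, a minimal sketch of creating such an index set on a subcommunicator; the communicator split, the color, and the index range are purely illustrative, and the exact arguments expected by PCGASMSetLocalSubdomains() should be checked against the petsc-dev in use:

  #include <petscis.h>

  /* Sketch: build a subcommunicator of the ranks that support subdomain 's'
     and create that subdomain's IS on it.  The caller owns 'subcomm' and
     frees it when it is no longer needed. */
  PetscErrorCode create_subdomain_is(MPI_Comm comm,PetscInt color,PetscInt start,
                                     PetscInt nidx,MPI_Comm *subcomm,IS *is)
  {
    PetscErrorCode ierr;
    PetscMPIInt    rank;
    PetscInt       i,*idx;

    ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
    /* ranks that support the same subdomain pass the same color */
    ierr = MPI_Comm_split(comm,(PetscMPIInt)color,rank,subcomm);CHKERRQ(ierr);

    ierr = PetscMalloc(nidx*sizeof(PetscInt),&idx);CHKERRQ(ierr);
    for (i=0; i<nidx; i++) idx[i] = start + i;  /* this rank's share of the subdomain */
    ierr = ISCreateGeneral(*subcomm,nidx,idx,PETSC_COPY_VALUES,is);CHKERRQ(ierr);
    ierr = PetscFree(idx);CHKERRQ(ierr);
    return 0;
  }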
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>>> Hui
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>> <gasm_test.c>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>
>