On Friday, 7 October 2016, frank wrote:
> Dear all,
>
> Thank you so much for the advice.
>
> All setup is done in the first solve.
>
>
>> ** The time for 1st solve does not scale.
>> In practice, I am solving a variable-coefficient Poisson equation. I
>> need to build the
丁老师 writes:
> Dear professor:
> How do I get the last absolute residual that has been computed? Is it
> SNESGetFunctionNorm()?
Barry Smith writes:
> There is still something wonky here, whether it is the MPI implementation
> or how PETSc handles the assembly. Without any values that need to be
> communicated, it is unacceptable that these calls take so long. If we
> understood __exactly__ why
Use KSPGetResidualNorm(); if you wish the true (and not the preconditioned)
residual, you must call KSPSetNormType() before the KSPSolve().
SNESGetFunctionNorm()
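A minimal sketch of how these calls fit together, sticking to the KSP calls
named above (ksp, b, and x are illustrative placeholders, assumed already
created and configured):

    PetscReal rnorm;

    /* Ask for the unpreconditioned (true) residual norm; this must be set
       before KSPSolve(), otherwise left-preconditioned methods report the
       preconditioned norm. */
    ierr = KSPSetNormType(ksp, KSP_NORM_UNPRECONDITIONED);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

    /* Retrieve the last residual norm computed during the solve. */
    ierr = KSPGetResidualNorm(ksp, &rnorm);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD, "last residual norm: %g\n",
                       (double)rnorm);CHKERRQ(ierr);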
> On Oct 7, 2016, at 10:44 PM, Jed Brown wrote:
>
> Barry Smith writes:
>> VecAssemblyBegin/End() does a couple of all-reduces and then message
>> passing (if values need to be moved) to get the values onto the correct
>> processes. So these calls
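For context, the assembly pattern under discussion looks like the sketch
below (illustrative only; v, n, indices, and values are placeholders). The
point is that VecAssemblyBegin/End() are collective, so the all-reduces
happen even when, as in frank's case, every process sets only entries it
owns and no values actually need to move:

    /* Stash values; they may target entries owned by other processes. */
    ierr = VecSetValues(v, n, indices, values, INSERT_VALUES);CHKERRQ(ierr);
    /* Collective: all-reduces to agree whether any stashed values must be
       communicated, then message passing only if some actually do. */
    ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(v);CHKERRQ(ierr);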
On Oct 7, 2016, at 6:41 PM, frank wrote:

Hello,

Another thing: the vector assembly and scatter take more time as I
increase the number of cores:

cores#                         4096       8192       16384      32768      65536
VecAssemblyBegin (298 calls)   2.91E+00   2.87E+00   8.59E+00   2.75E+01
Fande,

If you can reproduce the problem with PETSc 3.7.4, please send us sample
code that produces it so we can work with Sherry to get it fixed ASAP.

Barry
Hi guys,

is there any news about fixing the buggy behavior of SuperLU_DIST, exactly
what is described here:

http://lists.mcs.anl.gov/pipermail/petsc-users/2015-August/026802.html ?

I'm using 3.7.4 and still get a SEGV in the pdgssvx routine. Everything
works fine with 3.5.4.

Do I still have to
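For reference, a sketch of how SuperLU_DIST is typically selected from
PETSc (3.7-era API; ksp, A, b, and x are placeholders, and this is not
Anton's actual code):

    /* Use a parallel direct LU factorization backed by SuperLU_DIST. */
    PC pc;
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);CHKERRQ(ierr);
    /* The reported SEGV is inside SuperLU_DIST's pdgssvx(), which PETSc
       invokes during the numeric factorization triggered by KSPSolve(). */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

The same selection can be made on the command line with
-pc_type lu -pc_factor_mat_solver_package superlu_dist.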
On 7 October 2016 at 02:05, Matthew Knepley wrote:
> On Thu, Oct 6, 2016 at 7:33 PM, frank wrote:
>
>> Dear Dave,
>>
>> Following your advice, I solved the identical equation twice and timed
>> the two steps separately. The result is below:
>>
>> Test: 1024^3 grid
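One way to time the two solves separately, as described here, is with
PETSc's logging stages (a sketch; the stage names are invented and this is
not frank's actual code). Run with -log_summary so each stage is reported
on its own:

    PetscLogStage stage1, stage2;
    ierr = PetscLogStageRegister("Solve 1 (pays setup)", &stage1);CHKERRQ(ierr);
    ierr = PetscLogStageRegister("Solve 2 (setup amortized)", &stage2);CHKERRQ(ierr);

    ierr = PetscLogStagePush(stage1);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);  /* first solve: includes all setup */
    ierr = PetscLogStagePop();CHKERRQ(ierr);

    ierr = PetscLogStagePush(stage2);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);  /* identical solve: setup already done */
    ierr = PetscLogStagePop();CHKERRQ(ierr);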