On 22/05/18 17:11, Xiaoye S. Li wrote:
> Numerical factorization is always parallel (based on the number of MPI
> tasks and OMP_NUM_THREADS you set); the issue here is only related to
> symbolic factorization (figuring out the nonzero pattern in the LU
> factors). Default setting is to use
> On 24 May 2018, at 06:24, Michael Becker
> wrote:
>
> Could you have a look at the attached log_view files and tell me if something
> is particularly odd? The system size per processor is 30^3 and the simulation
> ran over 1000 timesteps, which means
Closing date is the 29th July 2018.
For informal enquiries, please contact David Ham.
Cheers,
Lawrence Mitchell
> On 6 Jul 2018, at 17:30, zakaryah wrote:
>
> Thanks for your help, Barry.
>
> I agree about the preconditioning. I still don't understand why I don't need
> a particular solver for my shell matrix. My reasoning is that KSP is easy
> with M but difficult with A, since A has a dense row
April.
Abstract deadline is 25th March. Soon! Please do submit papers, we'd love to
see you in London.
For any questions, please contact petsc2...@mcs.anl.gov
Lawrence Mitchell (for the organising committee)
> On 23 Apr 2018, at 11:32, Loic Gouarin wrote:
>
> Hi,
>
> I try to use MATIS from petsc4py but I am not able to use setValuesLocal (I
> have a segmentation fault). And I don't understand why.
>
> Could you tell me if I made a mistake or if it's a bug?
This
Dear all,
a reminder that the PETSc users meeting is in London this summer.
The early registration deadline of 11th April is fast approaching, so
if you have not yet registered and wish to attend, now is a good time.
Also note that the abstract submission deadline has been extended
until 11th
Hi Max,
(I'm cc'ing in the petsc-users mailing list, which may have more advice; if you
are using PETSc you should definitely subscribe!)
> On 24 Oct 2018, at 09:27, Maximilian Hartig wrote:
>
> Hello Lawrence,
>
> sorry to message you out of the blue. My name is Max and I found your post on
> On 29 Oct 2018, at 10:56, Matthew Knepley wrote:
>
> You can certainly map the right edge to the top edge (it's topologically a
> square again), but that
> mapping is not smooth, and I do not know how you would make a global basis
> for the approximation space.
Yeah, I thought this was
On Fri, 7 Feb 2020 at 19:15, Fande Kong wrote:
> Thanks, Matt,
>
> It is a great paper. According to the paper, here is my understanding: for
> normal matrices, the eigenvalues of the matrix together with the
> initial residual completely determine the GMRES convergence rate. For
> non-normal
Hi Dave,
> On 2 Jun 2020, at 05:43, Dave May wrote:
>
>
>
> On Tue 2. Jun 2020 at 03:30, Matthew Knepley wrote:
> On Mon, Jun 1, 2020 at 7:03 PM Danyang Su wrote:
> Thanks Jed for the quick response. Yes I am asking about the repartitioning
> of coarse grids in geometric multigrid for
> On 2 Jun 2020, at 09:35, Matthew Knepley wrote:
>
> On Tue, Jun 2, 2020 at 4:25 AM Lawrence Mitchell wrote:
> Hi Dave,
>
> > On 2 Jun 2020, at 05:43, Dave May wrote:
> >
> >
> >
> > On Tue 2. Jun 2020 at 03:30, Matthew Knepley wrote:
> On 2 Jun 2020, at 09:54, Matthew Knepley wrote:
>
> I almost agree. I still think we do not change Distribute(), since it is
> really convenient, but we do check sizes on input as you say.
If we only want Distribute(), we have to change it a bit, because right now
there's only one
> On 21 Sep 2020, at 14:03, Matthew Knepley wrote:
>
> On Mon, Sep 21, 2020 at 8:51 AM Luciano Siqueira
> wrote:
> Hi *,
>
> I'm experimenting with different combinations of KSP solvers and PCs and
> I don't know why GMRES/bjacobi are the default choices for CPU and
>
> GMRES is chosen
> On 28 Oct 2020, at 16:35, Guyer, Jonathan E. Dr. (Fed) via petsc-users
> wrote:
>
> We use petsc4py as a solver suite in our
> [FiPy](https://www.ctcms.nist.gov/fipy) Python-based PDE solver package. Some
> time back, I refactored some of the code and provoked a deadlock situation in
>
> On 13 Jul 2020, at 10:46, Matthew Knepley wrote:
>
> For now it's the complexity of moving simulation data around and scripting for it.
> However, now I have at least two meshes in my problem, and
> I anticipate having several more. I believe this will be the long term trend.
>
> I would
> On 21 Jul 2020, at 10:49, Eda Oktay wrote:
>
> Hi all,
>
> I am using the following libraries and for some reason, I figured out
> that if I am disconnected from the internet, my program does not work:
[...]
> I thought that there is no reason for PETSc to go online for libraries
> but I
> On 21 Jul 2020, at 11:06, Eda Oktay wrote:
>
> Dear Lawrence,
>
> I am using MPICC, but not on a Mac; Fedora 25. If it will still work, I will try
> that.
>
> Thanks!
It might be the case. When you observe the error, does "nslookup localhost"
take a long time?
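A quick way to check the usual culprit (a minimal shell sketch, assuming a POSIX shell; the exact fix on the affected machine may differ): localhost should be mapped directly in /etc/hosts, so that resolving it never waits on a DNS server.

```shell
# Hypothetical sanity check: a healthy /etc/hosts maps localhost to
# 127.0.0.1, so "nslookup localhost" returns immediately instead of
# stalling on a DNS round-trip. The sample line below stands in for
# the real file contents.
hosts_line="127.0.0.1 localhost"
if printf '%s\n' "$hosts_line" | grep -qE '^127\.0\.0\.1[[:space:]]+localhost'; then
  echo "localhost maps locally"
fi
# On the real machine: time nslookup localhost
# and inspect /etc/hosts if it takes more than a moment.
```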
Lawrence
Dear PETSc-ites,
The Durham CS department is presently hiring. We have open positions at all
levels (Assistant, Associate, and Full Professor) across a broad range of
applied computer science, and we'd like to make at least one hire in Scientific
Computing (the group currently has interests in
> On 27 Jan 2021, at 16:30, Matthew Knepley wrote:
>
> This is very important to do _first_. It would probably only take you a day
> to measure the Allreduce time on your target, say the whole machine you run
> on.
Why plots like this are not _absolutely standard_ on all HPC sites'
> On 3 Feb 2021, at 08:48, Stefano Zampini wrote:
>
...
> Questions:
> - Is there any straightforward way to apply PCBDDC for DG which I am
> missing?
>
> I don't think so. I know Lawrence gave it some thought, but I never heard about
> a final solution about how to represent subdomain DG
> On 11 Jun 2021, at 13:19, Matthew Knepley wrote:
>
> Before doing what Lawrence suggests, I would like to talk this through. I
> don't think we should need communication. So,
>
> Goal: Section with dofs on each face that has support 2 (separates 2 cells)
>
> Okay, suppose we loop over
> On 18 Jun 2021, at 14:12, Victor Eijkhout wrote:
>
>
>
>> On 2021Jun17, at 23:48, Satish Balay via petsc-dev
>> wrote:
>>
>> https://petsc.org/release/
>
> The “Documentation” link is
>
> https://petsc.org/release/#
>
> Maybe a more informative URL?
> On 18 Jun 2021, at 14:37, Victor Eijkhout wrote:
>
>>
>> On 2021Jun18, at 08:33, Lawrence Mitchell wrote:
>>
>> https://petsc.org/release/documentation/
>
> How do I find that? Clicking “Documentation” on the main page does not go
> there.
> On 11 Jun 2021, at 11:29, Mark Adams wrote:
>
> This is a Matt question, but You can set a "Boundary" label
> (https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexMarkBoundaryFaces.html)
> and that will label boundaries.
> Next you want the dual of that ...
Note
> On 21 May 2021, at 17:53, Stefano Zampini wrote:
>
>> I see, anyway you do not need the check if the loop range is [rStart,rEnd). So
>> now I don’t understand why the loop must be [rStart,rStart+maxRows], Matt?
>>
>> It is terrible, but I could not see a way around it. We want to use
>>
> On 28 May 2021, at 14:38, Stefano Zampini wrote:
>
> Mark
>
> That line is obtained via
>
> git describe --match "v*"
>
> At configure time. The number after the g indicates the commit
> As Matt says, you can do git checkout to go back to the point
> where you configured PETSc
In fact,
> On 28 May 2021, at 14:15, Mark Adams wrote:
>
> v3.15.0-531-g1397235
^^^ This is the shortened commit hash, so
git checkout 1397235
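The tag/count/hash anatomy above can be sketched as follows (a minimal POSIX-shell illustration; the variable names are mine):

```shell
# "git describe --match 'v*'" output has the form TAG-N-gHASH:
#   v3.15.0  - the most recent matching tag
#   531      - number of commits since that tag
#   g1397235 - 'g' (for git) followed by the abbreviated commit hash
d="v3.15.0-531-g1397235"
hash="${d##*-g}"   # strip everything up to and including the final "-g"
echo "$hash"       # prints 1397235
# To return to that exact commit: git checkout "$hash"
```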
Lawrence
> On 28 May 2021, at 14:59, Mark Adams wrote:
>
> Thanks everyone.
>
> How would I get a version (a branch say) to be and stay visible?
>
> I am not seeing any of my versions used for this data but they were all in
> the repo at one point, in a branch. Does the branch need to be merged
> On 28 May 2021, at 16:51, Mark Adams wrote:
>
> It sounds like I should get one branch settled, use it, and keep that branch
> in the repo, and to be safe not touch it, and that should work for at least a
> few months. I just want it to work if the reviewer tests it :)
>
Just bake a
> On 19 May 2021, at 14:46, Karin wrote:
>
> Dear Matthew,
>
> You are (again) right. This Gmsh test file is "dirty": some triangles do not
> belong to tets. Sorry for that.
> I have tried with another geo file (which is clean in that sense) and PETSc
> reads with no error.
>
> I take
> On 19 May 2021, at 14:57, Karin wrote:
>
> Thank you very much Lawrence.
> Are the names lost or are they saved somewhere?
> If I do : "./ex2 -filename /tmp/test/Cube_with_facets.msh4 -dm_view
> vtk:/tmp/foo.vtk" , I only get the tets of the initial mesh.
I believe that the mapping of
> On 22 Apr 2021, at 19:46, Tang, Qi via petsc-users
> wrote:
>
> For our case, the Schur complement becomes a curl-curl problem by design.
> So we expect it is going to be scalable. We are still experimenting with
> boomeramg, which seems to be working, but would like to switch to AMS later if
> On 24 Mar 2021, at 01:30, Matthew Knepley wrote:
>
> This is true, but all the PETSc operations are speeding up by a factor 2x. It
> is hard to believe these were run on the same machine.
> For example, VecScale speeds up!?! So it is not network, or optimizations. I
> cannot explain
> On 9 Mar 2021, at 14:17, Jose E. Roman wrote:
>
> When I added this, I changed it to default to gmres+bjacobi when
> EPSSetPreconditionerMat() is called, but for some reason in your case it is
> not doing it. If it's Firedrake who is setting preonly+lu I would say it is
> not necessary
Dear Alexei,
I echo the comments that Barry and others have made.
Some more in line below.
> On 5 Mar 2021, at 21:06, Alexei Colin wrote:
>
> To PETSc DMPlex users, Firedrake users, Dr. Knepley and Dr. Karpeev:
>
> Is it expected for mesh distribution step to
> (A) take a share of 50-99% of
> On 1 Sep 2021, at 09:42, Наздрачёв Виктор wrote:
>
> I have a 3D elasticity problem with heterogeneous properties.
What does your coefficient variation look like? How large is the contrast?
> There is an unstructured grid with aspect ratios varying from 4 to 25. Zero
> Dirichlet BCs are
ng us to
> load up on a different number of
> processes. We plan to be done by October. Vaclav and I are doing this in
> collaboration with Koki Sagiyama,
> David Ham, and Lawrence Mitchell from the Firedrake team.
The core load/save cycle functionality is now in PETSc main. So if
Dear Sergio,
(Added petsc-users back to cc),
> On 20 Sep 2021, at 14:08, sergio.bengoec...@ovgu.de wrote:
>
> Dear Lawrence,
>
> thanks for the HDF5 saving and loading example.
>
> In the documentation you sent
> (https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5)
Hi Pierre,
> On 12 Oct 2021, at 14:58, Pierre Seize wrote:
>
>
> #include <petsc.h>
>
> int main(int argc, char **argv){
>   PetscErrorCode ierr = 0;
>
>   ierr = PetscInitialize(&argc, &argv, NULL, ""); if (ierr) return ierr;
>   PetscReal *foo;
>   malloc(sizeof(PetscReal));
>   ierr
Hi Daniel,
> On 21 Sep 2021, at 11:19, Daniel Stone wrote:
>
> Hello,
>
> If we look at lines 2330-2331 in file baij2.c, it looks like there are some
> mistakes in assigning the `sum..` variables to the z array, causing
> the function MatMultAdd_SeqBAIJ_11() to not produce the correct
>
This is failing setting the chunksize:
https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/da/gr2.c#L517
It is hard for me to follow this code, but it looks like the chunk is just
set to the total extent of the DA (on the process?). This can grow too
large for HDF5, which has limits described
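The limit in question can be sketched with some back-of-the-envelope arithmetic (a hypothetical shell illustration; the grid extent and item size below are illustrative, not from the thread):

```shell
# HDF5 rejects chunks of 4 GiB or more, so a chunk equal to the whole
# DA extent overflows once the grid gets large enough.
limit=$((4 * 1024 * 1024 * 1024))   # 4 GiB per-chunk ceiling
nx=2048; ny=2048; nz=2048           # hypothetical global DA extent
itemsize=8                          # bytes per PetscScalar (double)
chunk_bytes=$((nx * ny * nz * itemsize))
if [ "$chunk_bytes" -ge "$limit" ]; then
  echo "chunk too large: $chunk_bytes bytes"
fi
```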
Hi all,
(I cc Jack who is doing the implementation in the petsc4py setting)
> On 24 Oct 2021, at 06:51, Stefano Zampini wrote:
>
> Non-deterministic garbage collection is an issue in Python too, and
> firedrake folks are also working on that.
>
> We may consider deferring all calls to
> On 7 Dec 2021, at 13:26, Quentin Chevalier
> wrote:
>
> Ok my bad, that log corresponded to a tentative --download-hdf5. This
> log corresponds to the commands given above and has --with-hdf5 in its
> options.
OK, so PETSc is configured with HDF5. I assume you have now built it (with make
Comments inline below:
> On 7 Dec 2021, at 14:43, Quentin Chevalier
> wrote:
>
> @Matthew, as stated before, the error output is unchanged, i.e. the python
> command below produces the same traceback:
>
> # python3 -c "from petsc4py import PETSc; PETSc.Viewer().createHDF5('d.h5')"
> Traceback
> On 10 Feb 2022, at 15:02, Stefano Zampini wrote:
>
>
>
>> On Feb 10, 2022, at 6:00 PM, Matthew Knepley wrote:
>>
>> On Thu, Feb 10, 2022 at 9:17 AM Medane TCHAKOROM
>> wrote:
>> Hello ,
>>
>> Sorry if this question does not belong to this mailing list; I'm using
>> PETSc, but
> On 14 Jan 2022, at 14:12, Matthew Knepley wrote:
>
> On Fri, Jan 14, 2022 at 9:07 AM Thibault Bridel-Bertomeu
> wrote:
> Also, if we still consider my example with Solid and Fluid, let's image we
> call DMPlexFilter twice. We then get two new DMs with Solid in one and Fluid
> in the
On Tue, 12 Apr 2022 at 12:18, Matthew Knepley wrote:
>
> On Tue, Apr 12, 2022 at 7:02 AM Berend van Wachem
> wrote:
>>
>> Dear Matt,
>>
>> In our code, the size of the overlap is determined in runtime, based on
>> some calculations. Therefore, we cannot specify it using the
>>
On Wed, 20 Sept 2023 at 12:17, Ce Qin wrote:
>
> Dear all,
>
> I am currently implementing a multigrid solver for Maxwell's equations in 3D.
> The AFW smoother has excellent convergence properties for Maxwell's
> equations. I
> noticed that PCPATCH provides such types of smoothers. However, I am
On Fri, 29 Jul 2022 at 18:36, Tang, Qi wrote:
> Thanks, Matt. This one is very helpful
>
There's also the Deuflhard book on the affine covariant linesearch
(snes_linesearch_type nleqerr)
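Selecting that linesearch from the command line might look like this (a hypothetical invocation; `./my_app` stands in for your own PETSc executable):

```shell
# Affine-covariant (NLEQ-ERR) linesearch for a Newton SNES solve,
# with monitoring turned on to watch convergence.
./my_app -snes_type newtonls -snes_linesearch_type nleqerr -snes_monitor
```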
Lawrence
>
On Mon, 29 Aug 2022 at 14:55, Matthew Knepley wrote:
>
> On Sun, Aug 28, 2022 at 7:17 PM Matthew Knepley wrote:
>>
>> On Sun, Aug 28, 2022 at 5:36 PM Mike Michell wrote:
>>>
>>> Thank you for the reply.
>>>
>>> I think it can be more helpful for me if the attached sample code
>>>
>
> I am not sure Injection is going to "do the right thing" when you move
> between spaces. At least my memory of the implementation
> was that it just did local interpolation, which is not right when injecting
> P0 into P1.
Ah, that could be the case. For discontinuous things I think what you
On Thu, 16 Feb 2023 at 16:43, Matthew Knepley wrote:
>
> On Thu, Feb 16, 2023 at 10:54 AM Lawrence Mitchell wrote:
>>
>> Hi Blaise,
>>
>> On Thu, 16 Feb 2023 at 15:17, Blaise Bourdin wrote:
>> >
>> > Hi,
>> >
>> > I
Hi Blaise,
On Thu, 16 Feb 2023 at 15:17, Blaise Bourdin wrote:
>
> Hi,
>
> I am trying to implement a non-local finite element reconstruction operator
> in parallel.
>
> Given a dmplex distributed with an overlap, is there a way to figure out
> which cells are in the overlap and which are
> On 2 Nov 2018, at 14:58, 陳宗興 via petsc-users wrote:
>
> Hi,
>
> I have created a DMPlex using DMPlexCreateFromFile,
> and I use PetscObjectSetName to create the name of DM.
> I have encountered some problems when I try to use PetscViewerVTKGetDM.
> This is what I write:
>
> DM dm;
>
> On 6 Nov 2018, at 14:37, Maximilian Hartig via petsc-users
> wrote:
>
> lldb returns the following:
>
> (lldb) process attach --pid 1082
> Process 1082 stopped
> * thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
> frame #0: 0x7fff5aa5c876
> On 22 Jan 2019, at 22:30, Justin Chang via petsc-users
> wrote:
>
> As of right now, the dict() is something the user constructs inside
> mycode.py. It would contain things like the viscosity, boundary conditions,
> function space, etc which are needed to construct the PC operators in
Dear petsc-users,
I have a matrix with a block size > 1. I would sometimes like to insert into
it, applying Dirichlet conditions to one component of the block. Now, for
normal block (or when the block size is 1) Dirichlet conditions, I do this by
swapping out the local to global map for one
Hi Barry,
> On 9 Dec 2018, at 19:25, Smith, Barry F. wrote:
>
>
> Lawrence,
>
> I understand what you want and it is a reasonable request. The problem
> is that currently ISLocalToGlobalMappingCreate() when used with block vectors
> and matrices is always based on blocks, that is,
> On 13 Mar 2019, at 14:04, Matthew Knepley via petsc-users
> wrote:
>
> On Wed, Mar 13, 2019 at 9:44 AM Manuel Colera Rico via petsc-users
> wrote:
> Yes:
>
> [ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
> /opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
> [
Dear petsc-users,
I have two (different) distributions of the (topologically) same DMPlex object
(DM_a and DM_b).
I would like to identify the map from points(DM_a) to points(DM_b) such that I
can transfer fields between the two.
Does such a facility exist?
Lawrence
Hi Sanjay,
> On 30 May 2019, at 08:58, Sanjay Govindjee via petsc-users
> wrote:
>
> The problem seems to persist but with a different signature. Graphs attached
> as before.
>
> Totals with MPICH (NB: single run)
>
> For the CG/Jacobi data_exchange_total = 41,385,984;
On Wed, 19 Jun 2019 at 08:37, Matthew Knepley via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> On Mon, Jun 17, 2019 at 9:49 PM Swarnava Ghosh
> wrote:
>
>> Hi Matthew,
>>
>> I am primarily trying to interpolate fields which are defined on
>> vertices. For a process, the point at which the
On Thu, 1 Aug 2019 at 16:59, Daniel Mckinnell via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hi,
>
> I have been having some trouble trying to refine a DMPlex object using a
> Refinement Function. I have been working with reference to the code
> discussed here:
>
> On 4 Oct 2019, at 10:46, Matthew Knepley via petsc-users
> wrote:
>
> On Thu, Oct 3, 2019 at 6:34 PM Salazar De Troya, Miguel via petsc-users
> wrote:
> I am trying to solve the Stokes equation with the Brinkman term to simulate a
> solid material. My intention is to implement the