Re: [petsc-users] cholesky vs multigrid

2018-12-03 Thread Jed Brown via petsc-users
Fazlul Huq writes: > Thanks. > I have also tried a large problem (10^7 x 10^7); even then, Cholesky was faster than multigrid. > But the problem is 1D; maybe that's the reason. Exactly: Cholesky is O(n) with a very small constant in 1D, O(n^{3/2}) in 2D, and O(n^2) in 3D.
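For reference, these exponents are the classical nested-dissection estimates for sparse Cholesky applied to a regular d-dimensional Poisson discretization with n unknowns. Only the flop exponents appear in the thread itself; the fill column below is the standard companion result and is added here for context.

    % Nested-dissection cost of sparse Cholesky on a regular Poisson grid
    % with n unknowns (standard estimates; only the flop exponents appear
    % in the thread).
    \begin{tabular}{lll}
      dimension & factorization flops & fill (nonzeros in $L$) \\
      1D        & $O(n)$              & $O(n)$                 \\
      2D        & $O(n^{3/2})$        & $O(n\log n)$           \\
      3D        & $O(n^{2})$          & $O(n^{4/3})$
    \end{tabular}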

Re: [petsc-users] cholesky vs multigrid

2018-12-03 Thread Jed Brown via petsc-users
Fazlul Huq via petsc-users writes: > Hello PETSc developers, > I am trying to solve a Poisson equation using "-pc_type cholesky" and "-pc_type hypre -pc_hypre_type boomeramg", and I find that the Cholesky factorization takes less time than the multigrid method. > Is this expected? For smal
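For anyone reproducing the comparison, the two configurations quoted above can be timed on any Poisson example with PETSc's -log_view option. The driver name ex2 below is only an assumption (ksp/examples/tutorials/ex2 assembles a 2D problem); any Poisson example works the same way.

    ./ex2 -pc_type cholesky -log_view
    ./ex2 -pc_type hypre -pc_hypre_type boomeramg -log_view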

Re: [petsc-users] PETSc binary write format

2018-12-03 Thread Smith, Barry F. via petsc-users
> On Dec 3, 2018, at 1:57 PM, Sajid Ali wrote: > > Apologies for the error on my part. > Does the vector binary format also work the same way: VEC_FILE_CLASSID (32-bit int), num_elements (32-bit int), then the element values as doubles (num_elements * 64-bit doubles)? Yes.
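For completeness, a minimal reader for that layout is sketched below. It is not from the thread and assumes a default PETSc build (32-bit PetscInt, real 8-byte PetscScalar) and PETSc's big-endian on-disk byte order; VEC_FILE_CLASSID is the value defined in the PETSc headers (1211214).

    /* Sketch: read a PETSc Vec binary file (header: 32-bit class id,
     * 32-bit element count, then the values as 64-bit IEEE 754 doubles,
     * all big-endian).  Assumes a default PETSc build as noted above. */
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    static uint32_t be32(const unsigned char *b) {
      return ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16) |
             ((uint32_t)b[2] << 8)  |  (uint32_t)b[3];
    }

    static double be64_double(const unsigned char *b) {
      uint64_t u = 0;
      double   d;
      for (int i = 0; i < 8; i++) u = (u << 8) | b[i];
      memcpy(&d, &u, sizeof d);   /* reinterpret the 8 bytes as a double */
      return d;
    }

    int main(int argc, char **argv) {
      unsigned char hdr[8], buf[8];
      FILE *fp = fopen(argc > 1 ? argv[1] : "vector.dat", "rb");
      if (!fp || fread(hdr, 1, 8, fp) != 8) return 1;
      uint32_t classid = be32(hdr), n = be32(hdr + 4);
      printf("class id %u (VEC_FILE_CLASSID is 1211214), %u entries\n", classid, n);
      for (uint32_t i = 0; i < n && fread(buf, 1, 8, fp) == 8; i++)
        printf("v[%u] = %g\n", i, be64_double(buf));
      fclose(fp);
      return 0;
    }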

Re: [petsc-users] Fortran: undefined reference to petscsfdistributesection

2018-12-03 Thread Matthew Knepley via petsc-users
On Mon, Dec 3, 2018 at 3:40 PM Danyang Su wrote: > On 2018-12-03 12:03 p.m., Matthew Knepley wrote: > On Mon, Dec 3, 2018 at 2:27 PM Danyang Su wrote: >> Hi Matt, thanks. >> BTW: DMPlexGetVertexNumbering now works using the latest develop version. But the index is not in nat

Re: [petsc-users] PETSc binary write format

2018-12-03 Thread Sajid Ali via petsc-users
Apologies for the error on my part. Does the vector binary format also work the same way: VEC_FILE_CLASSID (32-bit int), num_elements (32-bit int), then the element values as doubles (num_elements * 64-bit doubles)? Are the doubles stored in IEEE 754 format? On Mon, Dec 3, 2018 at 1:43 PM Smith,

Re: [petsc-users] PETSc binary write format

2018-12-03 Thread Smith, Barry F. via petsc-users
You saved a Vec to the file, not a Mat. > On Dec 3, 2018, at 1:38 PM, Sajid Ali via petsc-users wrote: > Hi, > I ran ex10 from /vec/examples/tutorials and saved the matrix in binary format. > Looking at the matrix in binary using xxd, I see > [sajid@xrm temp]$ xxd -b vector.

[petsc-users] PETSc binary write format

2018-12-03 Thread Sajid Ali via petsc-users
Hi, I ran ex10 from /vec/examples/tutorials and saved the matrix in binary format. Looking at the matrix in binary using xxd, I see [sajid@xrm temp]$ xxd -b vector.dat 000: 00010010 0011 01001110 ..{N.. 006: 00010100 000
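The leading bytes visible in the dump's ASCII column ("..{N") already explain the reply above. Decoding the first 32-bit header field against the class-id values defined in the PETSc headers (the decoding is added here, it is not part of the thread) gives

    0x00127B4E = 1211214 = VEC_FILE_CLASSID   (MAT_FILE_CLASSID is 1211216)

so the file holds a Vec, not a Mat.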

Re: [petsc-users] A question regarding a potential use case for DMNetwork

2018-12-03 Thread Markus Lohmayer via petsc-users
Thank you very much, Hong and Matt, for your answers! I have looked at the examples and am still in the middle of figuring out how the concept of a network in my application differs from the concept in the examples. This will surely take me a while, especially since I am a beginner.

[petsc-users] How to solve multiple linear systems in parallel, one on each process?

2018-12-03 Thread Klaus Burkart via petsc-users
Hello, I want to solve a CFD case; after decomposition, each process is allocated a sub-matrix. The example below shows how the data is allocated to the processes (the sample data includes only the lower parts of the matrices). Row and column indices are local. What PETSc program setup
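A common way to set this up, sketched below under the assumption that each rank already holds its own local system (the size and the assembled values are placeholders), is to create the matrix, vectors, and KSP on PETSC_COMM_SELF so that every process solves its system independently of the others.

    /* Sketch: one independent linear solve per MPI rank.  All objects are
     * created on PETSC_COMM_SELF, so nothing is shared between processes.
     * nlocal and the assembled values stand in for whatever the
     * decomposition produced on this rank. */
    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat            A;
      Vec            b, x;
      KSP            ksp;
      PetscInt       i, nlocal = 100;     /* rows owned by this rank */
      PetscScalar    d = 2.0;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

      /* Sequential matrix: this rank's sub-matrix only */
      ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, nlocal, nlocal, 5, NULL, &A);CHKERRQ(ierr);
      for (i = 0; i < nlocal; i++) {      /* placeholder for the real local matrix */
        ierr = MatSetValues(A, 1, &i, 1, &i, &d, INSERT_VALUES);CHKERRQ(ierr);
      }
      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

      ierr = VecCreateSeq(PETSC_COMM_SELF, nlocal, &b);CHKERRQ(ierr);
      ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
      ierr = VecSet(b, 1.0);CHKERRQ(ierr);          /* placeholder right-hand side */

      ierr = KSPCreate(PETSC_COMM_SELF, &ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* e.g. -ksp_type cg -pc_type icc */
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      ierr = MatDestroy(&A);CHKERRQ(ierr);
      ierr = VecDestroy(&b);CHKERRQ(ierr);
      ierr = VecDestroy(&x);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

With this layout each rank's solver can even be configured separately at runtime via KSPSetOptionsPrefix, though a single shared set of options is the simplest starting point.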

Re: [petsc-users] PETSC address vector c++ access

2018-12-03 Thread RAELI ALICE via petsc-users
Thank you so much, Matt and Dave; we will test soon. I think they are what we were searching for. Alice On Fri, 30 Nov 2018 19:40:34 + Dave May wrote: On Fri, 30 Nov 2018 at 14:50, RAELI ALICE via petsc-users <petsc-users@mcs.anl.gov> wrote: Hi All, My team is working on a PETSC vers
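Since the archive preview cuts off before the actual suggestion, it is only an assumption that the answer involved the VecGetArray family, but those are the standard PETSc routines for touching a Vec's underlying storage from C or C++; a sketch follows, and the same calls compile unchanged as C++.

    /* Sketch: direct access to the entries of a PETSc Vec.  Error handling
     * uses the usual ierr/CHKERRQ idiom. */
    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec                x;
      PetscInt           i, n = 8;
      PetscScalar       *a;        /* writable view  */
      const PetscScalar *ra;       /* read-only view */
      PetscErrorCode     ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      ierr = VecCreateSeq(PETSC_COMM_SELF, n, &x);CHKERRQ(ierr);

      ierr = VecGetArray(x, &a);CHKERRQ(ierr);      /* borrow the raw storage */
      for (i = 0; i < n; i++) a[i] = (PetscScalar)i;
      ierr = VecRestoreArray(x, &a);CHKERRQ(ierr);  /* must restore when done */

      ierr = VecGetArrayRead(x, &ra);CHKERRQ(ierr); /* read-only access */
      ierr = PetscPrintf(PETSC_COMM_SELF, "x[3] = %g\n", (double)PetscRealPart(ra[3]));CHKERRQ(ierr);
      ierr = VecRestoreArrayRead(x, &ra);CHKERRQ(ierr);

      ierr = VecDestroy(&x);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }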