Thanks a lot, Barry and Jed.
 
     Your explanation is very clear and informative.  Your suggestions help me 
move forward toward my goal smoothly. I will try it. 
 
 
     
 
> From: bsmith at mcs.anl.gov
> Date: Wed, 9 Feb 2011 11:36:47 -0600
> To: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] questions about the multigrid framework
> 
> 
> On Feb 9, 2011, at 9:58 AM, Peter Wang wrote:
> 
> > Thanks Barry,
> > 
> > I ran the code with -ksp_monitor_true_residual -ksp_converged_reason, and 
> > it turns out that the computation had not really converged. After I set a 
> > tighter rtol and allowed more iterations, the numerical solution got 
> > better. However, the computation converges very slowly with finer grid 
> > points. For example, with nx=2500 and ny=10000 (lx=2.5e-4, ly=1e-3, and the 
> > distribution varies mainly in the y direction):
> > at IT=72009,  true resid norm 1.638857052871e-01  ||Ae||/||Ax|| 9.159199925235e-07
> > at IT=400000, true resid norm 1.638852449299e-01  ||Ae||/||Ax|| 9.159174196917e-07
> > and it still has not converged.
> > 
> > I am wondering whether the convergence could become faster if the solver 
> > were changed, or whether I should take another approach to using finer 
> > grids, such as multigrid? Thanks for your help.
> 
> You have a little confusion here. Multigrid (in the context of PETSc and 
> numerical solvers) is ONLY an efficient way to solve a set of linear 
> equations arising from discretizing a PDE. It is not a different way of 
> discretizing the PDEs, and it does not give a different or better solution. 
> It is only a way of getting the same solution (potentially much) faster than 
> running a more slowly converging solver.
> 
> Definitely configure PETSc with --download-ml --download-hypre and make runs 
> using -pc_type hypre and then -pc_type ml to see how algebraic multigrid 
> works; it should work fine for your problem.
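> 
> For example, the runs might look like the following (a sketch only; "./ex" 
> is a placeholder for your own executable and any other configure options you 
> normally use still apply):
> 
>   ./configure --download-hypre --download-ml
>   ./ex -pc_type hypre -ksp_monitor_true_residual -ksp_converged_reason
>   ./ex -pc_type ml    -ksp_monitor_true_residual -ksp_converged_reason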
> 
> Barry
> 
> 
> > 
> > 
> > > From: bsmith at mcs.anl.gov
> > > Date: Sun, 6 Feb 2011 21:30:56 -0600
> > > To: petsc-users at mcs.anl.gov
> > > Subject: Re: [petsc-users] questions about the multigrid framework
> > > 
> > > 
> > > On Feb 6, 2011, at 5:00 PM, Peter Wang wrote:
> > > 
> > > > Hello, I have some concerns about the multigrid framework in PETSc.
> > > > 
> > > > We are trying to solve a two-dimensional problem with a large variety 
> > > > of length scales. The length of the computational domain is on the order 
> > > > of 1e3 m and the width is 1 m; nevertheless, there is a tiny object of 
> > > > 1e-3 m in a corner of the domain.
> > > > 
> > > > As a first attempt, we tried to solve the problem with a large number 
> > > > of uniform or non-uniform grid points. However, the error of the 
> > > > numerical solution increases when the number of grid points becomes too 
> > > > large. To test the effect of the grid size on the solution, we solved a 
> > > > domain with a regular scale of 1 m by 1 m. We found that an extremely 
> > > > small grid size can lead to a large deviation from the exact solution. 
> > > > For example, the exact solution is a linear distribution in the domain. 
> > > > The numerical solution is linear, matching the exact solution, when the 
> > > > grid is nx=1000 by ny=1000. However, when the grid is nx=10000 by 
> > > > ny=10000, the numerical solution becomes a nonlinear distribution that 
> > > > agrees with the exact solution only on the boundary.
> > > 
> > > Stop right here. 99.9% of the time what you describe should not happen: 
> > > with a finer grid your solution (for a problem with a known solution, for 
> > > example) will be more accurate and won't suddenly get less accurate with 
> > > a finer mesh.
> > > 
> > > Are you running with -ksp_monitor_true_residual -ksp_converged_reason to 
> > > make sure that it is converging, and using a smaller -ksp_rtol <tol> for 
> > > more grid points? For example, with 10,000 grid points in each direction 
> > > and no better idea of what the discretization error is, I would use a tol 
> > > of 1.e-12.
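> > > 
> > > For example, a run for the 10,000 by 10,000 case might look like this 
> > > (a sketch only; "./ex" is a placeholder for your executable):
> > > 
> > >   ./ex -ksp_monitor_true_residual -ksp_converged_reason -ksp_rtol 1.e-12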
> > > 
> > > Barry
> > > 
> > > We'll deal with the multigrid questions after we've resolved the more 
> > > basic issues.
> > > 
> > > 
> > > > The solver I used is a KSP solver in PETSc, which is set up by calling 
> > > > KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr). Is this solver 
> > > > unsuitable for a system with a very small grid size? Or, is a problem 
> > > > spanning 6 orders of magnitude in length scale solvable with only a 
> > > > single-level grid, provided there is enough memory for the large matrix? 
> > > > Since there is less coding work for a single-level grid, it would be 
> > > > easy to implement the solver.
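> > > > 
> > > > For reference, a minimal C sketch of this one-level KSP setup might look 
> > > > like the following (an illustration only: it assumes the Mat A and Vec b 
> > > > are assembled elsewhere, uses the older KSPSetOperators() calling 
> > > > sequence with the MatStructure flag, and SolveOneLevel is just a made-up 
> > > > helper name):
> > > > 
> > > > #include <petscksp.h>
> > > > 
> > > > /* Solve A x = b once with whatever KSP/PC is chosen at run time. */
> > > > PetscErrorCode SolveOneLevel(Mat A, Vec b, Vec x)
> > > > {
> > > >   KSP            ksp;
> > > >   PetscErrorCode ierr;
> > > > 
> > > >   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
> > > >   ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
> > > >   /* Picks up -pc_type hypre / -pc_type ml, -ksp_rtol,
> > > >      -ksp_monitor_true_residual, -ksp_converged_reason, etc. */
> > > >   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
> > > >   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
> > > >   ierr = KSPDestroy(&ksp);CHKERRQ(ierr); /* KSPDestroy(ksp) in older releases */
> > > >   return 0;
> > > > }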
> > > > 
> > > > I did some research on the website and found the slides by Barry at
> > > > http://www.mcs.anl.gov/petsc/petsc-2/documentation/tutorials/Columbia04/DDandMultigrid.pdf
> > > > It seems that the multigrid framework in PETSc is a possible approach 
> > > > to our problem, and we are thinking of turning to it to solve the 
> > > > problem. However, before we dig into it, there are some issues that 
> > > > confuse us. It would be great if we could get any suggestions from you:
> > > > 1. Can the multigrid framework handle a problem with a large variety of 
> > > > length scales (up to 6 orders of magnitude)? Is DMMG the best tool for 
> > > > our problem?
> > > > 
> > > > 2. The coefficient matrix A and the right-hand-side vector b were 
> > > > created for the finite difference scheme of the domain and solved with 
> > > > the KSP solver 
> > > > (call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)). Is it 
> > > > easy to migrate the created matrix A and vector b to the multigrid 
> > > > framework?
> > > > 
> > > > 3. How many levels of subgrids are needed to obtain a solution close 
> > > > enough to the exact solution for a problem spanning 6 orders of 
> > > > magnitude in length scale?
> > > > 
> > > 
> 
                                          