On Feb 6, 2011, at 5:00 PM, Peter Wang wrote:

> Hello, I have some concerns about the multigrid framework in PETSc.
>  
> We are trying to solve a two-dimensional problem with a large variety of 
> length scales.  The length of the computational domain is on the order of 
> 1e3 m and the width is on the order of 1 m; nevertheless, there is a tiny 
> object of size 1e-3 m in a corner of the domain.
>  
> As a first attempt, we tried to solve the problem with a large number of 
> uniform or non-uniform grid cells.  However, the error of the numerical 
> solution increases when the number of grid points becomes too large.  To 
> test the effect of grid size on the solution, we solved a domain with a 
> regular scale of 1 m by 1 m.  We found that an extremely small grid size can 
> produce large deviations from the exact solution.  For example, the exact 
> solution is a linear distribution over the domain.  The numerical solution 
> is linear, matching the exact solution, when the grid is nx=1000 by 
> ny=1000.  However, with nx=10000 by ny=10000 the numerical solution becomes 
> a nonlinear distribution that agrees with the exact solution only on the 
> boundary.

  Stop right here. 99.9% of the time what you describe should not happen: with 
a finer grid your solution (for a problem with a known exact solution, for 
example) will be more accurate, and it won't suddenly get less accurate with a 
finer mesh.

   Are you running with -ksp_monitor_true_residual and -ksp_converged_reason 
to make sure the solve is actually converging? And are you using a smaller 
-ksp_rtol <tol> as you increase the number of grid points? For example, with 
10,000 grid points in each direction and no better idea of what the 
discretization error is, I would use a tolerance of 1.e-12.
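For concreteness, the diagnostics Barry mentions are PETSc runtime options, 
read from the options database when the application calls PetscInitialize(). 
A minimal sketch of an invocation (`./myapp` is a stand-in for your actual 
executable, not a name from the original thread):

```
./myapp -ksp_monitor_true_residual -ksp_converged_reason -ksp_rtol 1e-12
```

Here -ksp_monitor_true_residual prints the true residual norm ||b - A x|| at 
each iteration, -ksp_converged_reason reports why the solve stopped (and on 
which criterion), and -ksp_rtol tightens the relative residual decrease 
required for convergence. The same tolerance can also be set in code with 
KSPSetTolerances().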

  Barry

  We'll deal with the multigrid questions after we've resolved the more basic 
issues.


> The solver I used is a KSP solver in PETSc, which is set up by calling 
> KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr).  Is this solver 
> unsuitable for a system with a very small grid size? Or is a problem 
> spanning 6 orders of magnitude in length scale solvable with only a 
> single-level grid, provided there is enough memory for the large matrix? 
> Since a single-level grid requires less coding work, it would be easier to 
> implement the solver that way.
>  
> I did some research on the website and found the slides by Barry at
>  
> http://www.mcs.anl.gov/petsc/petsc-2/documentation/tutorials/Columbia04/DDandMultigrid.pdf
> It seems that the multigrid framework in PETSc is a possible approach to 
> our problem, and we are thinking of turning to it.  However, before we dig 
> into it, there are some issues confusing us.  It would be great if we could 
> get any suggestions from you:
> 1  Can the multigrid framework handle a problem with a large variety of 
> length scales (up to 6 orders of magnitude)? Is DMMG the best tool for our 
> problem?
>  
> 2  The coefficient matrix A and the right-hand-side vector b were created 
> for the finite difference scheme of the domain and solved by the KSP solver 
> (calling KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)).  Is it 
> easy to migrate the existing matrix A and vector b to the multigrid 
> framework?
>  
> 3  How many levels of grid are needed to obtain a solution close enough to 
> the exact solution for a problem with 6 orders of magnitude in length scale?
>  
