The only failure specifically related to the number of unknowns that is not due to a bug in either your code or our code is if

1) the total problem size is greater than 2^31 - 1 (which does not sound like your case), or
2) the number of nonzeros in the matrix on a single MPI process is more than 2^31 - 1 (which could happen if each process has a huge amount of memory).

These limits exist because, by default, PETSc uses 32-bit integers to hold indices and sizes. If you configure PETSc with --with-64-bit-indices, then PETSc uses 64-bit integers for indices and sizes and you can solve a problem of any size.
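For example, a rough sketch of how you could check whether you are near these limits (this is not from the original thread; it assumes an already-assembled Mat, here called A, and the names are illustrative only):

  #include <petscmat.h>

  /* Rough check of the 32-bit index limits described above. */
  PetscErrorCode CheckIndexLimits(Mat A)
  {
    PetscErrorCode ierr;
    MatInfo        info;

    PetscFunctionBeginUser;
    /* Default builds use a 32-bit PetscInt; --with-64-bit-indices gives 64-bit. */
    ierr = PetscPrintf(PETSC_COMM_WORLD,"PetscInt is %d bytes\n",(int)sizeof(PetscInt));CHKERRQ(ierr);

    /* Nonzeros stored on this MPI process (returned as a PetscLogDouble). */
    ierr = MatGetInfo(A,MAT_LOCAL,&info);CHKERRQ(ierr);
    if (info.nz_used > 2147483647.0) { /* 2^31 - 1 */
      ierr = PetscPrintf(PETSC_COMM_SELF,"Local nonzero count exceeds 2^31 - 1; reconfigure with --with-64-bit-indices\n");CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }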
Barry

On Jan 9, 2013, at 12:45 PM, Fande Kong <fd.kong at siat.ac.cn> wrote:

> Hi all,
>
> I want to try to solve a problem with half a billion unknowns using the
> preconditioner pcmg (of course, I have successfully provided the
> interpolation matrix and the coarse matrix). When the number of unknowns
> is at the 1e7 level, the solver works very well with 1020 cores on the
> supercomputer. But when the number of unknowns increases to the 1e8 level,
> the preconditioner setup stage breaks down. The following is the run script
> I use to set the solver and the preconditioner:
>
> -pc_type mg -ksp_type fgmres -pc_mg_levels 2 -pc_mg_cycle_type v
> -pc_mg_type multiplicative -mg_levels_1_ksp_type richardson
> -mg_levels_1_ksp_max_it 1 -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type
> preonly -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 1
> -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type gmres
> -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 2 -mg_coarse_pc_type asm
> -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu
> -mg_coarse_sub_pc_factor_levels 1 -mg_coarse_sub_pc_factor_mat_ordering_type
> rcm -ksp_view
>
> My question is whether a linear system with half a billion unknowns is too
> big to solve, or whether there are some bugs in the preconditioner pcmg.
>
> --
> Fande Kong
> ShenZhen Institutes of Advanced Technology
> Chinese Academy of Sciences
