On Mon, Jan 17, 2011 at 10:49 PM, <li.luo at siat.ac.cn> wrote:

> Hi, Barry.
>
> I have tested several examples, all with the same problem.
> The attachments are the results for ex2 in ksp/examples/tutorials,
> 4GPU-4CPU, with different grid sizes: 128*128 and 256*256, with and
> without CUDA. You can see the KSP view, the summary log, and the
> database options line. The result is:
> for 128*128, it converges both with and without CUDA;
> for 256*256, it converges only without CUDA.
>
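For anyone trying to reproduce the runs above, here is a minimal sketch
of how they might be launched. The GPU type options (-vec_type cuda,
-mat_type aijcuda) follow the petsc-dev naming of this era and, like the
mpiexec launcher, are assumptions rather than options quoted in this
thread:

    # ksp ex2 on a 256*256 grid, 4 MPI processes, matrix and vectors on
    # the GPU; print the solver configuration and the performance log.
    mpiexec -n 4 ./ex2 -m 256 -n 256 -vec_type cuda -mat_type aijcuda \
        -ksp_view -log_summary

Omitting the two -*_type options gives the corresponding CPU run for
comparison.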
This has the EXACT PROBLEM I mailed about last time. Look at your file
ksp_poisson2d_256_4_with_cuda_view.txt. It is only 6K; all the others
are 14K. It has no information from ksp_view.

   Matt

> Thanks for your help
>
> Regards,
> Li Luo
>
> > -----Original Message-----
> > From: "Barry Smith" <bsmith at mcs.anl.gov>
> > Sent: Tuesday, January 18, 2011
> > To: petsc-maint <petsc-maint at mcs.anl.gov>, li.luo at siat.ac.cn,
> >     "For users of the development version of PETSc" <petsc-dev at mcs.anl.gov>
> > Cc:
> > Subject: Re: [petsc-maint #61515] data size of PETSc-dev GPU computing
> >
> > Hmm, could be a bug, could be the algorithm. Run with -ksp_view and
> > send the output.
> >
> > What problem are you solving? Is it a PETSc example?
> >
> > Barry
> >
> > On Jan 17, 2011, at 8:31 PM, li.luo at siat.ac.cn wrote:
> >
> > > Hi,
> > >
> > > I met a problem when testing some examples on PETSc-dev for GPU
> > > computing: if one proc pair (1GPU-1CPU) is used, the grid size can
> > > be enlarged even to 2048*2048, the memory limit; however, if more
> > > than one proc pair is used, for example 4GPU-4CPU, the grid size is
> > > limited to about 200*200; any larger and the KSP solver does not
> > > converge. The same problem happens with 8GPU-8CPU, limited to about
> > > 500*500.
> > >
> > > I wonder whether you see the same problem? Does any error happen
> > > with type MPICUDA?
> > >
> > > Regards,
> > > Li Luo
> > >
> > >>> # Machine type:
> > >>> CPU: Intel(R) Xeon(R) CPU E5520
> > >>> GPU: Tesla T10
> > >>> CUDA Driver Version: 3.20
> > >>> CUDA Capability Major/Minor version number: 1.3
> > >>>
> > >>> # OS Version:
> > >>> Linux console 2.6.18-128.el5 #1 SMP Wed Dec 17 11:41:38 EST 2008
> > >>> x86_64 x86_64 x86_64 GNU/Linux
> > >>>
> > >>> # PETSc Version:
> > >>> #define PETSC_VERSION_RELEASE 0
> > >>> #define PETSC_VERSION_MAJOR 3
> > >>> #define PETSC_VERSION_MINOR 1
> > >>> #define PETSC_VERSION_SUBMINOR 0
> > >>> #define PETSC_VERSION_PATCH 6
> > >>> #define PETSC_VERSION_DATE "Mar, 25, 2010"
> > >>> #define PETSC_VERSION_PATCH_DATE "Thu Dec 9 00:02:47 CST 2010"
> > >>>
> > >>> # MPI implementation:
> > >>> ictce3.2/impi/3.2.0.011/
> > >>>
> > >>> # Compiler:
> > >>>
> > >>> # Probable PETSc component:
> > >>> run with GPU
> > >>>
> > >>> # Configure:
> > >>> ./config/configure.py --download-f-blas-lapack=1
> > >>> --with-mpi-dir=/bwfs/software/mpich2-1.2.1p1 --with-shared-libraries=0
> > >>> --with-debugging=no --with-cuda-dir=/bwfs/home/liluo/cuda3.2_64
> > >>> --with-thrust-dir=/bwfs/home/liluo/cuda3.2_64/include/thrust
> > >>> --with-cusp-dir=/bwfs/home/liluo/cuda3.2_64/include/cusp-library

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead. -- Norbert Wiener
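To diagnose a non-converging solve like the 256*256 CUDA case above,
here is a minimal C sketch, assuming a program that includes petscksp.h
and has already called KSPSolve() on a KSP object named ksp (the names
are illustrative):

    /* Ask the KSP why it stopped; a negative KSPConvergedReason
       value indicates divergence rather than convergence. */
    KSPConvergedReason reason;
    PetscErrorCode     ierr;

    ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
    if (reason < 0) {
      ierr = PetscPrintf(PETSC_COMM_WORLD,
                         "KSP diverged: reason %d\n",(int)reason);CHKERRQ(ierr);
    }

The same information can be printed for every solve with the
-ksp_converged_reason option.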
