Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Satish Balay via petsc-users
On Fri, 11 Aug 2023, Jed Brown wrote:
> Jacob Faibussowitsch writes:
> > More generally, it would be interesting to know the breakdown of installed CUDA versions for users. Unlike compilers etc., I suspect that cluster admins (and those running on local machines) are much more likely

Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Jed Brown
Jacob Faibussowitsch writes:
> More generally, it would be interesting to know the breakdown of installed CUDA versions for users. Unlike compilers etc., I suspect that cluster admins (and those running on local machines) are much more likely to be updating their CUDA toolkits to the

Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Jacob Faibussowitsch
> We should support it, but it still seems hypothetical and not urgent.

FWIW, cuBLAS only just added 64-bit int support with CUDA 12 (naturally, with a completely separate API). More generally, it would be interesting to know the breakdown of installed CUDA versions for users. Unlike

Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Jed Brown
Rohan Yadav writes:
> With modern GPU sizes, for example A100's with 80GB of memory, a vector of length 2^31 is not that much memory -- one could conceivably run a CG solve with local vectors > 2^31.

Yeah, each vector would be 8 GB (single precision) or 16 GB (double). You can't store a
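A quick check of the arithmetic in the exchange above (the 8 GB / 16 GB figures follow directly from 4-byte and 8-byte floating-point entries):

```python
# Memory footprint of a dense vector with 2^31 entries,
# the point at which a signed 32-bit index overflows.
n = 2**31

bytes_single = n * 4  # 4 bytes per float32 entry
bytes_double = n * 8  # 8 bytes per float64 entry

print(bytes_single / 2**30)  # 8.0  (GiB, single precision)
print(bytes_double / 2**30)  # 16.0 (GiB, double precision)
```

So a single double-precision vector of length 2^31 already consumes 16 of an A100's 80 GB, which is why only a handful of such vectors fit per GPU.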

Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Rohan Yadav
> We do not currently have any code for using 64 bit integer sizes on the GPUs.

Thank you, just wanted confirmation.

> Given the current memory available on GPUs is 64 bit integer support needed? I think even a single vector of length 2^31 will use up most of the GPU's memory?

Are the

Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Junchao Zhang
Rohan,

You could try the petsc/kokkos backend. I have not tested it, but I guess it should handle 64-bit CUDA index types. I guess the petsc/cuda 32-bit limit came from old CUDA versions, where only 32-bit indices were supported, such that the original developers hardwired the type to
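The index width is fixed at configure time, so trying the Kokkos backend with 64-bit indices would mean rebuilding PETSc. A hedged sketch of such a build, using PETSc's documented configure options (exact flags and behavior may vary by PETSc version, and this combination is untested here, as noted above):

```shell
# Configure PETSc so that PetscInt is 64-bit, with CUDA enabled and
# the Kokkos / Kokkos Kernels backend downloaded and built alongside.
./configure --with-64-bit-indices --with-cuda \
    --download-kokkos --download-kokkos-kernels
make all
```

At run time one would then select the Kokkos types, e.g. `-vec_type kokkos` and `-mat_type aijkokkos`, to route vector and matrix operations through the Kokkos backend.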

Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Barry Smith
We do not currently have any code for using 64 bit integer sizes on the GPUs. Given the current memory available on GPUs, is 64 bit integer support needed? I think even a single vector of length 2^31 will use up most of the GPU's memory? Are there practical, not synthetic, situations that