I also just ran with cupy.linalg.eigvalsh (which wraps cuSOLVER), and it
only took 3.1 seconds. I will probably use this, but it is good to know
about the SLEPc options in case I don't need the full spectrum, have
sparse matrices, etc.
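For reference, cupy.linalg.eigvalsh mirrors the NumPy call signature, so a minimal sketch looks like the following (numpy is used here as a CPU stand-in, since cupy needs a GPU; with a cupy array, the same one-line call runs on the device through cuSOLVER):

```python
import numpy as np

# Small dense symmetric stand-in matrix (a + a.T is symmetric).
rng = np.random.default_rng(0)
a = rng.standard_normal((5, 5))
a = a + a.T

# Full spectrum in ascending order. With cupy, this would be
# cupy.linalg.eigvalsh(a) on a cupy.ndarray instead.
w = np.linalg.eigvalsh(a)
```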
Thanks,
Sreeram
On Mon, May 13, 2024 at 2:13 PM Sreeram R Venkat wrote:
Apologies, I accidentally hit "reply" instead of "reply-all."
Thank you for the reference. Actually, I just tested that N ~ 1e4 case
where I had saved the dense matrix to a python-readable format. Using
scipy.linalg.eigvalsh, I got the eigenvalues in ~1.5 minutes. They agree
with the ones I got from SLEPc.
Please respond to the list. The mpd parameter means "maximum projected dimension". You can think of the projected problem as the "sequential" part of the computation, the part that is not parallelized (a "small" dense eigenproblem). When you run with
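For context, the EPS dimensions (nev/ncv/mpd) are typically set from the command line; a sketch, where ./app stands in for the user's executable and the numbers are only illustrative:

```
# ask for many eigenvalues while capping the size of the projected
# (dense, sequential) eigenproblem via -eps_mpd
./app -eps_nev 5000 -eps_ncv 5200 -eps_mpd 600 -eps_largest_real
```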
Computing the full spectrum is always an unpleasant task. But if you cannot avoid it, I would suggest that you compute the eigenvalues in two runs: the n/2 largest real eigenvalues and the n/2 smallest real ones. If your matrix-vector product is cheap,
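The two-run idea can be sketched with scipy's ARPACK wrapper as a stand-in for a SLEPc EPS (small n here; 'LA'/'SA' select the largest/smallest algebraic eigenvalues, and the two halves together cover the whole spectrum):

```python
import numpy as np
from scipy.sparse.linalg import eigsh

# Small dense symmetric stand-in for the real operator.
rng = np.random.default_rng(2)
a = rng.standard_normal((40, 40))
a = (a + a.T) / 2

n = a.shape[0]
k = n // 2  # half the spectrum per run

# Run 1: k largest algebraic eigenvalues; run 2: k smallest.
w_hi = eigsh(a, k=k, which='LA', return_eigenvectors=False)
w_lo = eigsh(a, k=k, which='SA', return_eigenvectors=False)

# Stitch the two halves into the full spectrum.
full = np.sort(np.concatenate([w_lo, w_hi]))
```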
On Mon, May 13, 2024 at 1:40 PM Sreeram R Venkat wrote:
I have a MatShell object that computes matrix-vector products of a dense
symmetric matrix of size NxN. The MatShell does not actually form the dense
matrix, so it is never in memory/storage. For my application, N ranges from
1e4 to 1e5.
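A matrix-free operator of this shape can be sketched with scipy's LinearOperator as a rough analogue of a PETSc MatShell (the matvec below is a placeholder for the application's product; a diagonal operator is used so the example is self-checking, and the matrix is never formed):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

n = 50
# Implicit operator diag(1..n); only its action on a vector is stored.
d = np.arange(1, n + 1, dtype=float)

def matvec(x):
    # y = A @ x computed entry-wise; A itself never exists in memory.
    return d * x

A = LinearOperator((n, n), matvec=matvec, dtype=float)

# Iterative eigensolvers only need A @ x, just as a SLEPc EPS only
# needs MatMult on a MatShell.
w = eigsh(A, k=3, which='LA', return_eigenvectors=False)
```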
I want to compute the full spectrum of this matrix. For an ex