Re: [deal.II] Complex-valued distributed matrices in dealii

2020-07-26 Thread Pascal Kraft
ses derive from Tpetra::Operator <https://docs.trilinos.org/dev/packages/tpetra/doc/html/classTpetra_1_1Operator.html>, the base class for linear operators. See https://docs.trilinos.org/dev/packages/tpetra/doc/html/index.html Pascal Kraft wrote on Sunday, 26 July 2020
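
A minimal sketch (not code from the thread) of what deriving from Tpetra::Operator with a complex scalar type could look like, assuming a Trilinos build with complex scalar instantiations enabled; the class name ScaledIdentity and the trivial apply() are purely illustrative:

#include <Tpetra_Operator.hpp>
#include <Tpetra_MultiVector.hpp>
#include <Tpetra_Map.hpp>
#include <Teuchos_RCP.hpp>
#include <Teuchos_ScalarTraits.hpp>
#include <complex>

using Scalar = std::complex<double>;
using MV     = Tpetra::MultiVector<Scalar>;
using Map    = Tpetra::Map<>;

// Illustrative operator: Y := alpha * (factor * X) + beta * Y.
class ScaledIdentity : public Tpetra::Operator<Scalar>
{
public:
  ScaledIdentity(const Teuchos::RCP<const Map> &map, const Scalar factor)
    : map(map), factor(factor) {}

  Teuchos::RCP<const Map> getDomainMap() const override { return map; }
  Teuchos::RCP<const Map> getRangeMap()  const override { return map; }

  void apply(const MV &X, MV &Y,
             Teuchos::ETransp = Teuchos::NO_TRANS,
             Scalar alpha = Teuchos::ScalarTraits<Scalar>::one(),
             Scalar beta  = Teuchos::ScalarTraits<Scalar>::zero()) const override
  {
    Y.update(alpha * factor, X, beta);   // works with the complex Scalar as well
  }

private:
  Teuchos::RCP<const Map> map;
  Scalar                  factor;
};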

Re: [deal.II] Complex-valued distributed matrices in dealii

2020-07-26 Thread Pascal Kraft
GMRES: I will be using PETSc GMRES to solve my system, but if possible I will also try to solve it with dealii::SolverGMRES and let you know what happens. Kind regards, Pascal Wolfgang Bangerth wrote on Sunday, 26 July 2020 at 01:43:44 UTC+2: > On 7/23/20 10:42 AM, Pascal Kraft wr
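
For reference, the generic calling pattern for dealii::SolverGMRES looks roughly like the sketch below; the matrix, vector, and preconditioner types are placeholders and not taken from the thread:

#include <deal.II/lac/solver_control.h>
#include <deal.II/lac/solver_gmres.h>
#include <deal.II/lac/precondition.h>

template <typename MatrixType, typename VectorType>
void solve_with_deal_gmres(const MatrixType &A, VectorType &x, const VectorType &b)
{
  // Stop after 1000 iterations or once the residual drops below 1e-10 * ||b||.
  dealii::SolverControl           control(1000, 1e-10 * b.l2_norm());
  dealii::SolverGMRES<VectorType> gmres(control);

  // PreconditionIdentity is only a stand-in for a real preconditioner.
  gmres.solve(A, x, b, dealii::PreconditionIdentity());
}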

Re: [deal.II] Complex-valued distributed matrices in dealii

2020-07-23 Thread Pascal Kraft
on Epetra, which only supports > double AFAICT. That's why you can't replace TrilinosScalar easily. > On the other hand, you should be able to compile PETSc with complex scalar > type and use that with MPI. > > Best, > Daniel > > On Thu, 23 July 2020 at 12:42, Pas
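
As a rough sketch of what that looks like on the deal.II side, assuming a recent deal.II and a PETSc configured with --with-scalar-type=complex (so that PetscScalar is std::complex<double>); the vector size and ownership below are placeholders:

#include <deal.II/base/index_set.h>
#include <deal.II/base/mpi.h>
#include <deal.II/lac/petsc_vector.h>

void fill_complex_vector(const MPI_Comm communicator)
{
  const unsigned int n = 10;
  dealii::IndexSet   locally_owned(n);
  locally_owned.add_range(0, n);   // trivial single-rank ownership, sketch only

  dealii::PETScWrappers::MPI::Vector v(locally_owned, communicator);

  // In a complex PETSc build, PetscScalar is std::complex<double>:
  v(0) = PetscScalar(1.0, 2.0);
  v.compress(dealii::VectorOperation::insert);
}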

[deal.II] Re: Complex-valued distributed matrices in dealii

2020-07-23 Thread Pascal Kraft
::iterator’ {aka ‘std::complex<double>*’} in return 1525 | return (*vector)[0]; suggesting a hard dependency on double somewhere else. Pascal Kraft wrote on Thursday, 23 July 2020 at 18:42:47 UTC+2: > Dear Deal.II devs and users, > > In the latest release a lot of (great) work has been don

[deal.II] Complex-valued distributed matrices in dealii

2020-07-23 Thread Pascal Kraft
Versions all use a templated Vector which can take complex components) and MPI distribution of a sparse system. I have so far only seen FullMatrix accept complex numbers. Can anyone give me a pointer on what is possible? Kind regards, Pascal Kraft
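
FullMatrix and the (serial) Vector class are indeed templated on the number type, so the following compiles; it is only a toy illustration, not a distributed setup:

#include <deal.II/lac/full_matrix.h>
#include <deal.II/lac/vector.h>
#include <complex>

int main()
{
  using C = std::complex<double>;

  dealii::FullMatrix<C> M(2, 2);
  M(0, 0) = C(1.0, 1.0);
  M(1, 1) = C(2.0, -1.0);

  dealii::Vector<C> x(2), y(2);
  x(0) = C(1.0, 0.0);
  x(1) = C(0.0, 1.0);

  M.vmult(y, x);   // y = M * x in complex arithmetic
  return 0;
}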

Re: [deal.II] Several questions about mesh generation and distributed triangulations

2019-01-15 Thread Pascal Kraft
Dear Wolfgang, thank you for your reply! I only noticed it now, a while later, since I had thought the topic was dead when I posted it. Thank you for your time and effort! Remarks on your response are below: On Tuesday, 30 October 2018 17:26:21 UTC+1, Wolfgang Bangerth wrote: > > > Pascal,

[deal.II] Re: Several questions about mesh generation and distributed triangulations

2018-10-18 Thread Pascal Kraft
such that the partitioning is exactly the one I want would be hard, and it would depend massively on the algorithm used to compute the partitioning, which should be kept as a black box. This would also be a very error-prone workaround. On Thursday, 18 October 2018 13:54:13 UTC+2, Pascal

[deal.II] Several questions about mesh generation and distributed triangulations

2018-10-18 Thread Pascal Kraft
I will try to be as short as possible - if more details are required, feel free to ask. I also offer to submit all mesh generation code I create in the future, since others might have similar needs at some point. I work on a 3d mesh with purely axis-parallel edges. The mesh is a 2d-mesh (say in
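
For such axis-parallel block geometries, the usual starting point in deal.II is GridGenerator::subdivided_hyper_rectangle; the extents and subdivision counts below are placeholders, not the poster's actual geometry:

#include <deal.II/base/point.h>
#include <deal.II/grid/tria.h>
#include <deal.II/grid/grid_generator.h>
#include <vector>

void make_axis_parallel_mesh(dealii::Triangulation<3> &tria)
{
  const std::vector<unsigned int> repetitions = {20, 20, 5};  // cells per direction

  dealii::GridGenerator::subdivided_hyper_rectangle(
    tria,
    repetitions,
    dealii::Point<3>(-1.0, -1.0, 0.0),   // lower corner
    dealii::Point<3>( 1.0,  1.0, 1.0));  // upper corner
}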

[deal.II] Re: Error during configuration since 9.0.0

2018-05-16 Thread Pascal Kraft
is fine. So I guess this problem will appear for everyone who uses the packages "openmpi-bin" and "libscalapack-openmpi2.0". Thank you all for your time :) On Tuesday, 15 May 2018 19:38:03 UTC+2, Pascal Kraft wrote: > > Dear deal.II devs, > > first off: Thanks

[deal.II] Re: Error during configuration since 9.0.0

2018-05-16 Thread Pascal Kraft
Configuration now works if I explicitly switch ScaLAPACK off (-DDEAL_II_WITH_SCALAPACK=OFF)... I will try to find out why. On Tuesday, 15 May 2018 19:38:03 UTC+2, Pascal Kraft wrote: > > Dear deal.II devs, > > first off: Thanks for your great work and the many new features in 9.0

[deal.II] Re: Error during configuration since 9.0.0

2018-05-16 Thread Pascal Kraft
st/b.cpp yields no errors (the code in b.cpp is int main(){ return 0; }). So am I right in suspecting that there might be an error in -DDEAL_II_HAVE_USABLE_FLAGS_DEBUG? Do you have any suggestions on what I could try next? On Tuesday, 15 May 2018 19:38:03 UTC+2, Pascal Kraft wrote: > > Dea

Re: [deal.II] Error during configuration since 9.0.0

2018-05-15 Thread Pascal Kraft
to check this tomorrow) I also ran MPI-enabled codes of mine on that machine. Could there still be a problem with my OpenMPI install? With kind regards, Pascal Kraft On Tuesday, 15 May 2018 21:30:12 UTC+2, Timo Heister wrote: > > You need to look at the last error in CMakeErr

Re: [deal.II] Nedelec Elements and non-tangential Dirichlet data

2018-01-18 Thread Pascal Kraft
Ignore my question about projection. Somehow I thought I remembered that the projection functions don't support Nedelec elements in deal.II - my bad. On Thursday, 18 January 2018 18:57:06 UTC+1, Pascal Kraft wrote: > > Thanks for your fast reply! > About your first point: Yes, I curr

Re: [deal.II] Nedelec Elements and non-tangential Dirichlet data

2018-01-18 Thread Pascal Kraft
. Is there a function to compute the best approximation for a given element type (like Nedelec)? Again, thank you for your time. Kind regards, Pascal On Tuesday, 16 January 2018 18:45:15 UTC+1, Wolfgang Bangerth wrote: > > On 01/16/2018 01:41 AM, Pascal Kraft wrote: > > I am currently usin
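
The best approximation in the L2 sense can be computed with VectorTools::project, which also works for Nedelec elements. The sketch below uses placeholder names, a recent deal.II (AffineConstraints standing in for the older ConstraintMatrix), and assumes the DoFHandler, constraints, and a Function with as many components as the element are already set up:

#include <deal.II/base/function.h>
#include <deal.II/base/quadrature_lib.h>
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/lac/affine_constraints.h>
#include <deal.II/lac/vector.h>
#include <deal.II/numerics/vector_tools.h>

void project_best_approximation(const dealii::DoFHandler<3>             &dof_handler,
                                const dealii::AffineConstraints<double> &constraints,
                                const dealii::Function<3>               &exact_field,
                                dealii::Vector<double>                  &result)
{
  // L2 projection of exact_field onto the finite element space.
  dealii::VectorTools::project(dof_handler,
                               constraints,
                               dealii::QGauss<3>(3),   // quadrature order: placeholder
                               exact_field,
                               result);
}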

[deal.II] Nedelec Elements and non-tangential Dirichlet data

2018-01-16 Thread Pascal Kraft
I am currently using an FESystem composed of two 3D fields (real and imaginary E-field) and I want to impose Dirichlet conditions of the sort E(x,y,z) = E_{in}(x,y,z) on the input interface (in an xy-plane). Earlier the z-component had been 0, so I did not run into real problems and could use
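
Such an element can be built as a system of two copies of a vector-valued base element; the sketch below assumes lowest-order Nedelec elements, which may differ from the poster's actual setup:

#include <deal.II/dofs/dof_handler.h>
#include <deal.II/fe/fe_nedelec.h>
#include <deal.II/fe/fe_system.h>

void setup_complex_field_element(dealii::DoFHandler<3> &dof_handler)
{
  // Components 0-2: real part of E, components 3-5: imaginary part.
  // static, because the DoFHandler keeps a pointer to the element and the
  // element must therefore outlive this function call.
  static const dealii::FESystem<3> fe(dealii::FE_Nedelec<3>(0), 2);

  dof_handler.distribute_dofs(fe);
}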

Re: [deal.II] Re: Internal instability of the GMRES Solver / Trilinos

2017-03-16 Thread Pascal Kraft
->Map()) > > == false) > > Best, Martin > On 16.03.2017 01:28, Pascal Kraft wrote: > > Hi Martin, > that didn't solve my problem. What I have done in the meantime is replace > the chec

Re: [deal.II] Re: Internal instability of the GMRES Solver / Trilinos

2017-03-16 Thread Pascal Kraft
ector->Map()) == false) > > to > > if (v.vector->Map().SameAs(vector->Map()) > > == false) > > Best, Martin > On 16.0

[deal.II] Re: Internal instability of the GMRES Solver / Trilinos

2017-03-15 Thread Pascal Kraft
Dear Timo, I have done some more digging and found out the following. The problems seem to happen in trilinos_vector.cc between lines 240 and 270. What I see on the call stacks is that one process reaches line 261 ( ierr = vector->GlobalAssemble (last_action); ) and then waits inside this
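
For context, GlobalAssemble() is a collective Epetra operation that deal.II calls inside compress(), so a hang of this kind typically means not every MPI rank reaches the call with the same pending operation. A minimal sketch of the required pattern (names are placeholders, not the poster's code):

#include <deal.II/base/index_set.h>
#include <deal.II/lac/trilinos_vector.h>

void add_entries(dealii::TrilinosWrappers::MPI::Vector &v,
                 const dealii::IndexSet                &locally_owned)
{
  for (const auto i : locally_owned)
    v(i) += 1.0;   // arbitrary contribution, sketch only

  // Collective call: every MPI rank must reach this with the same
  // VectorOperation; otherwise the ranks that did call it block inside
  // GlobalAssemble(), which matches the hang described above.
  v.compress(dealii::VectorOperation::add);
}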

[deal.II] Internal instability of the GMRES Solver / Trilinos

2017-03-14 Thread Pascal Kraft
ten stuck in the exact same way. I had thought it might be some internal use of MPI_COMM_WORLD that was blocking somehow, but it also happens now that I only use one communicator (MPI_COMM_WORLD). Thank you in advance for your time, Pascal Kraft