Good morning deal.II community,

I am getting segmentation faults when using WorkStream::run with 2 or more
threads, whenever the worker function reinitializes a scratch_data object and
interpolates the current FE solution with FEValues, for instance:

fe_values[pressure].get_function_values(current_solution,
                                        present_pressure_values);

The current solution is a distributed PETSc vector. Following this post
(https://groups.google.com/g/dealii/c/Jvt36NOXM4o/m/tytRf3N9f4gJ), does it
still hold that multithreaded matrix/rhs assembly through the PETSc wrappers
is not thread safe? (I believe so, since PETSc itself is still documented as
not thread safe, but I am asking in case I missed something.)

I am currently testing with MUMPS (through PETSc). If I'm not mistaken, the
remaining options for combining distributed matrices/vectors with threaded
assembly would be deal.II's own distributed vectors together with a
standalone MUMPS interface, or the Trilinos wrappers with MUMPS through
Amesos/Amesos2. Is that correct?

Thank you for your time,
Arthur

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion visit 
https://groups.google.com/d/msgid/dealii/bc8e4f20-0cfe-43e7-ad43-2692fa1ce47an%40googlegroups.com.
