Dear Arthur,
The get_function_values() call does not modify current_solution. Consequently, as 
long as present_pressure_values is either locally declared or a member of 
the scratch data, you should not get a segmentation fault.
Do you get any assertion thrown when launching in debug mode?


On Tuesday, November 4, 2025 at 3:42:34 p.m. UTC+1 [email protected] wrote:

> Good morning deal.II community,
>
> I am getting segmentation faults when using WorkStream::run and 2+ threads 
> when the worker function involves reinitializing a scratch_data and 
> interpolating the current FE solution with FEValues, for instance:
>
> fe_values[pressure].get_function_values(current_solution, 
> present_pressure_values);
>
> The current solution is a distributed PETSc vector. Overall, and following 
> this post (https://groups.google.com/g/dealii/c/Jvt36NOXM4o/m/tytRf3N9f4gJ), 
> does it still hold that multithreaded matrix/rhs assembly with the PETSc 
> wrappers is not thread safe? (I think so, since PETSc itself is still stated 
> to be not thread safe, but I'm asking in case I missed something.)
>
> I am currently testing with Mumps (through PETSc): if I'm not mistaken, 
> the other possibilities for using distributed matrices/vectors with 
> threaded assembly are deal.II's own distributed vectors + standalone 
> Mumps, or the Trilinos wrappers + Mumps through Amesos/Amesos2. Is that 
> correct?
>
> Thank you for your time,
> Arthur
>

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion visit 
https://groups.google.com/d/msgid/dealii/fe3cc26d-9dbc-4842-9f0e-dc8aada6e863n%40googlegroups.com.
