[deal.II] Re: Get DoFHandler::cell_iterator from Triangulation::cell_iterator

2023-02-02 Thread Lucas Myers
Oops, this is in the FAQ: https://github.com/dealii/dealii/wiki/Frequently-Asked-Questions#can-i-convert-triangulation-cell-iterators-to-dofhandler-cell-iterators
My bad!
- Lucas

On Thursday, February 2, 2023 at 3:53:07 PM UTC-6 Lucas Myers wrote:
> Hi everyone,
>
> I'm trying to use the
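The FAQ recipe referenced above can be sketched as follows. A `DoFHandler` cell iterator can be constructed directly from a `Triangulation` cell iterator's level and index, with the `DoFHandler` passed as the fourth constructor argument (the names `tria_cell` and `dof_handler` are placeholders):

```cpp
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/grid/tria.h>

// Re-point the same mesh cell, but through the DoFHandler so that
// degree-of-freedom information (e.g. for FEValues::reinit) is available.
template <int dim>
typename dealii::DoFHandler<dim>::cell_iterator
to_dof_cell(const typename dealii::Triangulation<dim>::cell_iterator &tria_cell,
            const dealii::DoFHandler<dim> &dof_handler)
{
  return typename dealii::DoFHandler<dim>::cell_iterator(
    &tria_cell->get_triangulation(), // same triangulation
    tria_cell->level(),              // same refinement level
    tria_cell->index(),              // same cell index on that level
    &dof_handler);                   // now with DoF access
}
```

The resulting iterator points at the very same cell, so it can be handed to `FEValues::reinit` and combined with `get_function_values`.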

[deal.II] Get DoFHandler::cell_iterator from Triangulation::cell_iterator

2023-02-02 Thread Lucas Myers
Hi everyone,
I'm trying to use the `distributed_compute_point_locations` function to compute a finite element function's values at specific points. However, the cells that this function returns are of type `Triangulation::cell_iterator`, and in order to use `FEValues::get_function_values`

Re: [deal.II] Constrain a node instead of a face

2023-02-02 Thread Wolfgang Bangerth
I have a simple domain with nodal coordinates as follows: node 1: (0,0), node 2: (1,0), node 3: (1,1), node 4: (0,1). I want to fix node 1 in the x and y directions, and node 2 in only the y direction.
What you need to do is find out which vertex contains the node you care about, then which
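The approach described in the reply, located here as a hedged sketch (the names `dof_handler`, `target`, and the tolerance are assumptions, and `add_line` without an inhomogeneity constrains the DoF to zero): loop over cells, find the vertex at the target point, then constrain the vertex's DoFs per vector component via `AffineConstraints`.

```cpp
#include <deal.II/base/point.h>
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/lac/affine_constraints.h>

// Fix the x and/or y displacement DoFs sitting on the vertex closest to
// `target`. Components: 0 = x, 1 = y (assuming a vector-valued element).
template <int dim>
void constrain_vertex(const dealii::DoFHandler<dim>      &dof_handler,
                      const dealii::Point<dim>           &target,
                      const bool                          fix_x,
                      const bool                          fix_y,
                      dealii::AffineConstraints<double>  &constraints)
{
  for (const auto &cell : dof_handler.active_cell_iterators())
    for (const unsigned int v : cell->vertex_indices())
      if (cell->vertex(v).distance(target) < 1e-12)
        {
          // vertex_dof_index(vertex, component) returns the global DoF
          // index of the given component on this vertex.
          if (fix_x)
            constraints.add_line(cell->vertex_dof_index(v, 0));
          if (fix_y)
            constraints.add_line(cell->vertex_dof_index(v, 1));
          return; // vertex found; no need to keep searching
        }
}
```

After all constraints are added, call `constraints.close()` before assembling, as usual.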

[deal.II] deal.II Newsletter #240

2023-02-02 Thread 'Rene Gassmoeller' via deal.II User Group
Hello everyone! This is deal.II newsletter #240. It automatically reports recently merged features and discussions about the deal.II finite element library. ## Below you find a list of recently proposed or merged features: #14754: Fix the type used to initialize a DerivativeForm from a

Re: [deal.II] Utilities::MPI::broadcast

2023-02-02 Thread Rahul Gopalan Ramachandran
Thanks Dr. Bangerth for the clarification, that makes sense!
Regards, Rahul

> On Feb 2, 2023, at 5:37 PM, Wolfgang Bangerth wrote:
>
> On 2/2/23 09:32, Rahul Gopalan Ramachandran wrote:
>> Is Utilities::MPI::broadcast supposed to do the same as MPI_Bcast? I wanted
>> an integer to be sent to

Re: [deal.II] Utilities::MPI::broadcast

2023-02-02 Thread Wolfgang Bangerth
On 2/2/23 09:32, Rahul Gopalan Ramachandran wrote:
> Is Utilities::MPI::broadcast supposed to do the same as MPI_Bcast? I wanted
> an integer to be sent to all the ranks. So this is what I wrote:
> Utilities::MPI::broadcast(mpi_communicator, rand_seed, 0);
> However, it seems to be not sending the
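The likely issue, sketched here as an assumption consistent with the deal.II API: unlike `MPI_Bcast`, `Utilities::MPI::broadcast` does not modify its argument in place on the receiving ranks; it *returns* the root's value, so the return value must be assigned.

```cpp
#include <deal.II/base/mpi.h>

// Synchronize `rand_seed` across all ranks from rank 0. Discarding the
// return value (as in the original post) leaves the variable unchanged
// on non-root ranks; the assignment below is the fix.
void sync_seed(const MPI_Comm mpi_communicator, int &rand_seed)
{
  rand_seed =
    dealii::Utilities::MPI::broadcast(mpi_communicator, rand_seed, 0);
}
```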

[deal.II] Utilities::MPI::broadcast

2023-02-02 Thread Rahul Gopalan Ramachandran
Hello everyone,
Is Utilities::MPI::broadcast supposed to do the same as MPI_Bcast? I wanted an integer to be sent to all the ranks, so this is what I wrote:
Utilities::MPI::broadcast(mpi_communicator, rand_seed, 0);
However, it does not seem to be sending the variable. Using MPI_Bcast as follows does