Re: [deal.II] Stabilized FEM implementation with bubble function

2021-01-04 Thread Lixing Zhu
Many thanks for such a quick response. I'll adopt these ideas and try to eliminate the DOFs associated with the bubble support, following the workflow in the tutorial. Regards, Lixing

Re: [deal.II] Stabilized FEM implementation with bubble function

2021-01-04 Thread Wolfgang Bangerth
Lixing, I am trying to implement a stabilized weak form (e.g. advection-diffusion) where the stabilization tensor is computed element-wise through a standard bubble: \prod_i (1 - x_i^2). It seems that FE_Q_Bubbles should provide all I need, but there are two things I am not quite clear about,

[deal.II] Stabilized FEM implementation with bubble function

2021-01-04 Thread Lixing Zhu
Dear all, I am trying to implement a stabilized weak form (e.g. advection-diffusion) where the stabilization tensor is computed element-wise through a standard bubble: \prod_i (1 - x_i^2). It seems that FE_Q_Bubbles should provide all I need, but there are two things I am not quite clear about: 1.
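
A minimal sketch of setting up such an enriched element in deal.II, assuming FE_Q_Bubbles over a degree-1 base space is what is wanted here (the mesh and dim are placeholders):

  #include <deal.II/dofs/dof_handler.h>
  #include <deal.II/fe/fe_q_bubbles.h>
  #include <deal.II/grid/grid_generator.h>
  #include <deal.II/grid/tria.h>

  using namespace dealii;

  int main()
  {
    constexpr int dim = 2;
    Triangulation<dim> triangulation;
    GridGenerator::hyper_cube(triangulation);  // stand-in mesh
    triangulation.refine_global(3);
    FE_Q_Bubbles<dim> fe(1);                   // Q1 enriched with cell bubbles
    DoFHandler<dim>   dof_handler(triangulation);
    dof_handler.distribute_dofs(fe);           // bubble DOFs sit in cell interiors
  }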

Re: [deal.II] PETSc SparseMatrix initialization error

2021-01-04 Thread Zachary Streeter
My project is in quantum scattering, and I would like to have some operators be distributed PETSc objects. So inside my OneBodyHamiltonianOperator class (for example), I would like to create a PETScWrappers::MPI::SparseMatrix and then use SLEPc to solve for the ground state and excited states.
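
For the distributed setup described here, a sketch using deal.II's wrappers (the index set, sparsity pattern, and the choice of Krylov-Schur are assumptions, not from the thread; cf. step-36):

  // Distributed operator: each rank passes the set of rows it owns.
  PETScWrappers::MPI::SparseMatrix hamiltonian;
  hamiltonian.reinit(locally_owned_dofs, locally_owned_dofs,
                     sparsity_pattern, mpi_communicator);
  // ... assemble the operator here ...

  // Solve for the lowest few eigenpairs with SLEPc.
  const unsigned int n_states = 5;  // however many states are wanted
  SolverControl solver_control(1000, 1e-10);
  SLEPcWrappers::SolverKrylovSchur eigensolver(solver_control, mpi_communicator);
  eigensolver.set_which_eigenpairs(EPS_SMALLEST_REAL);  // ground state first
  std::vector<double> eigenvalues;
  std::vector<PETScWrappers::MPI::Vector> eigenvectors(n_states);
  for (auto &v : eigenvectors)
    v.reinit(locally_owned_dofs, mpi_communicator);
  eigensolver.solve(hamiltonian, eigenvalues, eigenvectors, n_states);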

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2021-01-04 Thread Wolfgang Bangerth
Kaushik, Marc and others have already answered the technical details, so just one overall comment: Let me explain what I am trying to do and why. I want to solve a transient heat transfer problem for the additive manufacturing (AM) process. In AM processes, metal powder is deposited in
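
The element-activation part of this is commonly done in deal.II with an hp::FECollection that pairs a real element with FE_Nothing (as in step-46); a minimal sketch, where cell_is_deposited() is a hypothetical predicate for the powder-vs-metal flagging:

  hp::FECollection<dim> fe_collection;
  fe_collection.push_back(FE_Q<dim>(1));       // index 0: deposited material
  fe_collection.push_back(FE_Nothing<dim>());  // index 1: not-yet-deposited powder
  for (const auto &cell : dof_handler.active_cell_iterators())
    cell->set_active_fe_index(cell_is_deposited(cell) ? 0 : 1);
  dof_handler.distribute_dofs(fe_collection);  // powder cells carry no DOFs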

Re: [deal.II] PETSc SparseMatrix initialization error

2021-01-04 Thread Wolfgang Bangerth
Zachary, I am trying to debug this strange behavior. I am trying to build a PETSc sparse parallel matrix using 4 processors. This gives each process 32 locally owned rows (so 128 rows globally). But when I pass the local_num_of_rows variable into the reinit function, this is the PETSc

[deal.II] Tips on writing a "versatile" assembly function

2021-01-04 Thread blais...@gmail.com
Dear all, I wish you all a happy new year! One problem we always end up facing with FEM codes is that, as the program grows, more and more features are added to the equations. This leads to multiple variations of the same equations (for example, Navier-Stokes with Newtonian and non-Newtonian
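
One pattern that helps with exactly this (a sketch of the general idea, not something from the thread; all names are hypothetical) is to inject the model-dependent coefficient as a callable, so a single assembly loop serves both the Newtonian and non-Newtonian variants:

  #include <deal.II/base/symmetric_tensor.h>
  #include <deal.II/fe/fe_values.h>
  #include <deal.II/lac/full_matrix.h>
  #include <functional>
  #include <vector>

  using namespace dealii;

  template <int dim>
  void assemble_viscous_term(
    const FEValues<dim> &fe_values,
    const std::vector<SymmetricTensor<2, dim>> &strain_rates, // at quadrature pts
    const std::function<double(const SymmetricTensor<2, dim> &)> &viscosity,
    FullMatrix<double> &cell_matrix)
  {
    const FEValuesExtractors::Vector u(0);
    for (unsigned int q = 0; q < fe_values.n_quadrature_points; ++q)
      {
        const double nu = viscosity(strain_rates[q]); // only model-dependent line
        for (unsigned int i = 0; i < fe_values.dofs_per_cell; ++i)
          for (unsigned int j = 0; j < fe_values.dofs_per_cell; ++j)
            cell_matrix(i, j) +=
              2. * nu *
              scalar_product(fe_values[u].symmetric_gradient(i, q),
                             fe_values[u].symmetric_gradient(j, q)) *
              fe_values.JxW(q);
      }
  }

A Newtonian run then passes a callable returning a constant, while a power-law fluid passes one that actually looks at the strain rate.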

[deal.II] PETSc SparseMatrix initialization error

2021-01-04 Thread Zachary 42!
Hi everyone, I am trying to debug this strange behavior. I am trying to build a PETSc sparse parallel matrix using 4 processors. This gives each process 32 locally owned rows (so 128 rows globally). But when I pass the local_num_of_rows variable into the reinit function, this is the PETSc
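
For reference, the variant of reinit() that sidesteps hand-counted row numbers takes the locally owned IndexSet plus a sparsity pattern; a minimal sketch (assuming a dof_handler and a DynamicSparsityPattern dsp already exist):

  // deal.II derives the local and global sizes PETSc needs from the
  // index sets, so local/global row counts cannot get out of sync.
  const IndexSet locally_owned = dof_handler.locally_owned_dofs();
  PETScWrappers::MPI::SparseMatrix system_matrix;
  system_matrix.reinit(locally_owned, locally_owned, dsp, MPI_COMM_WORLD);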

Re: [deal.II] Surface / volume interactions for a 3D extruded mesh

2021-01-04 Thread Wells, David
Hi Corbin, > Is there a better way I could go about mapping the 3D volume data and > depth-averaged data to the surface mesh? This is a tough question - other deal.II developers are trying to figure out how to do this in parallel here: https://github.com/dealii/dealii/issues/10037. For now
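
In the meantime, one serial workaround (an assumption about what might fit here, not necessarily what was suggested in the thread) is Functions::FEFieldFunction, which lets the volume solution be evaluated at arbitrary points such as surface-mesh support points:

  #include <deal.II/numerics/fe_field_function.h>

  // Wrap the 3D volume solution as a spatial function ...
  Functions::FEFieldFunction<3> volume_field(volume_dof_handler, volume_solution);
  // ... and evaluate it at a surface point (the point must lie inside
  // a cell the current process knows about, hence serial-only).
  const double value = volume_field.value(surface_point);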

Re: [deal.II] Unable to run 'make test'

2021-01-04 Thread Wells, David
Hi Romin, That error means that the quicktest for gmsh failed - in particular, deal.II was not able to successfully run your gmsh executable. This usually means that there is something wrong with your gmsh installation. Unless you plan on using gmsh from inside deal.II (i.e., calling gmsh to

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2021-01-04 Thread Bruno Turcksin
Kaushik, Oh wow, this is a small world :D Unfortunately, the PETSc solver requires a PETSc vector, but I think it should be straightforward to add compress(min) to the PETSc vector. So that's a possibility if copying the solution takes too much time. Best, Bruno On Sun, Jan 3, 2021 at 21:42,

[deal.II] Re: deal.ii in Docker

2021-01-04 Thread luca.heltai
Dear Chris, I’m also forwarding this mail to the deal.II user group, as many others may find it useful. > On 4 Jan 2021, at 10:20, Christopher Ham wrote: > > Dear Both, > > I wonder if you might be able to help me. I would really like to try > out deal.ii. I am a bit of a novice with the