Re: [deal.II] On mesh refinement & solution transfer with Raviart-Thomas

2021-04-05 Thread Marc Fehling
Hi Charlie, To 2): Yes, this update would be necessary. It should be okay to overwrite the .output file with the actual output of the test after you have applied 1). To 3): I now wonder where to place it as well! I'm surprised that we have lots of tests for SolutionTransfer in the hp context or the

Re: [deal.II] On mesh refinement & solution transfer with Raviart-Thomas

2021-04-04 Thread Marc Fehling
Hi Charlie! It looks like you found the cause of the issue! The transferred solution now looks like one would expect. This seems to be a general problem with the restriction matrices of the Raviart-Thomas elements. Would you mind writing a patch for the deal.II library so that your

Re: [deal.II] How to refine mesh in different field

2021-04-03 Thread Marc Fehling
Hi Chen, step-46 gives you a good example of how to perform grid refinement when coupling different kinds of equations in different parts of the domain. I would highly suggest reading this

Re: [deal.II] Problem in the installing dealii-9.2.0

2021-03-27 Thread Marc Fehling
s Fehling, which version of PETSc is compatible with deal.II? > > Best regards, > > On Sat, 27 Mar 2021 at 8:50 PM, Marc Fehling > wrote: > >> Hello! >> >> > I wonder whether the PETSc interfaces have changed again. >> >> Sigh, I guess e

Re: [deal.II] Problem in the installing dealii-9.2.0

2021-03-27 Thread Marc Fehling
Hello! > I wonder whether the PETSc interfaces have changed again. Sigh, I guess exactly this happened, see https://github.com/petsc/petsc/commit/569ea7c476928c4ccab5493c49720b88ce3320f4#diff-10d5cd204c8b9d29ef9bc54965f06c2ac039d4cfcdaab6a01ce027a345b9d232 I'll prepare a fix. @Emmanuel: To

Re: [deal.II] Re: Transfer vector of solutions

2021-01-29 Thread Marc Fehling
Hello Karthik, it is perfectly reasonable to treat refinement for the initial mesh separately. I noticed that both your refine and coarsen fractions always add up to 100%. This is not a requirement! You can adjust both fractions independently until you are fine with the results. Marc
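For reference, a minimal sketch of such a marking call, assuming `using namespace dealii`, the usual error-estimator workflow, and objects `triangulation` and `estimated_error_per_cell` from the surrounding program; the 0.2/0.03 fractions are purely illustrative and need not sum to 1:

    #include <deal.II/grid/grid_refinement.h>

    // estimated_error_per_cell holds one refinement indicator per active cell.
    // Mark the 20% of cells with the largest indicators for refinement and
    // only the 3% with the smallest indicators for coarsening.
    GridRefinement::refine_and_coarsen_fixed_number(triangulation,
                                                    estimated_error_per_cell,
                                                    /*top_fraction=*/0.2,
                                                    /*bottom_fraction=*/0.03);
    triangulation.execute_coarsening_and_refinement();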

Re: [deal.II] Re: Transfer vector of solutions

2021-01-27 Thread Marc Fehling
inement at time zero makes > sense but as the solution decays I was hoping the mesh would coarsen (but > it refines further). I am clearly doing something wrong. I need some help > in fixing this issue. > > > Thank you! > > > Karthi. > > On Mon, Jan 25,

[deal.II] Re: deal.ii installation on NERSC Cori

2021-01-26 Thread Marc Fehling
Hello I am not familiar with the details of the NESRC Cori machine. In its documentation, I found the following manual . I hope this helps. I can only speak from my experience on HPC machines, and we had

[deal.II] Re: Transfer vector of solutions

2021-01-24 Thread Marc Fehling
Hi Karthi, if you work on the same DoFHandler, one SolutionTransfer object is sufficient. There are already member functions that take a container of solutions like yours as a parameter. Have a look at SolutionTransfer::prepare_for_coarsening_and_refinement
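A condensed sketch of that interface for the serial class, assuming `using namespace dealii`, the usual `dim` constant, and two solution vectors `solution_u` and `solution_v` on the same DoFHandler (names are placeholders, not from the thread):

    #include <deal.II/numerics/solution_transfer.h>

    // One transfer object handles a whole container of vectors.
    SolutionTransfer<dim, Vector<double>> solution_transfer(dof_handler);

    std::vector<Vector<double>> all_in = {solution_u, solution_v};
    solution_transfer.prepare_for_coarsening_and_refinement(all_in);

    triangulation.execute_coarsening_and_refinement();
    dof_handler.distribute_dofs(fe);

    std::vector<Vector<double>> all_out(2, Vector<double>(dof_handler.n_dofs()));
    solution_transfer.interpolate(all_in, all_out);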

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-31 Thread Marc Fehling
to use the `p::d::CellDataTransfer` class for your use case as described in the last message. Marc On Thursday, December 31, 2020 at 6:02:00 PM UTC-7 Marc Fehling wrote: > Hi Kaushik, > > Yes, this is possible by changing a cell from FE_Nothing to FE_Q using > p-refinement.

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-31 Thread Marc Fehling
all time steps. I was > hoping to save some computation time by only forming a system consisting of > cells that are in the "active" layers. > > Please let me know if this makes sense? Is there any other method in deal.II > that can simulate such a process? > Thank you

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-28 Thread Marc Fehling
The FiniteElementDomination logic in the codim=0 case would indeed make for a cheap a priori check in this context. In case an FE_Nothing has been configured to dominate, the solution should, if I understood correctly, be continuous on the interface, i.e., zero on the face. I will write a few
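For context, a short sketch of the domination setup under discussion, assuming a two-element hp collection and the usual `dim` constant; the `dominate` flag is the second constructor argument of FE_Nothing:

    #include <deal.II/fe/fe_nothing.h>
    #include <deal.II/fe/fe_q.h>
    #include <deal.II/hp/fe_collection.h>

    hp::FECollection<dim> fe_collection;
    fe_collection.push_back(FE_Q<dim>(2));
    // Let FE_Nothing dominate: on a face between an FE_Q cell and an
    // FE_Nothing cell, the solution is then constrained to zero.
    fe_collection.push_back(FE_Nothing<dim>(/*n_components=*/1,
                                            /*dominate=*/true));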

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-28 Thread Marc Fehling
once. I don't know if we have such a test for the general SolutionTransfer class. I will check that. Marc On Monday, December 28, 2020 at 1:39:33 PM UTC-7 Wolfgang Bangerth wrote: > On 12/27/20 8:48 PM, Marc Fehling wrote: > > > > 2) I did not know you were trying to interpol

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-27 Thread Marc Fehling
_Q > elements after p-refinement? Are those always set to zero after the > refinement? > > Thank you, > Kaushik > > On Wed, Dec 23, 2020 at 5:35 PM Marc Fehling wrote: > >> Hi Kaushik, >> >> Be careful about what you are doing here: Y

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-23 Thread Marc Fehling
nk you Marc. >>> I am using the deal.II lib that I got via apt-get from >>> deal.ii-9.2.0-backports. >>> I used PETSc and the abort happened even on 1 CPU. I tried 2, 3, and 6 CPUs and >>> all aborted similarly. >>> >>> I will get the latest master b

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-09 Thread Marc Fehling
From your stacktrace I can see you are using PETSc and deal.II 9.2.0, which already incorporates the specified patch. Would you try to build the current master branch anyway? On Wednesday, December 9, 2020 at 2:11:59 PM UTC-7 Marc Fehling wrote: > Hi Kaushik, > > I am unable to re

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-09 Thread Marc Fehling
[Garbled stack-trace fragment from the archive preview, referencing dealii::TriaIterator, dealii::Triangulation<2, 2>::CellStatus, std::vector, and std::allocator; the template arguments were stripped by the HTML rendering.]

[deal.II] Re: Periodic boundary conditions : Error using GridTools::collect_periodic_facepairs

2020-12-08 Thread Marc Fehling
Hi Aaditya, at first glance your implementation looks good to me. Does the same error occur when you are using a standard `Triangulation` object instead of a `parallel::distributed::Triangulation`? As far as I know, the direction parameter does not matter for scalar fields (see also step-45).
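For reference, the usual setup looks roughly like this, assuming `using namespace dealii`, the usual `dim` constant, a `parallel::distributed::Triangulation<dim>` named `triangulation`, and a periodic boundary pair carrying boundary ids 0 and 1 matched in direction 0:

    #include <deal.II/distributed/tria.h>
    #include <deal.II/grid/grid_tools.h>

    std::vector<GridTools::PeriodicFacePair<
      parallel::distributed::Triangulation<dim>::cell_iterator>>
      periodicity_vector;

    // Match faces with boundary id 0 to faces with boundary id 1 along
    // coordinate direction 0, then register the pairs with the triangulation.
    GridTools::collect_periodic_faces(triangulation,
                                      /*b_id1=*/0,
                                      /*b_id2=*/1,
                                      /*direction=*/0,
                                      periodicity_vector);
    triangulation.add_periodicity(periodicity_vector);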

Re: [deal.II] Re: Parallel distributed hp solution transfer with FE_nothing

2020-12-08 Thread Marc Fehling
Hi Kaushik, the `p::d::SolutionTransfer` class should be able to deal with `FE_Nothing` elements in your example. The tricky cases are when you're coarsening an `FE_Nothing` element together with others, as Bruno already pointed out (h-coarsening), or if you change an `FE_Nothing` element to a different

Re: [deal.II] Re: DEAL.II INSTALLATION ERROR

2020-11-26 Thread Marc Fehling
folder, i.e., /path/to/petsc/include Best, Marc On Thursday, November 26, 2020 at 7:47:50 PM UTC-7 pushkar...@gmail.com wrote: > Yes, I did follow the above instructions but I am still facing the same > issue. > > On Fri, Nov 27, 2020 at 3:53 AM Marc Fehling wrote: >

[deal.II] Re: DEAL.II INSTALLATION ERROR

2020-11-26 Thread Marc Fehling
Hi Pushkar! It appears that PETSc has been found during the configuration of deal.II with cmake, but the header files of the PETSc libraries cannot be found during compilation. Did you follow all instructions on how to interface deal.II to PETSc in this particular guide
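If it helps, the configure call described in that guide looks roughly like the following; the paths and the architecture name are placeholders for your own installation:

    cmake -DDEAL_II_WITH_PETSC=ON \
          -DPETSC_DIR=/path/to/petsc \
          -DPETSC_ARCH=your-petsc-arch \
          /path/to/dealii/source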

[deal.II] Re: Refinement on a parallel distributed triangulation with periodic BC

2020-11-16 Thread Marc Fehling
Hi Maurice! On Monday, November 16, 2020 at 8:09:14 AM UTC-7 maurice@googlemail.com wrote: > Looking at the doc of `collect_periodic_faces` (This function will collect > periodic face pairs on the coarsest mesh level of the given mesh (a > Triangulation >

[deal.II] Re: Error with boost after installing dealii with spack

2020-11-11 Thread Marc Fehling
Hi Christian! Right now I have two things in mind that you could try out: - Configure your own project with cmake from scratch, if you haven't already done so. - Build deal.II with the bundled version of boost and see if the problem persists. You can also try to build deal.II

Re: [deal.II] outer product of two vectors

2020-10-07 Thread Marc Fehling
Please have a look at this particular test, which showcases how an outer product can be achieved with deal.II! https://github.com/dealii/dealii/blob/master/tests/full_matrix/full_matrix_57.cc Hope this helps! Marc Wolfgang Bangerth wrote on Tuesday, October 6, 2020 at 18:31:47 UTC-6: > On
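The gist of that test, as a small sketch assuming `using namespace dealii` (sizes and values are arbitrary):

    #include <deal.II/lac/full_matrix.h>
    #include <deal.II/lac/vector.h>

    Vector<double> v(3), w(3);
    for (unsigned int i = 0; i < 3; ++i)
      {
        v[i] = i + 1.;          // v = (1, 2, 3)
        w[i] = 10. * (i + 1.);  // w = (10, 20, 30)
      }

    // Fill A with the outer product, A(i,j) = v(i) * w(j).
    FullMatrix<double> A(3, 3);
    A.outer_product(v, w);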

[deal.II] Re: hp fem error assigning Fourier

2020-06-08 Thread Marc Fehling
Hi Ishan! You are correct: We opted for a more versatile approach to transforming solutions into Fourier or Legendre series with deal.II 9.2. Glad you figured it out! Marc
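For readers landing on this thread: with 9.2, the series objects are typically set up along the following lines. This is a rough sketch from memory (names `fe_collection`, `dof_handler`, `solution` are placeholders), so please check the SmoothnessEstimator documentation for the exact interface:

    #include <deal.II/fe/fe_series.h>
    #include <deal.II/numerics/smoothness_estimator.h>

    // Build a Fourier series expansion matched to the hp::FECollection ...
    FESeries::Fourier<dim> fourier =
      SmoothnessEstimator::Fourier::default_fe_series(fe_collection);

    // ... and estimate the per-cell decay of the expansion coefficients.
    Vector<float> smoothness_indicators(triangulation.n_active_cells());
    SmoothnessEstimator::Fourier::coefficient_decay(fourier,
                                                    dof_handler,
                                                    solution,
                                                    smoothness_indicators);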

Re: [deal.II] Re: Deal.ii installation

2020-04-28 Thread Marc Fehling
Hi Prasad! My guess now is the following: You have PETSc and Trilinos installed on your device, which deal.II finds, but it complains that PETSc has been installed with a different MPI configuration than the one available for deal.II. -- Include

Re: [deal.II] Re: Deal.ii installation

2020-04-28 Thread Marc Fehling
Prasad! If you look closely at your CMakeError.log file, you'll find that there are multiple tests failing. This is not a bad thing: deal.II figures out your system configuration this way and enables/disables certain features. However, in your case, it seems that there is a mandatory test

Re: [deal.II] Re: Deal.ii installation

2020-04-28 Thread Marc Fehling
Hi Prasad! On Tuesday, April 28, 2020 at 10:54:54 AM UTC+2, Prasad Adhav wrote: > > I am using cmake version 3.10.2 > With version 3.10.2, it is less likely that my suggestion will fix your problem. Did cmake actually finish the configuration, as Wolfgang pointed out? Going through the

[deal.II] Re: Deal.ii installation

2020-04-28 Thread Marc Fehling
Hi Prasad! Thank you for providing the logs! It seems like this is the cause: /usr/bin/ld: cannot find -lpthreads This is again related to the `-lpthread` problem. Just out of curiosity: Which version of cmake and which OS are you using? They may have updated CMake

[deal.II] Re: "libpthreads"?

2020-04-24 Thread Marc Fehling
Hi Victor! This has been fixed upstream with pull request #9117. This issue only occurs with CMake version >= 3.16. Just compile from the master branch and you're good! Best, Marc On Wednesday, April 22, 2020 at 3:16:40 PM UTC+2, Victor Eijkhout

Re: [deal.II] Mesh refinement and the ability to transfer the data to the quadrature points of the new mesh on parallel::shared::triangulation.

2020-04-22 Thread Marc Fehling
Hi Alberto! If I understood you correctly, you transfer quadrature point data with the `SolutionTransfer` class, which is meant to transfer finite element approximations. A different class dedicated to the transfer of quadrature point data already exists: It is called

[deal.II] Re: Error in make_hanging_node_constraints() while using parallel::distributed::Triangulation with hp::DoFHandler

2020-02-20 Thread Marc Fehling
Hi Chaitanya, we turned your minimum working example into a test case for the deal.II library, #9555. Thank you for providing your code! Would you mind giving it a look, since we reduced and changed a few parts of it? Best, Marc

[deal.II] Re: Error in make_hanging_node_constraints() while using parallel::distributed::Triangulation with hp::DoFHandler

2020-02-11 Thread Marc Fehling
Hi Chaitanya, This should've been fixed in #8365, which is not included in deal.II-9.1.1. Compiling the most recent version of deal.II from the master branch should do the trick. Marc

[deal.II] Re: Error installing with Candi

2019-10-14 Thread Marc Fehling
Hi David! On Saturday, October 12, 2019 at 4:34:22 AM UTC+2, David Ryan wrote: > > I'm trying to get deal.II installed using candi on my Mac running macOS > Mojave. > > Everything seems to work up until compiling deal.II, where it tells me > that it can't find the LAPACK libraries. > I've

Re: [deal.II] installation fails with intel/19.0.5

2019-09-30 Thread Marc Fehling
On Monday, September 30, 2019 at 11:01:45 PM UTC+2, Victor Eijkhout wrote: > > > > On Sep 30, 2019, at 3:23 PM, Marc Fehling > wrote: > > Victor, have you tried disabling C++17 support? Maybe that'll do the > trick... > > > cmake option please? > >

Re: [deal.II] installation fails with intel/19.0.5

2019-09-30 Thread Marc Fehling
On Friday, September 27, 2019 at 11:24:12 PM UTC+2, Wolfgang Bangerth wrote: > > Didn't we recently merge a patch where ICC reported that it understands > C++17, but doesn't in fact support this attribute? Does that ring a bell > for anyone? > Intel published a list of all C++17 features

Re: [deal.II] Parallel distributed hp solution transfer

2019-08-27 Thread Marc Fehling
Hi Doug! On Tuesday, August 27, 2019 at 3:41:11 AM UTC+2, Doug wrote: > > Thank you very much for the quick fix! Looking forward to pulling this once > it goes through all the checks. > The patch has been merged. Let me know if this fix does the trick for you. We introduced a test named

Re: [deal.II] Parallel distributed hp solution transfer

2019-08-25 Thread Marc Fehling
Hi Doug! Hi Wolfgang! On Sunday, August 25, 2019 at 3:25:06 AM UTC+2, Wolfgang Bangerth wrote: > > On 8/23/19 6:32 PM, Marc Fehling wrote: > > > > Your scenario indeed revealed a bug: Currently, we set and send > > `active_fe_indices` based on the refinement fl

Re: [deal.II] Parallel distributed hp solution transfer

2019-08-23 Thread Marc Fehling
Hi Doug! Your scenario indeed revealed a bug: Currently, we set and send `active_fe_indices` based on the refinement flags on the Triangulation object. However, p4est has the last word on deciding which cells will be refined -- and in your specific scenario p4est makes use of it. I came up

Re: [deal.II] Parallel distributed hp solution transfer

2019-08-21 Thread Marc Fehling
Hi Doug! On Wednesday, August 21, 2019 at 4:00:49 AM UTC+2, Doug wrote: > > 8134: void dealii::parallel::distributed::SolutionTransfer<dim, VectorType, DoFHandlerType>::unpack_callback(const typename > dealii::parallel::distributed::Triangulation<dim, DoFHandlerType::space_dimension>::cell_iterator&, typename >

Re: [deal.II] Parallel distributed hp solution transfer

2019-08-18 Thread Marc Fehling
Hi Doug, when dealing with distributed meshes, ownership of cells changes, and we may not know which finite element lives on cells that the process has recently been assigned. Thus, we need to transfer each cell's `active_fe_index`, which we do automatically during coarsening and refinement.
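As a reminder of the overall workflow, here is a condensed sketch for the hp case on a parallel::distributed::Triangulation; `using namespace dealii` is assumed, and the vector type and variable names (`solution`, `fe_collection`, `locally_owned_dofs`, `mpi_communicator`) are placeholders:

    #include <deal.II/distributed/solution_transfer.h>
    #include <deal.II/lac/trilinos_vector.h>

    parallel::distributed::SolutionTransfer<dim, TrilinosWrappers::MPI::Vector>
      solution_transfer(dof_handler);

    // 'solution' has to be a fully ghosted vector at this point.
    solution_transfer.prepare_for_coarsening_and_refinement(solution);

    triangulation.execute_coarsening_and_refinement();

    // The active_fe_indices travel with the cells, so we can redistribute dofs ...
    dof_handler.distribute_dofs(fe_collection);

    // ... and interpolate into a non-ghosted vector of the new size.
    TrilinosWrappers::MPI::Vector interpolated(locally_owned_dofs, mpi_communicator);
    solution_transfer.interpolate(interpolated);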

Re: [deal.II] Re: Heat equation (step-26): Negative values with small time step

2018-01-06 Thread Marc Fehling
I extended the step-26 documentation and provided a pull request on GitHub. > You can create an iterator to the elements of a matrix row. Would that do what > you need? Yes, that's exactly what I was looking for. I just somehow missed the
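For anyone looking for the row iterator mentioned above, a small sketch assuming `using namespace dealii`, a SparseMatrix<double> named `matrix`, and a row index `row`:

    #include <deal.II/lac/sparse_matrix.h>
    #include <iostream>

    // Walk over all entries stored in row 'row' of the sparse matrix.
    for (auto entry = matrix.begin(row); entry != matrix.end(row); ++entry)
      std::cout << "column " << entry->column()
                << ", value " << entry->value() << std::endl;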

Re: [deal.II] Re: Heat equation (step-26): Negative values with small time step

2018-01-03 Thread Marc Fehling
Hi Wolfgang, On Monday, December 18, 2017 at 1:45:15 AM UTC+1, Wolfgang Bangerth wrote: > > I think your observation of negative values is an interesting one (and > surprising one, for many). Would you be interested in writing a couple of > paragraphs about time step choice for the introduction

[deal.II] Re: Heat equation (step-26): Negative values with small time step

2017-12-11 Thread Marc Fehling
Hi Bruno, I have only heard about applying flux limiters to advection/convection problems, but not to diffusion-related ones. This conforms with what I recently found in the literature, but I may have skipped something crucial. The equation of interest is the heat equation:

[deal.II] Heat equation (step-26): Negative values with small time step

2017-12-07 Thread Marc Fehling
Dear deal.II community! I stumbled upon some interesting behavior of the heat equation from step-26. If I reduce the time step to a smaller value, let's say to 1e-6, I observe negative values for the solution near the sources (where gradients are large), which I would not expect. I guess it is
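For context, the fully discrete system that step-26 solves in each time step reads as follows (M mass matrix, A stiffness matrix, k_n time step, theta the scheme parameter), copied from the tutorial's introduction to the best of my recollection:

    \[
      \left(M + k_n \theta A\right) U^n
        = \left[M - k_n (1-\theta) A\right] U^{n-1}
        + k_n \left[\theta F^n + (1-\theta) F^{n-1}\right]
    \]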

[deal.II] Convergence rate of solution scheme for incompressible Navier-Stokes equations

2016-10-07 Thread Marc Fehling
are changing over time. Why is the convergence rate in space inconsistent? Am I missing some crucial point? Best regards, Marc Fehling