Hi Alexander, try calling hanging_node_constraints.condense(c_sparsity); on the compressed pattern before copy_from() instead.
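In code, that means reordering your three calls so the condensing happens while the pattern is still in its compressed (modifiable) form. A sketch using the variable names from your snippet, assuming hanging_node_constraints has already been filled and closed:

```cpp
// Build the pattern in a memory-friendly compressed format first.
CompressedSimpleSparsityPattern c_sparsity(dof_handler.n_dofs());
DoFTools::make_sparsity_pattern(dof_handler, c_sparsity);

// Condense while the pattern is still modifiable. condense() refuses to
// work on an already-compressed SparsityPattern, which is exactly the
// ExcMatrixIsClosed() assertion you ran into.
hanging_node_constraints.condense(c_sparsity);

// Only now copy into the static SparsityPattern. copy_from() leaves the
// result compressed, so no separate compress() call is needed afterwards.
sparsity_pattern.copy_from(c_sparsity);
```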
On Wed, May 9, 2012 at 11:29 AM, Alexander Grayver <[email protected]> wrote:
> Hello,
>
> I'm trying to do further work with CompressedSimpleSparsityPattern and I
> don't understand how to avoid this error:
>
>   An error occurred in line <847> of file
>   </usr/local/deal.II-dev/source/lac/constraint_matrix.cc> in function
>     void dealii::ConstraintMatrix::condense(dealii::SparsityPattern&) const
>   The violated condition was:
>     sparsity.is_compressed() == false
>   The name and call sequence of the exception was:
>     ExcMatrixIsClosed()
>
> This is because I need to take hanging nodes into account and want to use
> CompressedSimpleSparsityPattern at the same time:
>
>   CompressedSparsityPattern c_sparsity(dof_handler.n_dofs());
>   DoFTools::make_sparsity_pattern (dof_handler, c_sparsity);
>   sparsity_pattern.copy_from(c_sparsity);
>
>   hanging_node_constraints.condense (sparsity_pattern);
>   sparsity_pattern.compress();
>
> What would be the way to do that?
>
> Thanks.
>
> On 08.05.2012 12:24, Alexander Grayver wrote:
>> Timo, Markus,
>>
>> CompressedSimpleSparsityPattern didn't help.
>> However, I see the point now. I had just thought that the limit would be
>> reached a bit later...
>>
>> Thank you.
>>
>> On 07.05.2012 14:50, Timo Heister wrote:
>>>> You are demanding too much memory here. With your triangulation and
>>>> this FESystem you create more than 12 million degrees of freedom.
>>>> Especially with the Nédélec elements, which couple quite heavily, this
>>>> requires a bit more than 64 GB of memory.
>>>
>>> That is not quite correct. You have roughly 2 million cells, which take
>>> about 700 MB in the Triangulation, and 12.8 million DoFs, for which the
>>> DoFHandler uses around 300 MB.
>>>
>>> The problem lies in the way you create your SparsityPattern.
>>> dof_handler.max_couplings_between_dofs() is a very pessimistic estimate
>>> of the number of entries per row, especially in 3d. In your case it is
>>> 1764. With this you create a SparsityPattern with 12.8 million * 1764 =
>>> 22 billion entries (which is far too much memory!).
>>> The first step would be to replace the SparsityPattern with a
>>> CompressedSimpleSparsityPattern and check whether that reduces the
>>> number of entries enough (see step-3).
>
> --
> Regards,
> Alexander

--
Timo Heister
http://www.math.tamu.edu/~heister/

_______________________________________________
dealii mailing list
http://poisson.dealii.org/mailman/listinfo/dealii
