Does this code work on one core (mpirun -np 1)? Are there any extra errors 
when running in debug mode, or are you already running it in debug mode?

On Tuesday, June 7, 2016 at 7:02:42 PM UTC+2, Ehsan Esfahani wrote:
>
> Thanks for your response. I'm not sure that is the cause: previously, 
> without a distributed triangulation, I modified step-25 to solve my 
> problem (the Ginzburg-Landau equation), and that code does not use those 
> lines and runs without errors. Also, I don't need to impose boundary 
> conditions because of the GL equation. 
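>
> (For illustration, not from the attached code: without boundary values, 
> the constraint setup from step-40 would reduce to just the hanging-node 
> part, roughly
>
> constraints.clear();
> constraints.reinit(locally_relevant_dofs);
> // no interpolate_boundary_values() needed for the GL equation;
> // hanging-node constraints only matter on adaptively refined meshes
> DoFTools::make_hanging_node_constraints(dof_handler, constraints);
> constraints.close();
>
> and on a uniformly refined mesh this object simply stays empty; it can 
> still be passed to distribute_local_to_global() and to the sparsity 
> pattern.)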
>
> On Tuesday, June 7, 2016 at 11:26:02 AM UTC-5, Jean-Paul Pelteret wrote:
>>
>> The one thing that I've noticed is that you're not using a constraint 
>> matrix, as is done in step-40:
>> constraints.clear();
>> constraints.reinit(locally_relevant_dofs);
>> DoFTools::make_hanging_node_constraints(dof_handler, constraints);
>> VectorTools::interpolate_boundary_values(dof_handler,
>>                                          0,
>>                                          ZeroFunction<dim>(),
>>                                          constraints);
>> constraints.close();
>>
>> constraints.distribute_local_to_global(cell_matrix,
>>                                        cell_rhs,
>>                                        local_dof_indices,
>>                                        system_matrix,
>>                                        system_rhs);
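>>
>> In step-40 that distribute_local_to_global() call sits inside an assembly 
>> loop that visits only locally owned cells and compresses the matrix and 
>> right-hand side afterwards. A rough sketch of that pattern (assuming the 
>> usual cell_matrix, cell_rhs and local_dof_indices objects from step-25):
>>
>> typename DoFHandler<dim>::active_cell_iterator
>>   cell = dof_handler.begin_active(),
>>   endc = dof_handler.end();
>> for (; cell != endc; ++cell)
>>   if (cell->is_locally_owned())
>>     {
>>       // ...assemble cell_matrix and cell_rhs exactly as in step-25...
>>       cell->get_dof_indices(local_dof_indices);
>>       constraints.distribute_local_to_global(cell_matrix,
>>                                              cell_rhs,
>>                                              local_dof_indices,
>>                                              system_matrix,
>>                                              system_rhs);
>>     }
>> system_matrix.compress(VectorOperation::add);
>> system_rhs.compress(VectorOperation::add);
>>
>> Assembling on cells that are not locally owned is another common way to 
>> hit the "new nonzero" error when porting a serial tutorial program.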
>> Perhaps this has something to do with it? I'm not sufficiently familiar 
>> with distributed triangulations to say for sure that this is the problem. 
>> It looks like you may be trying to access an entry in the sparsity pattern 
>> that doesn't exist.
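>>
>> If the sparsity pattern does not contain every entry the assembly writes 
>> (for instance because it was built without the constraints, or the matrix 
>> was reinitialized before the pattern was distributed), PETSc reports 
>> exactly this "inserting a new nonzero" message. For comparison, a minimal 
>> sketch of step-40's setup in deal.II 8.4, assuming members named 
>> locally_owned_dofs, locally_relevant_dofs and mpi_communicator:
>>
>> DynamicSparsityPattern dsp(locally_relevant_dofs);
>> // build the pattern taking the constraints into account, as in step-40
>> DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);
>> SparsityTools::distribute_sparsity_pattern(
>>   dsp,
>>   dof_handler.n_locally_owned_dofs_per_processor(),
>>   mpi_communicator,
>>   locally_relevant_dofs);
>> system_matrix.reinit(locally_owned_dofs,
>>                      locally_owned_dofs,
>>                      dsp,
>>                      mpi_communicator);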
>>
>>
>> On Tuesday, June 7, 2016 at 6:16:40 PM UTC+2, Jean-Paul Pelteret wrote:
>>>
>>> For those who may be able to offer some advice, this is the full error 
>>> message extracted from the discussion here 
>>> <https://github.com/dealii/dealii/issues/2670>:
>>>
>>> Time step #1; advancing to t = 0.2.
>>>
>>>
>>>
>>> [0]PETSC ERROR: --------------------- Error Message 
>>> --------------------------------------------------------------
>>>
>>> [0]PETSC ERROR: Argument out of range
>>>
>>> [0]PETSC ERROR: Inserting a new nonzero at global row/column (0, 0) into 
>>> matrix
>>>
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
>>> for trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015 
>>>
>>> [0]PETSC ERROR: 
>>> /home/ehsan/apps/candi/deal.II-toolchain/deal.II-v8.4.0/examples/step-25
>>>  (mine)/step-25 on a arch-linux2-c-opt named 
>>> levitasgrad01.me.iastate.edu by ehsan Mon Jun  6 13:06:02 2016
>>>
>>> [0]PETSC ERROR: Configure options 
>>> --prefix=/home/ehsan/apps/candi/deal.II-toolchain/petsc-3.6.3 
>>> --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 
>>> --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90
>>>
>>> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in 
>>> /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/impls/aij/mpi/mpiaij.c
>>>
>>> [0]PETSC ERROR: #2 MatSetValues() line 1173 in 
>>> /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/interface/matrix.c
>>>
>>>
>>> +---------------------------------------------+------------+------------+
>>> | Total wallclock time elapsed since start    |     0.732s |            |
>>> |                                             |            |            |
>>> | Section                         | no. calls |  wall time | % of total |
>>> +---------------------------------+-----------+------------+------------+
>>> | RHS                             |         1 |     0.122s |        17% |
>>> | assembly                        |         1 |     0.113s |        15% |
>>> | setup_GridGen                   |         1 |     0.353s |        48% |
>>> +---------------------------------+-----------+------------+------------+
>>>
>>>
>>> ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping 
>>> MPI_Finalize() to avoid a deadlock.
>>>
>>>
>>> ------------------------------
>>>
>>>
>>> Exception on processing: 
>>>
>>>
>>> ------------------------------
>>>
>>>
>>> An error occurred in line <1424> of file  in function
>>>
>>>     void 
>>> dealii::PETScWrappers::MatrixBase::add(dealii::PETScWrappers::MatrixBase
>>> ::size_type,
>>>  dealii::PETScWrappers::MatrixBase::size_type, const size_type*, const 
>>> PetscScalar*, bool, bool)
>>>
>>> The violated condition was: 
>>>
>>>     ierr == 0
>>>
>>> The name and call sequence of the exception was:
>>>
>>>     ExcPETScError(ierr)
>>>
>>> Additional Information: 
>>>
>>>
>>> An error with error number 63 occurred while calling a PETSc function
>>>
>>> Aborting!
>>>
>>> ----------------------------------------------------
>>>
>>>
>>>
>>>
>>> On Monday, June 6, 2016 at 4:14:12 PM UTC+2, Ehsan Esfahani wrote:
>>>>
>>>> Dear deal.II Users,
>>>>
>>>> Greetings. I am trying to change step-25 for MPI based on the method 
>>>> described in step-40. Unfortunately, I got stuck with an error :( 
>>>>
>>>>>
>>>>> An error occurred in line <1424> of file
>>>>> </home/ehsan/apps/candi/deal.II-toolchain/deal.II-v8.4.0/include/deal.II/lac/petsc_matrix_base.h>
>>>>> in function
>>>>>     void dealii::PETScWrappers::MatrixBase::add(
>>>>>         dealii::PETScWrappers::MatrixBase::size_type,
>>>>>         dealii::PETScWrappers::MatrixBase::size_type,
>>>>>         const size_type*, const PetscScalar*, bool, bool)
>>>>> The violated condition was:
>>>>>     ierr == 0
>>>>> The name and call sequence of the exception was:
>>>>>     ExcPETScError(ierr)
>>>>> Additional Information:
>>>>> An error with error number 63 occurred while calling a PETSc function
>>>>
>>>> *I've attached the code* in case you would like to see a full description 
>>>> of the error. I cannot understand why I'm getting it, since I'm just 
>>>> trying to adapt step-25 along the lines of step-40. I thought it would be 
>>>> easy, but it seems I was mistaken. Do you have any suggestions about this 
>>>> error? As far as I can tell, I am inserting a new nonzero entry into a 
>>>> sparse matrix, which violates the sparsity pattern, but I cannot find 
>>>> where this happens because the code aborts and I cannot debug it properly 
>>>> in Eclipse Mars II.
>>>> Also, is it really possible to adapt step-25 for MPI by following 
>>>> step-40, or am I completely on the wrong road?
>>>> Thanks for your help in advance.
>>>>
>>>> Best Regards,
>>>> Ehsan
>>>>
>>>
