[deal.II] Re: changing step-25 based on step-40

2016-06-07 Thread Ehsan Esfahani
I have run this code with Eclipse in debug mode. It gets terminated, so I 
cannot track the error; the error printed on the console is the same as the 
one I have mentioned here.

On Tuesday, June 7, 2016 at 2:47:34 PM UTC-5, Jean-Paul Pelteret wrote:
>
> Does this code work on one core (mpirun -np 1)? Are there any extra errors 
> when running in debug mode, or are you already running it in debug mode?
>
> On Tuesday, June 7, 2016 at 7:02:42 PM UTC+2, Ehsan Esfahani wrote:
>>
>> Thanks for your response. I'm not sure this is related because, previously, 
>> without a distributed triangulation, I modified step-25 to solve my problem 
>> (the Ginzburg-Landau equation), and in that code I didn't use those lines 
>> and it runs without errors. Also, I don't need to impose boundary 
>> conditions because of the GL equation. 
>>
>> On Tuesday, June 7, 2016 at 11:26:02 AM UTC-5, Jean-Paul Pelteret wrote:
>>>
>>> The one thing that I've noticed is that you're not using a constraint 
>>> matrix, as is done in step-40:
>>> constraints.clear();
>>> constraints.reinit(locally_relevant_dofs);
>>> DoFTools::make_hanging_node_constraints(dof_handler, constraints);
>>> VectorTools::interpolate_boundary_values(dof_handler,
>>>                                          0,
>>>                                          ZeroFunction<dim>(),
>>>                                          constraints);
>>> constraints.close();
>>> constraints.distribute_local_to_global(cell_matrix,
>>>                                        cell_rhs,
>>>                                        local_dof_indices,
>>>                                        system_matrix,
>>>                                        system_rhs);
>>> Perhaps this has something to do with it? I'm not sufficiently familiar 
>>> with distributed triangulations to say for sure that this is the problem. 
>>> It looks like you may be trying to access an entry in the sparsity pattern 
>>> that doesn't exist.
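
For reference, this is roughly how step-40 builds the distributed sparsity pattern and the PETSc matrix so that distribute_local_to_global never writes an entry the pattern does not contain. It is only a sketch against deal.II 8.4, and it assumes the step-40 member names (dof_handler, constraints, locally_owned_dofs, locally_relevant_dofs, mpi_communicator, system_matrix):

#include <deal.II/dofs/dof_tools.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>
#include <deal.II/lac/sparsity_tools.h>

// The constraints must be closed before the sparsity pattern is built, so
// that the couplings introduced by eliminating constrained DoFs are included.
DynamicSparsityPattern dsp(locally_relevant_dofs);
DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints,
                                /*keep_constrained_dofs=*/false);
SparsityTools::distribute_sparsity_pattern(dsp,
                                           dof_handler.n_locally_owned_dofs_per_processor(),
                                           mpi_communicator,
                                           locally_relevant_dofs);

// Reinitialize the PETSc matrix with exactly this pattern; any later write
// outside of it triggers the "Inserting a new nonzero" error seen below.
system_matrix.reinit(locally_owned_dofs,
                     locally_owned_dofs,
                     dsp,
                     mpi_communicator);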
>>>
>>>
>>> On Tuesday, June 7, 2016 at 6:16:40 PM UTC+2, Jean-Paul Pelteret wrote:

 For those who may be able to offer some advice, this is the full error 
 message extracted from the discussion here:

 Time step #1; advancing to t = 0.2.



 [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------

 [0]PETSC ERROR: Argument out of range

 [0]PETSC ERROR: Inserting a new nonzero at global row/column (0, 0) into matrix

 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.

 [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015 

 [0]PETSC ERROR: /home/ehsan/apps/candi/deal.II-toolchain/deal.II-v8.4.0/examples/step-25 (mine)/step-25 on a arch-linux2-c-opt named levitasgrad01.me.iastate.edu by ehsan Mon Jun  6 13:06:02 2016

 [0]PETSC ERROR: Configure options 
 --prefix=/home/ehsan/apps/candi/deal.II-toolchain/petsc-3.6.3 
 --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 
 --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90

 [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/impls/aij/mpi/mpiaij.c

 [0]PETSC ERROR: #2 MatSetValues() line 1173 in /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/interface/matrix.c



 +---------------------------------------------+------------+------------+
 | Total wallclock time elapsed since start    |     0.732s |            |
 |                                             |            |            |
 | Section                         | no. calls |  wall time | % of total |
 +---------------------------------+-----------+------------+------------+
 | RHS 

[deal.II] Re: changing step-25 based on step-40

2016-06-07 Thread Jean-Paul Pelteret
Does this code work on one core (mpirun -np 1)? Are there any extra errors 
when running in debug mode, or are you already running it in debug mode?

On Tuesday, June 7, 2016 at 7:02:42 PM UTC+2, Ehsan Esfahani wrote:
>
> Thanks for your response. I'm not sure this is related because, previously, 
> without a distributed triangulation, I modified step-25 to solve my problem 
> (the Ginzburg-Landau equation), and in that code I didn't use those lines 
> and it runs without errors. Also, I don't need to impose boundary 
> conditions because of the GL equation. 
>
> On Tuesday, June 7, 2016 at 11:26:02 AM UTC-5, Jean-Paul Pelteret wrote:
>>
>> The one thing that I've noticed is that you're not using a constraint 
>> matrix, as is done in step-40:
>> constraints.clear();
>> constraints.reinit(locally_relevant_dofs);
>> DoFTools::make_hanging_node_constraints(dof_handler, constraints);
>> VectorTools::interpolate_boundary_values(dof_handler,
>>                                          0,
>>                                          ZeroFunction<dim>(),
>>                                          constraints);
>> constraints.close();
>> constraints.distribute_local_to_global(cell_matrix,
>>                                        cell_rhs,
>>                                        local_dof_indices,
>>                                        system_matrix,
>>                                        system_rhs);
>> Perhaps this has something to do with it? I'm not sufficiently familiar 
>> with distributed triangulations to say for sure that this is the problem. 
>> It looks like you may be trying to access an entry in the sparsity pattern 
>> that doesn't exist.
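
For completeness, this is the shape of the step-40 assembly loop that goes with the constraint-based setup; again just a sketch, assuming the usual step-40 member names (fe_values, cell_matrix, cell_rhs, local_dof_indices, system_matrix, system_rhs):

// Assemble only on locally owned cells and let the ConstraintMatrix copy
// the local contributions into the global PETSc matrix and right-hand side.
typename DoFHandler<dim>::active_cell_iterator
  cell = dof_handler.begin_active(),
  endc = dof_handler.end();
for (; cell != endc; ++cell)
  if (cell->is_locally_owned())
    {
      cell_matrix = 0;
      cell_rhs    = 0;
      fe_values.reinit(cell);
      // ... fill cell_matrix and cell_rhs exactly as in your current code ...
      cell->get_dof_indices(local_dof_indices);
      constraints.distribute_local_to_global(cell_matrix,
                                             cell_rhs,
                                             local_dof_indices,
                                             system_matrix,
                                             system_rhs);
    }

// Exchange the off-processor contributions; without compress() the PETSc
// matrix and vector are left in an inconsistent state after assembly.
system_matrix.compress(VectorOperation::add);
system_rhs.compress(VectorOperation::add);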
>>
>>
>> On Tuesday, June 7, 2016 at 6:16:40 PM UTC+2, Jean-Paul Pelteret wrote:
>>>
>>> For those who may be able to offer some advice, this is the full error 
>>> message extracted from the discussion here:
>>>
>>> Time step #1; advancing to t = 0.2.
>>>
>>>
>>>
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------
>>>
>>> [0]PETSC ERROR: Argument out of range
>>>
>>> [0]PETSC ERROR: Inserting a new nonzero at global row/column (0, 0) into matrix
>>>
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>
>>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015 
>>>
>>> [0]PETSC ERROR: /home/ehsan/apps/candi/deal.II-toolchain/deal.II-v8.4.0/examples/step-25 (mine)/step-25 on a arch-linux2-c-opt named levitasgrad01.me.iastate.edu by ehsan Mon Jun  6 13:06:02 2016
>>>
>>> [0]PETSC ERROR: Configure options 
>>> --prefix=/home/ehsan/apps/candi/deal.II-toolchain/petsc-3.6.3 
>>> --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 
>>> --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90
>>>
>>> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/impls/aij/mpi/mpiaij.c
>>>
>>> [0]PETSC ERROR: #2 MatSetValues() line 1173 in /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/interface/matrix.c
>>>
>>>
>>> +---------------------------------------------+------------+------------+
>>> | Total wallclock time elapsed since start    |     0.732s |            |
>>> |                                             |            |            |
>>> | Section                         | no. calls |  wall time | % of total |
>>> +---------------------------------+-----------+------------+------------+
>>> | RHS                             |         1 |     0.122s |        17% |
>>> | assembly                        |         1 |     0.113s |        15% |
>>> | setup_GridGen                   |         1 |     0.353s |        48% |
>>> +---------------------------------+-----------+------------+------------+
>>>
>>>
>>> ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping 

[deal.II] Re: changing step-25 based on step-40

2016-06-07 Thread Ehsan Esfahani
Thanks for your response. I'm not sure this is related because, previously, 
without a distributed triangulation, I modified step-25 to solve my problem 
(the Ginzburg-Landau equation), and in that code I didn't use those lines 
and it runs without errors. Also, I don't need to impose boundary 
conditions because of the GL equation. 

On Tuesday, June 7, 2016 at 11:26:02 AM UTC-5, Jean-Paul Pelteret wrote:
>
> The one thing that I've noticed is that you're not using a constraint 
> matrix, as is done in step-40:
> constraints.clear();
> constraints.reinit(locally_relevant_dofs);
> DoFTools::make_hanging_node_constraints(dof_handler, constraints);
> VectorTools::interpolate_boundary_values(dof_handler,
>                                          0,
>                                          ZeroFunction<dim>(),
>                                          constraints);
> constraints.close();
> constraints.distribute_local_to_global(cell_matrix,
>                                        cell_rhs,
>                                        local_dof_indices,
>                                        system_matrix,
>                                        system_rhs);
> Perhaps this has something to do with it? I'm not sufficiently familiar 
> with distributed triangulations to say for sure that this is the problem. 
> It looks like you may be trying to access an entry in the sparsity pattern 
> that doesn't exist.
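
The "Inserting a new nonzero at global row/column (0, 0)" message in the log below is PETSc's way of saying that a value was written into an entry that the sparsity pattern handed to reinit() did not contain. A purely hypothetical call such as

// i, j and value are placeholders: if the pair (i, j) was not part of the
// sparsity pattern used to initialize system_matrix, PETSc aborts with this
// "Argument out of range" / "Inserting a new nonzero" error.
system_matrix.add(i, j, value);

fails in exactly this way, which is why the fix is to make sure every entry touched during assembly is already present in the pattern.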
>
>
> On Tuesday, June 7, 2016 at 6:16:40 PM UTC+2, Jean-Paul Pelteret wrote:
>>
>> For those who may be able to offer some advice, this is the full error 
>> message extracted from the discussion here:
>>
>> Time step #1; advancing to t = 0.2.
>>
>>
>>
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------
>>
>> [0]PETSC ERROR: Argument out of range
>>
>> [0]PETSC ERROR: Inserting a new nonzero at global row/column (0, 0) into matrix
>>
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>
>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015 
>>
>> [0]PETSC ERROR: /home/ehsan/apps/candi/deal.II-toolchain/deal.II-v8.4.0/examples/step-25 (mine)/step-25 on a arch-linux2-c-opt named levitasgrad01.me.iastate.edu by ehsan Mon Jun  6 13:06:02 2016
>>
>> [0]PETSC ERROR: Configure options 
>> --prefix=/home/ehsan/apps/candi/deal.II-toolchain/petsc-3.6.3 
>> --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 
>> --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90
>>
>> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/impls/aij/mpi/mpiaij.c
>>
>> [0]PETSC ERROR: #2 MatSetValues() line 1173 in /home/ehsan/apps/candi/deal.II-toolchain-build/petsc-3.6.3/src/mat/interface/matrix.c
>>
>>
>> +---------------------------------------------+------------+------------+
>> | Total wallclock time elapsed since start    |     0.732s |            |
>> |                                             |            |            |
>> | Section                         | no. calls |  wall time | % of total |
>> +---------------------------------+-----------+------------+------------+
>> | RHS                             |         1 |     0.122s |        17% |
>> | assembly                        |         1 |     0.113s |        15% |
>> | setup_GridGen                   |         1 |     0.353s |        48% |
>> +---------------------------------+-----------+------------+------------+
>>
>>
>> ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping 
>> MPI_Finalize() to avoid a deadlock.
>>
>>
>> --
>>
>>
>> Exception on processing: 
>>
>>
>> --
>>
>>
>> An error occurred in line <1424> of file  in function
>>
>> void 
>> dealii::PETScWrappers::MatrixBase::add(dealii::PETScWrappers::MatrixBase
>> ::size_type,
>>