Hi,

I think you have to apply the boundary values locally while you are
assembling your matrices:

    MatrixTools::local_apply_boundary_values(boundary_values,
                                             local_dof_indices,
                                             cell_matrix,
                                             cell_rhs,
                                             false);

Then distribute the local contributions into the global objects through:

    hanging_node_constraints.distribute_local_to_global(cell_matrix,
                                                        local_dof_indices,
                                                        system_matrix);
    hanging_node_constraints.distribute_local_to_global(cell_rhs,
                                                        local_dof_indices,
                                                        system_rhs);

Where:

    PETScWrappers::MPI::BlockSparseMatrix system_matrix;
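
To show where the two calls go, here is a rough sketch of the assembly
loop. Names like fe_values, dof_handler, and boundary_values are
placeholders for objects from your own program, and the exact compress()
syntax depends on your deal.II version -- treat this as an illustration,
not a drop-in implementation:

    // Sketch of a parallel assembly loop (assumes the usual deal.II
    // setup: FEValues, DoFHandler, ConstraintMatrix, etc.).
    for (const auto &cell : dof_handler.active_cell_iterators())
      if (cell->is_locally_owned())
        {
          fe_values.reinit(cell);
          cell_matrix = 0;
          cell_rhs    = 0;

          // ... fill cell_matrix and cell_rhs as usual ...

          cell->get_dof_indices(local_dof_indices);

          // Eliminate boundary DoFs at the local level, *before* the
          // contributions go into the distributed global matrix:
          MatrixTools::local_apply_boundary_values(boundary_values,
                                                   local_dof_indices,
                                                   cell_matrix,
                                                   cell_rhs,
                                                   /*eliminate_columns=*/false);

          // Transfer to the global objects, resolving hanging-node
          // constraints on the way:
          hanging_node_constraints.distribute_local_to_global(cell_matrix,
                                                              local_dof_indices,
                                                              system_matrix);
          hanging_node_constraints.distribute_local_to_global(cell_rhs,
                                                              local_dof_indices,
                                                              system_rhs);
        }

    // Exchange off-processor entries (recent versions take
    // VectorOperation::add as an argument):
    system_matrix.compress(VectorOperation::add);
    system_rhs.compress(VectorOperation::add);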

Hope it works.

Best
Isa


[EMAIL PROTECTED] wrote:
> Dear All,
>
> Thanks to some good help, I have managed to initialize the
> PETScWrappers::MPI::BlockSparseMatrix type. However, it seems that at
> the moment deal.II does not support this type everywhere. I now need to
> use the MatrixTools::apply_boundary_values function for that matrix
> type, but it does not work. Has anyone gotten through this problem?
> Please give me some advice.
>
> Sungho Yoon
> Leeds
> UK

