Hello Vas,
Thanks for the answer! It worked perfectly in serial; however, when I
run it in parallel it gives me an error.
I wrote this function for one system with just one variable:
void Solution(const EquationSystems& es, const MeshBase& mesh, string s)
{
  std::vector<Number> soln;
  es.get_system("System").solution->localize_to_one(soln);
  ofstream myfile;
  myfile.open(s);
  for (unsigned int i = 0; i < mesh.n_nodes(); i++)
    myfile << scientific << " " << soln[i] << endl;
  myfile.close();
}
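One thing I noticed while writing this (just a guess on my side, not
the libMesh API itself): `localize_to_one` gathers the global solution
onto processor 0 only, so on every other rank `soln` would stay empty
and `soln[i]` would read past the end of an empty vector, which could
explain the SEGV below. A minimal standard-C++ sketch of the guard I
have in mind, with `write_solution` and `rank` as hypothetical
stand-ins (no libMesh or MPI involved):

```cpp
#include <fstream>
#include <string>
#include <vector>

// Stand-in for the pattern after localize_to_one(): only rank 0 holds
// the gathered global solution; every other rank has an empty vector.
void write_solution(const std::vector<double>& soln,
                    int rank,
                    const std::string& filename)
{
  // Guard: only the rank that owns the localized data writes the file.
  // Without this check, soln[i] on rank != 0 indexes an empty vector
  // (undefined behavior, typically a segfault).
  if (rank != 0 || soln.empty())
    return;

  std::ofstream myfile(filename);
  for (std::size_t i = 0; i < soln.size(); ++i)
    myfile << std::scientific << " " << soln[i] << "\n";
}
```

If that is indeed the cause, writing only from processor 0 (or
localizing to every processor instead) should avoid the crash.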
When running in parallel it gives me lots of errors, even though the
code is simple: it just sets the initial condition and prints the
solution. Does this work for you in parallel? Maybe it is something in
my PETSc installation.
Thanks,
Ernesto
[1]PETSC ERROR:
------------------------------------------------------------------------
[1]PETSC ERROR: [2]PETSC ERROR:
------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[2]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[2]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrindCaught
signal number 11 SEGV: Segmentation Violation, probably memory access
out of range
[1]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[1]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[1]PETSC
ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to
find memory corruption errors
[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
and run
[1]PETSC ERROR: to get more information on the crash.
[2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
OS X to find memory corruption errors
[2]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
and run
[2]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014
[1]PETSC ERROR: ./example-opt on a arch-linux2-c-opt named MSI by
ernesto Thu Apr 16 07:43:06 2015
[1]PETSC ERROR: Configure options --with-debugging=false
--COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3
--with-shared-libraries --download-mpich=1 --download-fblaslapack=1
--with-mumps=true --download-mumps=1 --with-metis=true
--download-metis=1 --with-parmetis=true --download-parmetis=1
--with-blacs=true --download-blacs=1 --with-scalapack=true
--download-scalapack=1 --with-superlu=true --download-superlu=1
--with-x11=0 --with-x=0
[1]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
[2]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014
[2]PETSC ERROR: ./example-opt on a arch-linux2-c-opt named MSI by
ernesto Thu Apr 16 07:43:06 2015
[2]PETSC ERROR: Configure options --with-debugging=false
--COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3
--with-shared-libraries --download-mpich=1 --download-fblaslapack=1
--with-mumps=true --download-mumps=1 --with-metis=true
--download-metis=1 --with-parmetis=true --download-parmetis=1
--with-blacs=true --download-blacs=1 --with-scalapack=true
--download-scalapack=1 --with-superlu=true --download-superlu=1
--with-x11=0 --with-x=0
[2]PETSC ERROR: #1 User provided function() line 0 in unknown file
[3]PETSC ERROR:
------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[3]PETSC ERROR: Try option -start_in_debugger or
-on_error_attach_debugger
[3]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[3]PETSC
ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to
find memory corruption errors
[3]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
and run
[3]PETSC ERROR: to get more information on the crash.
[3]PETSC ERROR: [cli_1]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 2
--------------------- Error Message
--------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.5.2, Sep, 08, 2014
[cli_2]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 2
[3]PETSC ERROR: ./example-opt on a arch-linux2-c-opt named MSI by
ernesto Thu Apr 16 07:43:06 2015
[3]PETSC ERROR: Configure options --with-debugging=false
--COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3
--with-shared-libraries --download-mpich=1 --download-fblaslapack=1
--with-mumps=true --download-mumps=1 --with-metis=true
--download-metis=1 --with-parmetis=true --download-parmetis=1
--with-blacs=true --download-blacs=1 --with-scalapack=true
--download-scalapack=1 --with-superlu=true --download-superlu=1
--with-x11=0 --with-x=0
[3]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 3
[cli_3]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 3
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 59
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
On 2015-04-16 04:41, Vasileios Vavourakis wrote:
> hi Ernesto,
>
> you could do it like this:
>
> std::vector<Number> soln;
>
> es.get_system("System").solution->localize_to_one(soln);
>
> cheers,
> Vas
>
>> On 16 April 2015 at 02:55, ernestol <[email protected]> wrote:
>
>> Hi all,
>>
>> I wonder if there is a simple way to get the global solution when
>> running the code in parallel?
>>
>> I tried:
>>
>> const System& system = es.get_system("System");
>> const unsigned short int variable_num = system.variable_number("variable");
>> const unsigned int dim = mesh.mesh_dimension();
>> std::vector<Number> sys_soln;
>> system.update_global_solution(sys_soln, 0);
>>
>> And also created this function
>>
>> void Solution(const EquationSystems& es, const MeshBase& mesh, string s)
>> {
>>   std::vector<Number> soln;
>>   std::vector<std::string> names;
>>   es.build_variable_names(names);
>>   es.build_solution_vector(soln);
>>   ofstream myfile;
>>   myfile.open(s);
>>   const unsigned int n_vars = names.size();
>>   for (unsigned int i = 0; i < mesh.n_nodes(); i++) {
>>     for (unsigned int c = 0; c < n_vars; c++)
>>       myfile << scientific << " " << soln[i*n_vars + c];
>>     myfile << endl;
>>   }
>>   myfile.close();
>> }
>>
>> However, both only work in serial. In parallel, the first gives me
>> only zeros in sys_soln, and the second gives me a PETSc error.
>>
>> Thanks!
>>
>> Best,
>>
>> Ernesto
>>
>>
>
------------------------------------------------------------------------------
BPM Camp - Free Virtual Workshop May 6th at 10am PDT/1PM EDT
Develop your own process in accordance with the BPMN 2 standard
Learn Process modeling best practices with Bonita BPM through live exercises
http://www.bonitasoft.com/be-part-of-it/events/bpm-camp-virtual-event?utm_source=Sourceforge_BPM_Camp_5_6_15&utm_medium=email&utm_campaign=VA_SF
_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users