Hi Roland,

Thanks for the clarification.
Cheers,
Benja

On Wed, Oct 24, 2018 at 5:48 PM Roland Haas <[email protected]> wrote:
> Hello Benja,
>
> you will still see both black holes. "Half this size" really means "reduce
> the radius of the sphere by half", not showing only one half of the
> domain.
>
> E.g. the original rpar file may have had a domain that goes out to a
> radius of 2000M while the new one goes only to 1000M. The black holes
> are never further apart than about 20M or so, though, so they are always
> included.
>
> Yours,
> Roland
>
> > Hi Roland,
> >
> > Many thanks for your help; we are going to try the suggested rpar.
> > Just one question: if we use "domain half its current size" as you
> > suggested, when we plot the phi.*.xy.h5 result files, will we see the
> > two black holes as in the figure
> > "https://docs.einsteintoolkit.org/et-docs/File:vt-5.png",
> > or just one?
> >
> > Cheers,
> > Benja
> >
> > On Tue, Oct 23, 2018 at 7:47 PM Roland Haas <[email protected]> wrote:
> >
> > > Hello Benja,
> > >
> > > attached please find a modified rpar file where I made three changes:
> > >
> > > * changed the boundary condition to be of Robin type instead of
> > >   Dirichlet type, which reduces reflections off the boundary (the
> > >   line NewRad::z_is_radial = "yes")
> > > * made the domain half its current size, which reduces the memory
> > >   footprint and runtime but will induce some reflections off the
> > >   boundary; this makes the simulation smaller so that it uses less
> > >   memory
> > > * ran with very low resolution (N=24 instead of N=28), which makes
> > >   the simulation run faster
> > >
> > > I gave it a test run on my workstation (12 cores, 96GB of RAM) and it
> > > runs at ~4.1 M/hour. Since the full simulation is
> > > about 1000 M, this will finish in 10 days.
> > >
> > > If this is too slow (as it may well be) then you can try to reduce
> > > the finite difference order from 8 to 6 by changing the lines (they
> > > are not consecutive in the file):
> > >
> > > Driver::ghost_size = 5
> > > Coordinates::patch_boundary_size = 5
> > > Coordinates::additional_overlap_size = 3
> > > Coordinates::outer_boundary_size = 5
> > > ML_BSSN::fdOrder = 8
> > > SummationByParts::order = 8
> > > Interpolate::interpolator_order = 5
> > > WeylScal4::fdOrder = 8
> > >
> > > to:
> > >
> > > Driver::ghost_size = 4
> > > Coordinates::patch_boundary_size = 4
> > > Coordinates::additional_overlap_size = 3
> > > Coordinates::outer_boundary_size = 4
> > > ML_BSSN::fdOrder = 6
> > > SummationByParts::order = 6
> > > Interpolate::interpolator_order = 3
> > > WeylScal4::fdOrder = 6
> > >
> > > which gives me a run speed of ~6.9 M/hr (so 7 days runtime).
> > >
> > > This is the command line to start the simulation:
> > >
> > > simfactory/bin/sim create-submit GW150914_24 --define N 24 \
> > >   --parfile ~/runs/devel/GW150914.rpar --procs 12 --walltime 24:00:00
> > >
> > > Yours,
> > > Roland
> > >
> > > > Dear friends,
> > > >
> > > > We are trying to use the Einstein Toolkit GW150914.rpar binary
> > > > black hole merger simulation as a use case to test that our
> > > > container orchestration product OpenShift can be used for HPC.
> > > > Our test environment only has 30 CPUs, so we need to execute that
> > > > simulation in a reasonable time.
> > > >
> > > > Please can you tell us how to modify GW150914.rpar in order to get
> > > > a less precise simulation executed on a 30-CPU cluster in a
> > > > reasonable time (~ a few days)?
> > > > Currently we can run the GW150914.rpar simulation using OpenMPI +
> > > > the Einstein Toolkit, but it takes too long to execute (~ weeks).
> > > >
> > > > We believe that the GW150914.rpar Einstein Toolkit simulation is a
> > > > great use case to test OpenShift for HPC, and of course we will
> > > > reference the Einstein Toolkit in our final report as a use case
> > > > for OpenShift in HPC mode.
> > > >
> > > > Many thanks in advance for your help,
> > > > Benja
> > >
> > > --
> > > My email is as private as my paper mail. I therefore support
> > > encrypting and signing email messages. Get my PGP key from
> > > http://pgp.mit.edu .

--
Benjamín Chardí Marco
Senior Red Hat Consultant
RHCE #100-107-341
[email protected]
Mobile: 0034 654 344 878
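[Editor's note] The runtime estimates quoted in the thread follow from simple arithmetic: simulated time (~1000 M) divided by the measured run speed in M/hour. A minimal check of the two figures Roland quotes (the `days_to_finish` helper is ours, not part of the Einstein Toolkit):

```python
def days_to_finish(total_M, speed_M_per_hour):
    """Wall-clock days to simulate total_M at a given speed in M/hour."""
    return total_M / speed_M_per_hour / 24.0

# 8th-order run at ~4.1 M/hour:
print(round(days_to_finish(1000, 4.1), 1))  # ~10.2 days, matching "10 days"
# 6th-order run at ~6.9 M/hour:
print(round(days_to_finish(1000, 6.9), 1))  # ~6.0 days, roughly the "7 days" quoted
```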
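[Editor's note] The 8th-order to 6th-order substitutions listed in the thread can be applied mechanically. Below is a hypothetical helper, not part of the Einstein Toolkit or simfactory; it assumes the parameter assignments appear in the file text as plain `Name = value` lines, which may not hold for every rpar file (rpar files are scripts that generate parameter files):

```python
import re

# Roland's 8th-order -> 6th-order substitutions; parameters not listed
# here (e.g. Coordinates::additional_overlap_size) stay unchanged.
ORDER_CHANGES = {
    "Driver::ghost_size": "4",
    "Coordinates::patch_boundary_size": "4",
    "Coordinates::outer_boundary_size": "4",
    "ML_BSSN::fdOrder": "6",
    "SummationByParts::order": "6",
    "Interpolate::interpolator_order": "3",
    "WeylScal4::fdOrder": "6",
}

def lower_fd_order(text):
    """Rewrite the right-hand side of each listed parameter assignment."""
    for key, value in ORDER_CHANGES.items():
        # Match lines like "Driver::ghost_size = 5", whitespace flexible.
        text = re.sub(r"(?m)^(\s*%s\s*=\s*)\S+" % re.escape(key),
                      r"\g<1>" + value, text)
    return text

print(lower_fd_order("Driver::ghost_size = 5\nML_BSSN::fdOrder = 8"))
```

Run the result past a diff against the original rpar file before submitting, since the regex rewrites only exact `Name = value` matches.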
_______________________________________________
Users mailing list
[email protected]
http://lists.einsteintoolkit.org/mailman/listinfo/users
