On Mon, Apr 14, 2008 at 8:43 AM, Ben Tay <zonexo at gmail.com> wrote:
> Hi Matthew,
>
> I think you've misunderstood what I meant. What I'm trying to say is that
> initially I had a serial code. I tried to convert it to a parallel one.
> Then I tested it and it was pretty slow. Due to some work requirements, I
> needed to go back and make some changes to my code. Since the parallel
> version was not working well, I updated and changed the serial one.
>
> Well, that was a while ago and now, due to the updates and changes, the
> serial code is different from the old converted parallel code. Some files
> were also deleted and I can't seem to get it working now. So I thought I
> might as well convert the new serial code to parallel. But I'm not very
> sure what I should do first.
>
> Maybe I should rephrase my question: if I just convert my Poisson equation
> subroutine from a serial PETSc to a parallel PETSc version, will it work?
> Should I expect a speedup? The rest of my code is still serial.

You should, of course, only expect speedup in the parallel parts.

   Matt
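In PETSc terms, converting the Poisson subroutine to parallel mostly means
creating the Mat, Vec, and KSP on PETSC_COMM_WORLD instead of PETSC_COMM_SELF
and letting each process fill only the rows it owns. Below is a minimal sketch
of that pattern; the problem size N and the 1D Laplacian stencil are
placeholders standing in for the actual 2D Poisson discretization, and the
calls follow the current PETSc API, whose signatures differ slightly from the
2008-era ones.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       i, rstart, rend, N = 100;  /* global size: placeholder value */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* PETSC_COMM_WORLD (not PETSC_COMM_SELF): the rows are spread over all ranks */
  ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr);
  ierr = MatSetFromOptions(A); CHKERRQ(ierr);
  ierr = MatSetUp(A); CHKERRQ(ierr);

  /* Each process inserts entries only for its own row range */
  ierr = MatGetOwnershipRange(A, &rstart, &rend); CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    if (i > 0)   { ierr = MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    if (i < N-1) { ierr = MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

  ierr = MatCreateVecs(A, &x, &b); CHKERRQ(ierr);
  ierr = VecSet(b, 1.0); CHKERRQ(ierr);

  /* Solver and preconditioner can be chosen at run time, e.g. -ksp_type cg -pc_type jacobi */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&b); CHKERRQ(ierr);
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run with, say, mpiexec -np 4 and the matrix rows and the solve are distributed
over 4 processes; the surrounding serial parts of the code are untouched, which
is why only the solve can show any speedup.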
> Thank you very much.
>
> Matthew Knepley wrote:
>
> > I am not sure why you would ever have two codes. I never do this. PETSc
> > is designed so that you write one code that runs in serial and in
> > parallel. The PETSc part should look identical. To test, run the code you
> > have verified in serial and output PETSc data structures (like Mat and
> > Vec) using a binary viewer. Then run in parallel with the same code, which
> > will output the same structures. Take the two files and write a small
> > verification code that loads both versions and calls MatEqual and
> > VecEqual.
> >
> >   Matt
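A minimal sketch of the verification code described above, assuming the serial
and parallel runs have each written the assembled matrix with
PetscViewerBinaryOpen/MatView to the placeholder files A_serial.dat and
A_parallel.dat (the MatLoad calling sequence shown is the current one; older
PETSc releases load objects a little differently):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            As, Ap;
  PetscViewer    vs, vp;
  PetscBool      flg;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Open the two binary dumps (file names are placeholders) */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A_serial.dat",   FILE_MODE_READ, &vs); CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A_parallel.dat", FILE_MODE_READ, &vp); CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &As); CHKERRQ(ierr);
  ierr = MatSetType(As, MATAIJ); CHKERRQ(ierr);
  ierr = MatLoad(As, vs); CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &Ap); CHKERRQ(ierr);
  ierr = MatSetType(Ap, MATAIJ); CHKERRQ(ierr);
  ierr = MatLoad(Ap, vp); CHKERRQ(ierr);

  /* flg is PETSC_TRUE only if the two matrices are equal */
  ierr = MatEqual(As, Ap, &flg); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Matrices %s\n", flg ? "match" : "DIFFER"); CHKERRQ(ierr);

  ierr = MatDestroy(&As); CHKERRQ(ierr);
  ierr = MatDestroy(&Ap); CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&vs); CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&vp); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

The same pattern with VecLoad and VecEqual covers the right-hand side and the
solution vectors.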
> > On Mon, Apr 14, 2008 at 5:49 AM, Ben Tay <zonexo at gmail.com> wrote:
> > > Thank you Matthew. Sorry to trouble you again.
> > >
> > > I tried to run it with -log_summary output and I found that there are
> > > some errors in the execution. Well, I was busy with other things and I
> > > just came back to this problem. Some of my files on the server have
> > > also been deleted. It has been a while, and I remember that it worked
> > > before, only much slower.
> > >
> > > Anyway, most of the serial code has been updated, and maybe it's easier
> > > to convert the new serial code instead of debugging the old parallel
> > > code now. I believe I can still reuse part of the old parallel code.
> > > However, I hope I can approach it better this time.
> > >
> > > So suppose I need to start converting my new serial code to parallel.
> > > There are 2 equations to be solved using PETSc, the momentum and the
> > > Poisson. I also need to parallelize other parts of my code. I wonder
> > > which route is the best:
> > >
> > > 1. Don't change the PETSc part, i.e. continue using PETSC_COMM_SELF,
> > > and modify other parts of my code to parallel, e.g. looping, updating
> > > of values, etc. Once the execution is fine and the speedup is
> > > reasonable, then modify the PETSc part - Poisson eqn first, followed by
> > > the momentum eqn.
> > >
> > > 2. Reverse the above order, i.e. modify the PETSc part - Poisson eqn
> > > first, followed by the momentum eqn. Then do the other parts of my
> > > code.
> > >
> > > I'm not sure if the above 2 methods can work or if there will be
> > > conflicts. Of course, an alternative will be:
> > >
> > > 3. Do the Poisson eqn, the momentum eqn, and the other parts of the
> > > code separately. That is, code a standalone parallel Poisson eqn and
> > > use sample values to test it. Same for the momentum eqn and the other
> > > parts of the code. When each of them is working, combine them to form
> > > the full parallel code. However, this will be much more troublesome.
> > >
> > > I hope someone can give me some recommendations.
> > >
> > > Thank you once again.
> > >
> > > Matthew Knepley wrote:
> > > >
> > > > 1) There is no way to have any idea what is going on in your code
> > > > without -log_summary output.
> > > >
> > > > 2) Looking at that output, look at the percentage taken by the solver
> > > > KSPSolve event. I suspect it is not the biggest component, because
> > > > it is very scalable.
> > > >
> > > >   Matt
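Getting that profile only requires adding the option when launching the job,
for example (the executable name here is just a placeholder, and in recent
PETSc releases the option is spelled -log_view instead of -log_summary):

   mpiexec -np 4 ./cfd -log_summary > log.4proc

The event table printed at the end of the run has a KSPSolve row showing the
time and percentage spent in the linear solves, which is the part Matt says
should scale.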
> > > > On Sun, Apr 13, 2008 at 4:12 AM, Ben Tay <zonexo at gmail.com> wrote:
> > > > > Hi,
> > > > >
> > > > > I've a serial 2D CFD code. As my grid size requirement increases,
> > > > > the simulation takes longer. Also, the memory requirement becomes a
> > > > > problem. The grid size has reached 1200x1200. Going higher is not
> > > > > possible due to the memory problem.
> > > > >
> > > > > I tried to convert my code to a parallel one, following the
> > > > > examples given. I also need to restructure parts of my code to
> > > > > enable parallel looping. I first changed the PETSc solver to be
> > > > > parallel enabled and then I restructured parts of my code. I
> > > > > proceeded as long as the answer for a simple test case was correct.
> > > > > I thought it's not really possible to do any speed testing since
> > > > > the code is not fully parallelized yet. When I had finished most of
> > > > > the conversion, I found in the actual run that it is much slower,
> > > > > although the answer is correct.
> > > > >
> > > > > So what is the remedy now? I wonder what I should do to check
> > > > > what's wrong. Must I restart everything again? Btw, my grid size is
> > > > > 1200x1200. I believe it should be suitable for a parallel run on 4
> > > > > processors? Is that so?
> > > > >
> > > > > Thank you.

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener