Hello, PETSc developers,
        I have a question regarding the performance of the PETSc TS solver, especially TSTHETA, which I am using to solve my DAE system. There are altogether 1152 equations in my IFunction(), and the Jacobian matrix in my IJacobian() is correspondingly of dimension 1152 x 1152.
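        For reference, my two callbacks follow the standard TSIFunction/TSIJacobian shapes. A stripped-down skeleton is below; the actual equation assembly is omitted (the VecZeroEntries call is only a placeholder), and the exact IJacobian signature is the one from the PETSc release I am building with, so it may differ in other versions:

  PetscErrorCode IFunction(TS ts, PetscReal t, Vec X, Vec Xdot, Vec F, void *ctx)
  {
    PetscErrorCode ierr;
    PetscFunctionBegin;
    /* the 1152 DAE residual equations F(t, X, Xdot) = 0 are evaluated here */
    ierr = VecZeroEntries(F); CHKERRQ(ierr); /* placeholder only */
    PetscFunctionReturn(0);
  }

  PetscErrorCode IJacobian(TS ts, PetscReal t, Vec X, Vec Xdot, PetscReal a,
                           Mat *J, Mat *B, MatStructure *flag, void *ctx)
  {
    PetscErrorCode ierr;
    PetscFunctionBegin;
    /* dF/dX + a * dF/dXdot is assembled into B (1152 x 1152);
       the MatSetValues calls are omitted here */
    ierr = MatAssemblyBegin(*B, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(*B, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    *flag = SAME_NONZERO_PATTERN;
    PetscFunctionReturn(0);
  }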
        The main TS code is set up as follows:
  ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr);
  ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr);
  ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr);
  ierr = TSSetIFunction(ts, NULL, (TSIFunction) IFunction, &user); CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &J); CHKERRQ(ierr); // J: Jacobian matrix
  ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 4*ngen, 4*ngen); CHKERRQ(ierr); // 4*ngen = 1152
  ierr = MatSetFromOptions(J); CHKERRQ(ierr);
  ierr = MatSetUp(J); CHKERRQ(ierr);

  ierr = TSSetIJacobian(ts, J, J, (TSIJacobian) IJacobian, &user); CHKERRQ(ierr);
  ierr = TSSetDM(ts, da); CHKERRQ(ierr);

  ierr = formInitialSolution(ts, x, &user, t_step, t_width); CHKERRQ(ierr);
  ftime = t_step[0] * t_width[0];
  ierr = TSSetDuration(ts, PETSC_DEFAULT, ftime); CHKERRQ(ierr);
  ierr = TSSetSolution(ts, x); CHKERRQ(ierr);
  ierr = TSSetInitialTimeStep(ts, 0.0, t_width[0]); CHKERRQ(ierr);

  ierr = TSSetFromOptions(ts);CHKERRQ(ierr);

  ierr = TSSolve(ts,x);CHKERRQ(ierr);
  ierr = TSGetSolveTime(ts,&ftime);CHKERRQ(ierr);
  ierr = TSGetTimeStepNumber(ts,&steps);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"%D steps, ftime %G\n",steps,ftime);CHKERRQ(ierr);

  I have recorded the solution times when different numbers of processors are used:

2 processors: 1021 seconds,
4 processors: 587.244 seconds,
8 processors: 421.565 seconds,
16 processors: 355.594 seconds,
32 processors: 322.28 seconds,
64 processors: 382.967 seconds.

It seems that 32 processors gives the best performance; going from 2 to 32 processors is only about a 1021/322.28 ≈ 3.2x speedup, and adding more processors makes it worse. Moreover, 322.28 seconds to solve a DAE system of this size is much slower than I expected.

I have the following questions based on the above results:
1.      Is this a typical DAE solution time in PETSc for a problem of this dimension?
2.      I was told that in TS, by default, the KSP solver is GMRES and the preconditioner is ILU(0). Are there alternative KSP solvers or command-line options I should use to solve the problem much faster? (Example options are sketched after this list.)
3.      Do you have any other suggestions for speeding up the DAE computation in PETSc?
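
To make question 2 concrete: as I understand it, the parallel defaults correspond roughly to the first line below, and I could switch solvers entirely from the command line. These are only examples of what I have in mind; ./mydae is a placeholder for my executable, and the direct-solver line assumes PETSc was configured with --download-superlu_dist:

  # roughly the default linear solver (GMRES with block Jacobi, ILU(0) on each block):
  mpiexec -n 32 ./mydae -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu

  # a parallel direct solve of the 1152 x 1152 Jacobian instead:
  mpiexec -n 32 ./mydae -ksp_type preonly -pc_type lu \
      -pc_factor_mat_solver_package superlu_dist

  # profiling and monitoring, to see where the time actually goes:
  mpiexec -n 32 ./mydae -log_summary -ts_monitor -snes_monitor -ksp_converged_reason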

Thanks a lot!
Shuangshuang

