[petsc-users] Adaptive mesh refinement for transient problem

2017-03-31 Thread Buesing, Henrik
Dear all, I was wondering if it is possible to integrate adaptive mesh refinement into my code (transient, cell-centered on a regular grid). I was looking at DMForest, but I am unsure how much effort is needed to include this. What would I need to do? Write some indicator functions to mark cells for refinement?
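
For context, the basic DMForest adaptation step looks roughly like the sketch below (a minimal C sketch, not from this thread: it assumes a PETSc build with p4est and the API of recent PETSc releases; MyIndicator is a placeholder for the user's error estimator):

  #include <petscdmforest.h>

  /* placeholder for a real error indicator */
  static PetscBool MyIndicator(PetscInt c) { (void)c; return PETSC_TRUE; }

  PetscErrorCode AdaptOnce(DM forest, DM *adapted)
  {
    DMLabel        adaptLabel;
    PetscInt       c, cStart, cEnd;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = DMLabelCreate(PETSC_COMM_SELF, "adapt", &adaptLabel);CHKERRQ(ierr);
    ierr = DMForestGetCellChart(forest, &cStart, &cEnd);CHKERRQ(ierr);
    for (c = cStart; c < cEnd; c++) {
      /* mark cells for refinement (or DM_ADAPT_COARSEN) from the indicator */
      if (MyIndicator(c)) { ierr = DMLabelSetValue(adaptLabel, c, DM_ADAPT_REFINE);CHKERRQ(ierr); }
    }
    ierr = DMAdaptLabel(forest, adaptLabel, adapted);CHKERRQ(ierr);
    ierr = DMLabelDestroy(&adaptLabel);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }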

[petsc-users] Newton methods that converge all the time

2017-11-07 Thread Buesing, Henrik
Dear all, I am solving a system of nonlinear, transient PDEs. I am using Newton's method in every time step to solve the nonlinear algebraic equations. Of course, Newton's method only converges if the initial guess is sufficiently close to the solution. This is often not the case, and Newton's method then fails to converge. …
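
For what it is worth, the standard first line of defense consists of PETSc's built-in globalizations, all selectable at runtime; a hedged sketch of common options (values illustrative):

  -snes_type newtonls                    # Newton with line search
  -snes_linesearch_type bt               # cubic backtracking line search
  -snes_type newtontr                    # alternatively: Newton with trust region
  -snes_monitor -snes_converged_reason   # watch what the nonlinear solve does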

Re: [petsc-users] Newton methods that converge all the time

2017-11-30 Thread Buesing, Henrik
1) There should be more to the stack frame. I assume you call MatSetValuesStencil(). Do you use PetscFunctionBegin/Return() in that function? It would add to the stack. [Buesing, Henrik] Yes, I call MatSetValuesStencil(). This is the full error. I abort the program after setting …

Re: [petsc-users] Newton methods that converge all the time

2017-11-30 Thread Buesing, Henrik
There should be an ISL2G (ISLocalToGlobalMapping) there automatically. It does not make sense to me that it's gone. [Buesing, Henrik] You are right! I was not calling DMGlobalToLocal. I was operating on the global arrays; thus, it was not there. Works now! Thank you! Henrik PS: I am using Fortran, so no PetscFunctionBegin/Return().
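
For reference, the missing ghost update looks like this in C (a minimal sketch; the Fortran calls are analogous, and da/X stand in for the poster's DMDA and global vector):

  Vec            localX;
  PetscErrorCode ierr;

  ierr = DMGetLocalVector(da, &localX);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, X, INSERT_VALUES, localX);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, X, INSERT_VALUES, localX);CHKERRQ(ierr);
  /* read neighbor (ghost) values through the local vector, e.g. via
     DMDAVecGetArrayRead(da, localX, &array), then restore and return it */
  ierr = DMRestoreLocalVector(da, &localX);CHKERRQ(ierr);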

Re: [petsc-users] Newton methods that converge all the time

2017-12-01 Thread Buesing, Henrik
> Please describe in some detail how you are handling phase change. If you have if() tests of any sort in your FormFunction() or FormJacobian(), this can kill Newton's method. If you are using "variable switching", this WILL kill Newton's method. Are you monkeying with phase …
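
One common way around such if() tests, sketched here purely for illustration (plain C, not the poster's code): replace the hard switch with a differentiable blend over a small window, so the residual stays smooth enough for Newton.

  /* Hard switch, nondifferentiable at Sc -- the kind of test that hurts Newton:
   *   kr = (S < Sc) ? 0.0 : 1.0;
   * Smoothed variant: C1 "smoothstep" over a window of width eps. */
  static double smooth_switch(double S, double Sc, double eps)
  {
    double t = (S - Sc) / eps;
    if (t <= 0.0) return 0.0;
    if (t >= 1.0) return 1.0;
    return t * t * (3.0 - 2.0 * t);
  }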

[petsc-users] SNESLineSearchSetPre/PostCheck()

2017-12-01 Thread Buesing, Henrik
Dear all, So what is the difference between the pre and post check? When should I use what? Thank you! Henrik

Re: [petsc-users] SNESLineSearchSetPre/PostCheck()

2017-12-01 Thread Buesing, Henrik
> So what is the difference between the pre and post check? When should I use what? PreCheck runs before starting the line search; PostCheck runs after the line search has completed and (if relevant) the inequality projection.

Re: [petsc-users] SNESLineSearchSetPre/PostCheck()

2017-12-02 Thread Buesing, Henrik
[Buesing, Henrik] I used the PreCheck to change the step length if pressure or enthalpy went out of physical bounds (e.g., became negative). Is the impact on Newton's …
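
A sketch of what such a PreCheck can look like in C (the callback signature is SNESLineSearchSetPreCheck's; the bound and the clamping rule are illustrative, not the poster's actual code):

  #include <petscsnes.h>

  /* PETSc computes the update as X_new = X - lambda*Y, so shortening a
     component of Y keeps that component of X_new at/inside the bound. */
  static PetscErrorCode BoundsPreCheck(SNESLineSearch ls, Vec X, Vec Y, PetscBool *changed, void *ctx)
  {
    const PetscScalar *x;
    PetscScalar       *y;
    PetscInt          i, n;
    PetscErrorCode    ierr;

    PetscFunctionBegin;
    *changed = PETSC_FALSE;
    ierr = VecGetLocalSize(X, &n);CHKERRQ(ierr);
    ierr = VecGetArrayRead(X, &x);CHKERRQ(ierr);
    ierr = VecGetArray(Y, &y);CHKERRQ(ierr);
    for (i = 0; i < n; i++) {
      if (PetscRealPart(x[i] - y[i]) < 0.0) { /* full step would go negative */
        y[i]     = x[i];                      /* step exactly to the bound   */
        *changed = PETSC_TRUE;
      }
    }
    ierr = VecRestoreArray(Y, &y);CHKERRQ(ierr);
    ierr = VecRestoreArrayRead(X, &x);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }
  /* register with: SNESLineSearchSetPreCheck(linesearch, BoundsPreCheck, NULL); */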

[petsc-users] Geometric multigrid with a cell-centered discretization

2018-01-09 Thread Buesing, Henrik
Dear all, I am using DMDACreate3d to generate my domain and a cell-centered two-point flux approximation as discretization. I use geometric multigrid (-pc_type mg) with DMDASetInterpolationType(petsc_da, DMDA_Q0, petsc_ierr). With DM_BOUNDARY_GHOSTED as boundary type in DMDACreate3d, I get: [0]…

Re: [petsc-users] Geometric multigrid with a cell-centered discretization

2018-01-09 Thread Buesing, Henrik
… vertices, not cells, so +1 to your grid size. [Buesing, Henrik] Yes! But I do not want that. I want to give the cells… That is the interface. You will get m-1 cells, so give it m = num_cells + 1. [Buesing, Henrik] OK, I can do that. But does the space discretization matter for the coarsening and …

Re: [petsc-users] Geometric multigrid with a cell-centered discretization

2018-01-09 Thread Buesing, Henrik
With DMDA_Q0, I tell it that it is a cell-centered two-point flux approximation, right? Not sure. You specify that you want one variable per cell someplace (not edge or vertex); look at examples and/or documentation. [Buesing, Henrik] I looked at KSP example 32. There it says the functions are cell-centered …

Re: [petsc-users] Geometric multigrid with a cell-centered discretization

2018-01-09 Thread Buesing, Henrik
… vertices, not cells, so +1 to your grid size. … You will get m-1 cells, so give it m = num_cells + 1. [Buesing, Henrik] Looking at KSP example 32 <https://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/ksp/examp…
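
Putting the thread's advice together, the setup looks roughly like this C sketch (num_cells_* and the stencil choices are illustrative):

  PetscInt mx = num_cells_x + 1, my = num_cells_y + 1, mz = num_cells_z + 1;
  DM       da;

  ierr = DMDACreate3d(PETSC_COMM_WORLD,
                      DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, mx, my, mz,
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1 /* dof */, 1 /* stencil width */,
                      NULL, NULL, NULL, &da);CHKERRQ(ierr);
  /* Q0 = piecewise-constant interpolation, for cell-centered unknowns */
  ierr = DMDASetInterpolationType(da, DMDA_Q0);CHKERRQ(ierr);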

Re: [petsc-users] Geometric multigrid with a cell-centered discretization

2018-01-09 Thread Buesing, Henrik
[Buesing, Henrik] I'll try it tomorrow and come back to you with the results. Thank you! Henrik > [Barry] Sorry about this, someone needs to fix the injection for the Q0 case.

Re: [petsc-users] Geometric multigrid with a cell-centered discretization

2018-01-10 Thread Buesing, Henrik
This means that my Paraview files will have no domain size info anymore. > Presumably. But you may be able to work around the issue some way, for example setting the coordinates after the solvers are complete. [Buesing, Henrik] Good idea! I moved the DMDASetUniformCoordinates call to after the solve.

[petsc-users] Weak scaling test with geometric multigrid

2018-01-10 Thread Buesing, Henrik
Dear all, I am doing a weak scaling test using geometric multigrid. As I increase the number of cells and the number of processes, I also increase the number of multigrid levels. With 64 cores, 12288 cells in x-direction, and 11 multigrid levels, I see error message [1]. Could you help me understand it? …

[petsc-users] Visualizing structured cell-centered data VTK

2018-02-07 Thread Buesing, Henrik
Dear all, I have structured cell-centered data and would like to visualize this with Paraview. Up to now I use PetscViewerVTKOpen and VecView to write data in *.vts format. I would like to tell PETSc that the fieldtype is PETSC_VTK_CELL_FIELD. I have found PetscViewerVTKAddField. Is this the way to go? …

[petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
Dear all, I would like to write HDF5 and VTK files in parallel. I found Vec example 19: "Parallel HDF5 Vec Viewing". But I do not understand how I tell PETSc to write a DMDA in parallel when doing VecView. At the …
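
For reference, the HDF5 route is just a different viewer; a minimal C sketch (file and dataset names illustrative, assumes PETSc configured with HDF5):

  #include <petscviewerhdf5.h>

  PetscViewer viewer;
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "solution.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = PetscObjectSetName((PetscObject)X, "pressure");CHKERRQ(ierr); /* becomes the dataset name */
  ierr = VecView(X, viewer);CHKERRQ(ierr); /* a DMDA Vec is written collectively in parallel */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);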

Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
>> Can PETSc use parallel HDF5? > Yeah, the implementation is in VecView_MPI_HDF5_DA and uses H5FD_MPIO_COLLECTIVE if supported. Did you build your HDF5 with MPI? [Buesing, Henrik] I just --download-hdf5. Do I need to do something else? > Regarding VTK: is it possible that every process dumps its part of t…

Re: [petsc-users] Parallel output in PETSc with pHDF5 and VTK

2018-02-07 Thread Buesing, Henrik
> Can PETSc use parallel HDF5? … Did you build your HDF5 with MPI? [Buesing, Henrik] I just --download-hdf5. Do I need to do something else? > That should configure it with whichever MPI PETSc is built with.

[petsc-users] Set Field names in HDF5 output

2018-02-16 Thread Buesing, Henrik
Dear all, I set field names with DMDASetFieldName. These appear in my VTK output, but not in my HDF5 output. Is it possible to set field names in the HDF5 output files? Thank you! Henrik
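
One workaround worth noting (an observation about the HDF5 viewer's behavior, not from the thread): datasets are named after the Vec's object name, so naming each field's Vec puts a readable name into the .h5 file. A minimal sketch with illustrative names:

  ierr = PetscObjectSetName((PetscObject)pressureVec, "pressure");CHKERRQ(ierr);
  ierr = VecView(pressureVec, hdf5Viewer);CHKERRQ(ierr); /* written as dataset "/pressure" */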

[petsc-users] Optimal coloring using -snes_fd_color with DMDACreate

2018-04-12 Thread Buesing, Henrik
Dear all, When I use -snes_fd_color with my da created by DMDACreate3d, will it use an optimal coloring algorithm (see [1]) to estimate the Jacobian? Thank you! Henrik [1] Goldfarb/Toint: Optimal estimation of Jacobian and Hessian matrices that arise in finite difference calculations https:/
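
The relevant runtime options, as a hedged sketch (for a DMDA the coloring comes from the structured stencil via DMCreateColoring; -mat_coloring_type applies when an assembled matrix is colored instead):

  -snes_fd_color             # Jacobian via colored finite differences
  -mat_fd_coloring_err 1e-8  # differencing parameter (illustrative value)
  -mat_coloring_type greedy  # only used when a matrix is colored directly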

[petsc-users] Matrix-Free Newton Krylov with Automatic Differentiation and User Provided Preconditioner

2018-04-20 Thread Buesing, Henrik
Dear all, I would like to use the matrix-free feature from PETSc, but I do not want to use finite differences to calculate J*u but automatic differentiation (AD). Since I do not want to live totally without preconditioning, I want to build important parts of the Jacobian and pass it for preconditioning.
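
The usual pattern for this is a shell matrix for the AD operator plus an assembled Pmat for the preconditioner; a C sketch (MyADJacVec, Pmat, userctx, and the sizes are hypothetical placeholders):

  Mat J; /* matrix-free operator: MatMult calls the AD-generated J*u */
  ierr = MatCreateShell(PETSC_COMM_WORLD, nlocal, nlocal, N, N, userctx, &J);CHKERRQ(ierr);
  /* MyADJacVec has the MatMult signature: PetscErrorCode (Mat,Vec,Vec) */
  ierr = MatShellSetOperation(J, MATOP_MULT, (void (*)(void))MyADJacVec);CHKERRQ(ierr);
  /* Pmat: assembled approximate Jacobian, used only to build the PC */
  ierr = SNESSetJacobian(snes, J, Pmat, FormJacobian, userctx);CHKERRQ(ierr);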

[petsc-users] Fieldsplit - Schur Complement Reduction - Efficient Preconditioner for Schur Complement

2018-07-25 Thread Buesing, Henrik
Dear all, I would like to improve the iterative solver [1]. As I understand it, I would need to improve the preconditioner for the Schur complement. How would I do that? Thank you for your help! Henrik [1] -ksp_max_it 100 -ksp_rtol 1e-6 -ksp_atol 1e-50 -ksp_type fgmres -pc_type fieldsplit …
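
Some of the knobs PCFIELDSPLIT exposes for exactly this, as a hedged sketch (option names are PETSc's; the particular choices are illustrative):

  -pc_fieldsplit_schur_precondition selfp  # precondition with S_p = A11 - A10 inv(diag(A00)) A01
  -fieldsplit_1_ksp_type gmres             # inner Krylov solve on the Schur field
  -fieldsplit_1_pc_type hypre              # a stronger PC on the (approximate) Schur complement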

[petsc-users] Efficient and scalable preconditioner for advection-dominated problems

2018-07-26 Thread Buesing, Henrik
Dear all, What is your favourite efficient and scalable preconditioner for advection-dominated (hyperbolic) problems? Thank you! Henrik PS: Maybe my question regarding two-phase flow is too specific. Thus, I rephrase it in this way.