Re: [Users] running ML_ADMConstraints with timelevels=1 and no SYNCs

2016-09-23 Thread Roland Haas
Hello Erik, Frank, all, yes, I would only compute and output the constraints on the coarse level timestep. > The largest issue I see is that e.g. a regular L2 norm over the simulation domain is not very useful. It emphasizes very much the coarse grid, which is very large, and where the
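
(As a rough illustration of the norm concern quoted above, with numbers made up purely for the example: if the outer boundary sits at ~400M while the strong-field region where the constraint violations are interesting is only ~10M across, the coarse outer region contributes on the order of (400/10)^3 ≈ 6e4 times as much coordinate volume to an unweighted L2 norm, so such a norm mostly reflects the quiet far zone rather than the violations near the compact objects.)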

Re: [Users] running ML_ADMConstraints with timelevels=1 and no SYNCs

2016-09-23 Thread Erik Schnetter
The ghost and buffer zones are also used for time interpolation, but since you suggest evaluating the constraints only when the coarse level is active, you are circumventing this. The largest issue I see is that e.g. a regular L2 norm over the simulation domain is not very useful. It emphasizes

[Users] running ML_ADMConstraints with timelevels=1 and no SYNCs

2016-09-22 Thread Roland Haas
Hello all, I am wondering if there would be anything wrong with running ML_ADMConstraints with just a single time level for the constraints, as well as using no SYNCs for it, as long as I set its calc_every parameter to the frequency of the coarsest level. Obviously this will give me "nonsense"
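
A minimal sketch of the parameter settings I have in mind, assuming factor-2 time refinement with 5 refinement levels, so the coarsest level is active every 2^(5-1) = 16 iterations; the exact name of the calc_every parameter is an assumption here and should be checked against the thorn's param.ccl:

    # single time level for the constraints, so no prolongation in time
    ML_ADMConstraints::timelevels = 1
    # evaluate only when the coarsest level is active: 2^(5-1) = 16 with the
    # assumed 5-level, factor-2 time refinement setup
    # (parameter name is a guess; see the thorn's param.ccl)
    ML_ADMConstraints::ML_ADMConstraints_calc_every = 16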