Re: [DuMuX] memory corruption in brookscorey.hh?

2018-08-21 Thread Christoph Grüninger
Hi Edscott,
give AddressSanitizer a try; it is a great tool for finding the kind of
problem you describe. MemorySanitizer and the Undefined Behavior Sanitizer
might turn out to be helpful, too. My best experience with these tools was
with the latest Clang compiler, but some of them also work very well with
recent versions of GCC.
Valgrind might also be worth a try, but it produces more false positives and
its output is harder to read.
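For reference, a minimal sanitizer build could look like this (the flags are
standard Clang/GCC options; the file and parameter names are just taken from
your mail):

  clang++ -fsanitize=address,undefined -fno-omit-frame-pointer -g -O1 \
      lswf-chem11.cc -o lswf-chem11
  ./lswf-chem11 -ParameterFile ../SW-b.input

(MemorySanitizer needs a separate build with -fsanitize=memory; it cannot be
combined with AddressSanitizer.) Valgrind works on an unmodified binary:

  valgrind --tool=memcheck ./lswf-chem11 -ParameterFile ../SW-b.input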

Kind regards,
Christoph

On 21.08.2018 18:10, Edscott Wilson wrote:
> OK.
> I'll dig into the matter a bit further to see if I can work out where the
> problem arises. It might be an incorrect cast somewhere that screws up
> memory locations.
> 
> Best regards,
> 
> Edscott

Re: [DuMuX] memory corruption in brookscorey.hh?

2018-08-21 Thread Timo Koch
Dear Edscott,

what is the actual problem you are trying to solve? Is this a problem
with Dumux? The code you are showing looks perfectly fine to me.

Or did this only occur during debugging? Your comments sound like you are
debugging optimized code. If so, does the problem persist when the code is
compiled with -O0? Not all symbols are always defined when they are optimized
away, so it might merely look like memory corruption.
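For a quick check you could rebuild without optimizations, e.g. with the
standard CMake switch (paths illustrative):

  cmake -DCMAKE_BUILD_TYPE=Debug /path/to/source && make

and then see whether the "corrupted" values in gdb disappear.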

Timo

> On 21.08.2018 20:40, Edscott Wilson wrote:
> 
> OK. 
> I'll dig into the matter a bit further to see if I can work out where the
> problem arises. It might be an incorrect cast somewhere that screws up memory 
> locations.
> 
> Best regards,
> 
> Edscott

Re: [DuMuX] memory corruption in brookscorey.hh?

2018-08-21 Thread Edscott Wilson
OK.
I'll dig into the matter a bit further to see if I can work out where the
problem arises. It might be an incorrect cast somewhere that screws up
memory locations.

Best regards,

Edscott


2018-08-17 15:31 GMT-05:00 Flemisch, Bernd <bernd.flemi...@iws.uni-stuttgart.de>:

> Hi Edscott,
>
> can you please open an issue at
> https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/issues ? Due to the
> holiday season, it might take us some time to look at this. By opening an
> issue, it won't be forgotten.
>
> Kind regards
> Bernd
>
> From: Edscott Wilson
> Sent: Thursday, 16 August, 23:47
> Subject: [DuMuX] memory corruption in brookscorey.hh?
> To: DuMuX User Forum
>
>
> This is weird and should not be happening. Let me explain.
>
> While debugging a generalized Dirichlet type problem, I am encountering a
> problem with the BrooksCorey material law
> (dumux/material/fluidmatrixinteractions/2p/brookscorey.hh).
>
> Here is the code from brookscorey.hh:
>
> 181 using std::min;
> 182 using std::max;
> 183
> 184 swe = min(max(swe, 0.0), 1.0); // the equation below is
> only defined for 0.0 <= sw <= 1.0
> 185
> 186 return pow(swe, 2.0/params.lambda() + 3);
> 187 }
>
> Inserting a break before the min(max()) call, I examine the value of swe:
>
> Thread 1 "lswf-chem11" hit Breakpoint 2, Dumux::BrooksCorey<double,
> Dumux::RegularizedBrooksCoreyParams<double> >::krw (params=...,
> swe=6.9319619419652626e-17) at
> /opt/dune/include/dumux/material/fluidmatrixinteractions/2p/brookscorey.hh:184
> 184 swe = min(max(swe, 0.0), 1.0); // the equation below is
> only defined for 0.0 <= sw <= 1.0
> (gdb) print swe
> $11 = 6.9319619419652626e-17
>
>
> Then I step over the min(max()) call and re-examine swe and get a corrupted
> value:
>
> (gdb) next
> 186 return pow(swe, 2.0/params.lambda() + 3);
> (gdb) print swe
> $12 = -3.9159195083267944e+240
>
> Stepping into the min(max()) function I see that the value which should be
> "1.0" arrives corrupted:
>
> (gdb)
> std::min (__a=@0x7fffae00: 6.9319619419652626e-17,
> __b=@0x7fffae10: -3.9159195083267944e+240)
> at /usr/include/c++/6.3.1/bits/stl_algobase.h:200
> 200   if (__b < __a)
> (gdb) print __b
> $16 = (const double &) @0x7fffae10: -3.9159195083267944e+240
>
>
> Looks like the "1.0" is being placed on the stack and being optimized away
> after the max() part completes.
>
> After some changes to the code, doing the simple eff2abs law conversion
> within the regularized BrooksCorey class template, the problem disappears,
> as the following gdb output shows:
>
> Thread 1 "lswf-chem11" hit Breakpoint 3, Dumux::BrooksCoreyV<double,
> Dumux::RegularizedBrooksCoreyVParams<double> >::krw (params=...,
> swe=6.9319619419652626e-17, iS=4.22627533) at
> /home/edscott/GIT/LSWF/include/2pncs/materiallaw/brookscoreyV.hh:91
> 91  swe = min(max(swe, 0.0), 1.0); // the equation below is
> only defined for 0.0 <= sw <= 1.0
> (gdb) step
> std::max (__a=@0x7fffade0: 6.9319619419652626e-17,
> __b=@0x7fffadf8: 0) at /usr/include/c++/6.3.1/bits/stl_algobase.h:224
> 224   if (__a < __b)
> (gdb) next
> 226   return __a;
> (gdb)
> 227 }
> (gdb)
> Dumux::BrooksCoreyV<double, Dumux::RegularizedBrooksCoreyVParams<double>
> >::krw (params=..., swe=6.9319619419652626e-17, iS=4.22627533)
> at /home/edscott/GIT/LSWF/include/2pncs/materiallaw/brookscoreyV.hh:92
> 92  return pow(swe, 2.0/params.lambda(iS) + 3);
> (gdb) print swe
> $18 = 6.9319619419652626e-17
>
>
> Opinions?
>
> Could there be something amiss in the EffToAbsLaw class template?
>
> Or could it be a gcc bug? (using "gcc (GCC) 6.3.1 20170109" and "GNU gdb
> (GDB) 7.12.1").
>
> I tried to use gdb within a docker container with "gcc (GCC) 7.3.1
> 20180312" and "GNU gdb (GDB) 8.1" but I get:
>
> (gdb) run
> Starting program:
> /home/dumux/projects/lswf-chem11-USE_BC-CACO3_CASO4_MGCO3-SIMPLIFIED-UMF/build-cmake/src/lswf-chem11
> -ParameterFile ../SW-b.input
> warning: Error disabling address space randomization: Operation not
> permitted
> warning: Could not trace the inferior process.
> Error:
> warning: ptrace: Operation not permitted
> During startup program exited with code 127.
>
> Has anybody had luck debugging with gdb within a docker container?
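> (A common workaround for this ptrace error, assuming a stock Docker setup,
> is to start the container with tracing allowed, e.g.:
>
> docker run --cap-add=SYS_PTRACE --security-opt seccomp=unconfined ...
>
> otherwise gdb is not permitted to attach to the inferior process.)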
>
>
> The full g++ compiler command is as follows:
>
> /usr/bin/c++  -Wno-deprecated -ggdb -I/home/dumux/problems/lswf-chem11
> -I/home/dumux/include -I/home/dumux -I/opt/dune/include -DUSE_BC
> -DCACO3_CASO4_MGCO3 -DSIMPLIFIED -DUMF   -pthread -rdynamic
> CMakeFiles/lswf-chem11.dir/lswf-chem11.cc.o  -o lswf-chem11
> -Wl,-rpath,/usr/lib/openmpi /opt/dune/lib64/libdunefem.a 
> /opt/dune/lib64/libdunealugrid.a
> /opt/dune/lib64/libdunegrid.a /opt/dune/lib64/libugS3.a
> /opt/dune/lib64/libugS2.a /opt/dune/lib64/libugL.a 
> /opt/dune/lib64/libdunegeometry.a
> /opt/dune/lib64/libdunecommon.a -lumfpack -lspqr -lldl -lcholmod -lamd
> -lcamd -lcolamd -lccolamd -lsuitesparseconfig -pthread
> 

Re: [DuMuX] Check global conservation error

2018-08-21 Thread Dennis Gläser

Hi Nikolai,


you have to make sure that you only compute this on Dirichlet faces. On 
Neumann faces the fluxes are the ones you set in your neumann() or 
neumannAtPos() function and cannot be computed by the flux assembly. So 
if you have only no-flow Neumann boundaries, just make sure to only
compute boundary fluxes on Dirichlet faces and your calculations will be 
correct.



If you also want to consider Neumann fluxes, then integrate them by
calling neumann() and multiplying the values by scvf.area() and the
extrusion factor.
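
In code, the boundary loop could then look roughly like this (a sketch based
on your posted snippet; the neumann() call and the eqIdx index are indicative,
please check them against your problem class):

for (auto&& scvf : scvfs(fvGeometry)) {
    if (!scvf.boundary())
        continue;

    const auto bcTypes = problemBoundaryTypes(element, scvf);
    if (bcTypes.hasOnlyDirichlet()) {
        // Dirichlet face: recover the flux from the assembly machinery
        FluxVariables fluxVars;
        fluxVars.init(*this, element, fvGeometry, elemVolVars, scvf, elemFluxVarsCache);
        FO += fluxVars.advectiveFlux(nPhaseIdx, upwindTermN);
    }
    else {
        // Neumann face: integrate the prescribed flux over the face instead
        // (extrusionFactor as computed in your snippet)
        FO += neumann(element, fvGeometry, elemVolVars, scvf)[eqIdx]
              * scvf.area() * extrusionFactor;
    }
}

(and analogously for FW with upwindTermW).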



I hope this helps!

Dennis




Re: [DuMuX] Check global conservation error

2018-08-21 Thread Nikolai Andrianov
Hi Dennis,


Thanks a lot, I got past the elemFluxVarsCache.bind runtime error!


However, a new problem has popped up.


When I get to computing (or recovering) the advective flux with
fluxVars.advectiveFlux(nPhaseIdx, upwindTermN), the execution brings me to
calculating Darcy's flux in CCTpfaDarcysLaw::flux(...) since

- !advFluxIsCached_[phaseIdx] is true in 
PorousMediumFluxVariables::advectiveFlux(..) and

- (!problem.couplingManager().isOnInteriorBoundary(element, scvf)) is true in 
CCTpfaFacetCouplingDarcysLawImpl::flux(..)


So I am ending up in CCTpfaDarcysLaw::flux(...) and the outsideVolVars = 
elemVolVars[scvf.outsideScvIdx()] cannot be computed because scvf is a boundary 
face, resulting in the error message "Assertion `it != volVarIndices_.end() && 
"Could not find the current volume variables for volVarIdx!"' failed."


Looking forward to your feedback.


Thanks,

Nikolai




Re: [DuMuX] Check global conservation error

2018-08-21 Thread Dennis Gläser

Hi Nikolai,


I think your implementation is almost correct. If you want to balance 
masses, you have to add the density to your upwind terms though, i.e.:



auto upwindTermN = [](const auto& volVars)
{ return volVars.mobility(nPhaseIdx)*volVars.density(nPhaseIdx); };


...


Also, you have to call bind() instead of bindElement() on the geometry 
and the volume variables, i.e.:



auto fvGeometry = localView(this->fvGridGeometry());
fvGeometry.bind(element);

auto elemVolVars = localView(gridVariables.curGridVolVars());
elemVolVars.bind(element, fvGeometry, curSol);


This has the following background:


If you want to do computations on the element without assembling fluxes, 
you do not have to prepare all variables within the entire stencil. For 
these purposes you call bindElement(). This only prepares 
primary/secondary variables on the element. This is enough e.g. to 
evaluate the mass inside the domain after a time step. But, if you want 
to compute fluxes, you need to prepare the entire stencil. This stencil 
includes neighboring elements as well for cell-centered schemes. In this 
case you prepare your local views accordingly by calling bind() instead
of bindElement().
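
Putting the pieces together, the preparation part could then look roughly
like this (your snippet with the two changes applied):

// upwind terms including the density, for balancing masses
auto upwindTermN = [](const auto& volVars)
{ return volVars.mobility(nPhaseIdx)*volVars.density(nPhaseIdx); };
auto upwindTermW = [](const auto& volVars)
{ return volVars.mobility(wPhaseIdx)*volVars.density(wPhaseIdx); };

// bind() instead of bindElement(): prepares the full flux stencil
auto fvGeometry = localView(this->fvGridGeometry());
fvGeometry.bind(element);

auto elemVolVars = localView(gridVariables.curGridVolVars());
elemVolVars.bind(element, fvGeometry, curSol);

auto elemFluxVarsCache = localView(gridVariables.gridFluxVarsCache());
elemFluxVarsCache.bind(element, fvGeometry, elemVolVars);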



I hope this makes it work!


Best wishes,
Dennis






[DuMuX] Check global conservation error

2018-08-21 Thread Nikolai Andrianov
Dear DuMuX experts,


Please advise how I can implement a global conservation error check, that is,
evaluate the residual ( mass(t^{n+1}) - mass(t^n) ) / dt + sum of fluxes over
the domain boundaries.
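
In code, what I would like to evaluate after each time step is essentially
(names illustrative):

Scalar residual = (massNew - massOld)/dt + boundaryFluxSum;

where massNew and massOld are the total masses at t^{n+1} and t^n, and
boundaryFluxSum is the net outflow over the domain boundary.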


So far I can evaluate the mass in the domain by looping through the elements as 
in void getVolumes of 
https://git.iws.uni-stuttgart.de/andrian/rate-sens-nofrac/blob/master/src/rate_problem.hh.


I tried to use the following to access the fluxes in rate_problem.hh (inspired 
from porousmediumflow/velocityoutput.hh and 
porousmediumflow/nonequilibrium/gridvariables.hh):


// the upwind term to be used for the volume flux evaluation
auto upwindTermN = [](const auto& volVars) { return volVars.mobility(nPhaseIdx); };
auto upwindTermW = [](const auto& volVars) { return volVars.mobility(wPhaseIdx); };

auto elemFluxVarsCache = localView(gridVariables.gridFluxVarsCache());
elemFluxVarsCache.bind(element, fvGeometry, elemVolVars);

for (auto&& scvf : scvfs(fvGeometry)) {

    if (scvf.boundary()) {

        //auto bcTypes = problemBoundaryTypes(element, scvf);
        //if (bcTypes.hasOnlyDirichlet())
        {
            FluxVariables fluxVars;
            fluxVars.init(*this, element, fvGeometry, elemVolVars,
                          scvf, elemFluxVarsCache);

            Scalar extrusionFactor = problem_.extrusionFactor(element,
                fvGeometry.scv(scvf.insideScvIdx()),
                elementSolution(element, sol_, fvGridGeometry_));
            FO += fluxVars.advectiveFlux(nPhaseIdx, upwindTermN); // / extrusionFactor;
            FW += fluxVars.advectiveFlux(wPhaseIdx, upwindTermW); // / extrusionFactor;
        }
    }
}


However, I get a compilation error at elemFluxVarsCache.bind(element, fvGeometry,
elemVolVars) saying that Assertion `it != indices.end() && "Could not find the
scv/scvf! Make sure to properly bind this class!"' failed.


Your feedback is greatly appreciated.


Many thanks,

Nikolai



PS: please disregard my previous mail with the same subject.