Hi Edscott,
give AddressSanitizer a try; it is a great tool for finding the kind of
problem you describe. MemorySanitizer and UndefinedBehaviorSanitizer
might turn out to be helpful, too. My best experience with these tools
was with the latest Clang compiler, but some of them also work very well
with recent versions of GCC.
Valgrind might be worth a try as well, but it produces more false
positives and its output is harder to interpret.
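
For example, a minimal sketch of the extra flags (to be merged into
whatever your build system already passes for the lswf-chem11 target):

  clang++ -g -O1 -fsanitize=address,undefined -fno-omit-frame-pointer ...
  ./lswf-chem11 -ParameterFile ../SW-b.input

Recent GCC accepts the same two sanitizer flags. AddressSanitizer then
aborts at the first invalid read or write and prints a stack trace of
where it happened. A plain Valgrind run would be

  valgrind --tool=memcheck --track-origins=yes ./lswf-chem11 -ParameterFile ../SW-b.input

Note that MemorySanitizer (-fsanitize=memory) is Clang-only and cannot
be combined with AddressSanitizer in the same binary.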

Kind regards,
Christoph

On 21.08.2018 at 18:10, Edscott Wilson wrote:
> OK.
> I'll dig into the matter a bit further to see if I can figure out where
> the problem arises. It might be an incorrect cast somewhere that screws
> up memory locations.
> 
> Best regards,
> 
> Edscott
> 
> 
> 2018-08-17 15:31 GMT-05:00 Flemisch, Bernd
> <[email protected]>:
> 
>     Hi Edscott,
> 
>     can you please open an issue at
>     https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/issues ?
>     Due to the holiday season, it might take us some time to look at
>     this. If you open an issue, it won't be forgotten.
> 
>     Kind regards
>     Bernd
> 
>     From: Edscott Wilson
>     Sent: Thursday, 16 August, 23:47
>     Subject: [DuMuX] memory corruption in brookscorey.hh?
>     To: DuMuX User Forum
> 
> 
>     This is weird and should not be happening. Let me explain.
>      
>     While debugging a generalized Dirichlet type problem, I am
>     encountering a problem with the BrooksCorey material law
>     (dumux/material/fluidmatrixinteractions/2p/brookscorey.hh). 
>      
>     Here is the code from brookscorey.hh:
>      
>     181             using std::min;
>     182             using std::max;
>     183
>     184             swe = min(max(swe, 0.0), 1.0); // the equation below is only defined for 0.0 <= sw <= 1.0
>     185
>     186             return pow(swe, 2.0/params.lambda() + 3);
>     187         }
>      
>     Setting a breakpoint before the min(max()) call, I examine the
>     value of swe:
>      
>     Thread 1 "lswf-chem11" hit Breakpoint 2, Dumux::BrooksCorey<double,
>     Dumux::RegularizedBrooksCoreyParams<double> >::krw (params=...,
>         swe=6.9319619419652626e-17) at
>     
> /opt/dune/include/dumux/material/fluidmatrixinteractions/2p/brookscorey.hh:184
>     184             swe = min(max(swe, 0.0), 1.0); // the equation below is only defined for 0.0 <= sw <= 1.0
>     (gdb) print swe
>     $11 = 6.9319619419652626e-17
>      
>      
>     Then I step over the min(max()) call, re-examine swe, and get a
>     corrupted value:
>      
>     (gdb) next
>     186             return pow(swe, 2.0/params.lambda() + 3);
>     (gdb) print swe
>     $12 = -3.9159195083267944e+240
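
In case it helps to narrow this down: a hardware watchpoint on swe
should stop exactly at the instruction that writes the bad value, e.g.

  (gdb) watch -l swe
  (gdb) continue

The -l variant watches the address of swe rather than the expression,
so it keeps firing even once swe goes out of scope. This is just a
generic gdb suggestion, nothing DuMuX-specific.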
>      
>     Stepping into the min(max()) call, I see that the value which
>     should be "1.0" arrives corrupted:
>      
>     (gdb)
>     std::min<double> (__a=@0x7fffffffae00: 6.9319619419652626e-17,
>     __b=@0x7fffffffae10: -3.9159195083267944e+240)
>         at /usr/include/c++/6.3.1/bits/stl_algobase.h:200
>     200           if (__b < __a)
>     (gdb) print __b
>     $16 = (const double &) @0x7fffffffae10: -3.9159195083267944e+240
>      
>      
>     It looks like the "1.0" is placed on the stack and optimized away
>     after the max() part completes.
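
As a side note on what that gdb frame shows: std::min and std::max take
and return const references, so __a and __b in the transcript are stack
addresses, and garbage behind __b can mean that the stack slot holding
the 1.0 temporary was overwritten by something else rather than that
min/max themselves misbehave. A minimal standalone sketch of the
reference semantics (not DuMuX code, values picked for illustration):

  // std::min/std::max return const references. Assigning the result
  // by value, as brookscorey.hh does, copies it before the 0.0/1.0
  // temporaries die, so that line is fine in itself.
  #include <algorithm>
  #include <iostream>

  int main()
  {
      double swe = -0.5;  // deliberately outside [0, 1]

      // Safe: the clamped value is copied into swe by the assignment.
      swe = std::min(std::max(swe, 0.0), 1.0);

      // Not safe: here max() selects the temporary 0.0, so after the
      // full expression r refers to a dead stack slot -- reading it is
      // undefined behaviour and may show exactly this kind of garbage.
      const double& r = std::min(std::max(-0.5, 0.0), 1.0);

      std::cout << swe << ' ' << r << '\n';
  }

If an overwritten stack slot is what happens in your run, the
sanitizers above should point at the code doing the overwriting.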
>      
>     After making some changes to the code and doing the simple eff2abs
>     law conversion within the RegularizedBrooksCorey class template, the
>     problem disappears, as the following gdb output shows:
>      
>     Thread 1 "lswf-chem11" hit Breakpoint 3, Dumux::BrooksCoreyV<double,
>     Dumux::RegularizedBrooksCoreyVParams<double> >::krw (params=...,
>         swe=6.9319619419652626e-17, iS=4.2262753399999999) at
>     /home/edscott/GIT/LSWF/include/2pncs/materiallaw/brookscoreyV.hh:91
>     91              swe = min(max(swe, 0.0), 1.0); // the equation below is only defined for 0.0 <= sw <= 1.0
>     (gdb) step
>     std::max<double> (__a=@0x7fffffffade0: 6.9319619419652626e-17,
>     __b=@0x7fffffffadf8: 0) at
>     /usr/include/c++/6.3.1/bits/stl_algobase.h:224
>     224           if (__a < __b)
>     (gdb) next
>     226           return __a;
>     (gdb)
>     227         }
>     (gdb)
>     Dumux::BrooksCoreyV<double,
>     Dumux::RegularizedBrooksCoreyVParams<double> >::krw (params=...,
>     swe=6.9319619419652626e-17, iS=4.2262753399999999)
>         at
>     /home/edscott/GIT/LSWF/include/2pncs/materiallaw/brookscoreyV.hh:92
>     92              return pow(swe, 2.0/params.lambda(iS) + 3);
>     (gdb) print swe
>     $18 = 6.9319619419652626e-17
>      
>      
>     Opinions?
>      
>     Could there be something amiss in the EffToAbsLaw class template?
>      
>     Or could it be a gcc bug? (using "gcc (GCC) 6.3.1 20170109" and "GNU
>     gdb (GDB) 7.12.1").
>      
>     I tried to use gdb within a docker container with "gcc (GCC) 7.3.1
>     20180312" and "GNU gdb (GDB) 8.1" but I get:
>      
>     (gdb) run
>     Starting program:
>     
> /home/dumux/projects/lswf-chem11-USE_BC-CACO3_CASO4_MGCO3-SIMPLIFIED-UMF/build-cmake/src/lswf-chem11
>     -ParameterFile ../SW-b.input
>     warning: Error disabling address space randomization: Operation not
>     permitted
>     warning: Could not trace the inferior process.
>     Error:
>     warning: ptrace: Operation not permitted
>     During startup program exited with code 127.
>      
>     Has anybody had luck debugging with gdb within a docker container?
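
Regarding gdb inside Docker: the ptrace error is a Docker default, not
a gdb problem. Starting the container with extra privileges usually
fixes it, e.g.

  docker run --cap-add=SYS_PTRACE --security-opt seccomp=unconfined ...

(keep the rest of your usual docker run invocation; the two options are
the relevant part). With seccomp unconfined, the address space
randomization warning should disappear as well.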
>      
>      
>     The full g++ compiler command is as follows:
>      
>     /usr/bin/c++  -Wno-deprecated -ggdb
>     -I/home/dumux/problems/lswf-chem11 -I/home/dumux/include
>     -I/home/dumux -I/opt/dune/include -DUSE_BC -DCACO3_CASO4_MGCO3
>     -DSIMPLIFIED -DUMF   -pthread -rdynamic
>     CMakeFiles/lswf-chem11.dir/lswf-chem11.cc.o  -o lswf-chem11
>     -Wl,-rpath,/usr/lib/openmpi /opt/dune/lib64/libdunefem.a
>     /opt/dune/lib64/libdunealugrid.a /opt/dune/lib64/libdunegrid.a
>     /opt/dune/lib64/libugS3.a /opt/dune/lib64/libugS2.a
>     /opt/dune/lib64/libugL.a /opt/dune/lib64/libdunegeometry.a
>     /opt/dune/lib64/libdunecommon.a -lumfpack -lspqr -lldl -lcholmod
>     -lamd -lcamd -lcolamd -lccolamd -lsuitesparseconfig -pthread
>     /usr/lib/openmpi/libmpi.so -lmetis -lm -pthread
>     /usr/lib/openmpi/libmpi.so -lz -lldl -lspqr -lumfpack -lcholmod
>     -lamd -lcamd -lcolamd -lccolamd -lsuitesparseconfig -lsuperlu -lblas
>     -lparmetis -lmetis -lm -pthread /usr/lib/openmpi/libmpi.so -lmetis
>     -lm -lpsurface /opt/dune/lib64/libdunegrid.a
>     /opt/dune/lib64/libugS3.a /opt/dune/lib64/libugS2.a
>     /opt/dune/lib64/libugL.a /opt/dune/lib64/libdunegeometry.a
>     /opt/dune/lib64/libdunecommon.a -lparmetis -lmetis -lm -pthread
>     /usr/lib/openmpi/libmpi.so -lmetis -lm -Wl,-Bstatic -lVc
>     -Wl,-Bdynamic -lgmp -lgmpxx -llapack -lblas -pthread
>     /usr/lib/openmpi/libmpi.so /opt/dune/lib64/libdunefem.a
>     /opt/dune/lib64/libdunealugrid.a /opt/dune/lib64/libdunegrid.a
>     /opt/dune/lib64/libugS3.a /opt/dune/lib64/libugS2.a
>     /opt/dune/lib64/libugL.a /opt/dune/lib64/libdunegeometry.a
>     /opt/dune/lib64/libdunecommon.a -pthread -lpsurface -lmetis -lm -lz
>     -Wl,-Bstatic -lVc -Wl,-Bdynamic -lgmp -lgmpxx
>     /usr/lib/openmpi/libmpi.so -llapack -lblas
>      
>     Any pointers would be greatly appreciated.
>      
>     Kind regards,
>      
>      
>     Edscott
>      
> 
> 
>     -- 
>     
> ------------------------------------------------------------------------------------
>     Dr. Edscott Wilson Garcia
>     Reservoir Engineering
>     Mexican Petroleum Institute
> 
> 
> 
>     _______________________________________________
>     Dumux mailing list
>     [email protected]
>     https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
> 
> 
> 
> 
> -- 
> ------------------------------------------------------------------------------------
> Dr. Edscott Wilson Garcia
> Reservoir Engineering
> Mexican Petroleum Institute
> 
> 
> _______________________________________________
> Dumux mailing list
> [email protected]
> https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
> 

-- 
Unfortunately, plots are notoriously hard to get right. Partly, the
default settings of programs like gnuplot or Excel are to blame for
this since these programs make it very convenient to create bad plots.
                        -- Till Tantau, "The TikZ and PGF Packages"

_______________________________________________
Dumux mailing list
[email protected]
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
