> Ben recently added a method to let the user control the output
> precision for ASCII Tecplot files, which is a great idea, but the
> default is 6 digits, which worries me. I vaguely remember struggling
> to hunt down some verification test failure years ago which turned out
> to be because the (GMV? XDA?) output truncated after 6 decimal places.
>
> Could we default to 16 digits instead? Or perhaps set a typeof(Real)
> dependent DIGITS or PRECISION macro the way we do with TOLERANCE?
> I'd prefer precise output that an informed user has to override to get
> more efficiency over efficient output that an informed user has to
> override to get full precision.
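A minimal sketch of the typeof(Real)-dependent idea, assuming C++11's <limits> is available; the helper name here is hypothetical, not an existing libMesh API:

    #include <limits>

    // Hypothetical helper: pick an ASCII output precision large enough
    // that values of type T round-trip exactly through text.
    // max_digits10 is 9 for float, 17 for double, and typically 21 for
    // long double (platform dependent); it requires C++11.
    template <typename T>
    inline unsigned int default_ascii_precision ()
    {
      return std::numeric_limits<T>::max_digits10;
    }

    // e.g.:  out.precision(default_ascii_precision<Real>());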
Generically I don't have a problem changing the precision by more than a factor of three, but since the ASCII files grow accordingly, the risk is there for users to suddenly exceed a quota when 'make run_examples' previously worked just fine...

-Ben
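For a rough sense of the file-size impact (my own back-of-envelope numbers, not from the thread), assuming values are written in scientific notation:

    #include <cstdio>

    int main ()
    {
      const double x = 1.0/3.0;
      std::printf("%.6e\n",  x);  // "3.333333e-01"           -> 12 chars per value
      std::printf("%.16e\n", x);  // "3.3333333333333331e-01" -> 22 chars per value
      return 0;
    }

So a purely ASCII solution file grows by roughly 1.8x when going from 6 to 16 digits, which is where the quota concern comes from.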
