Hello Gerrit and Georg,

> short question: what is the correct setting so that floats always
> go through any kind of ascii IO as 1.34234? I don't really want to deal
> with cases where files can't be read because floats are expected to be
> 1,2343 or written as 1,24143.

We use the following settings in our application:

    setlocale(LC_ALL, "C");
    setlocale(LC_CTYPE, "English");

The first call sets all of the locale categories to the standard C locale, which gives you ascii IO as 1.34... The second call only adjusts the character type category; this one must match the customer's installation language (e.g. Greek). For instance, setlocale(LC_CTYPE, "English") corresponds to the English_USA.1252 code page.
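For illustration, here is a minimal, self-contained sketch of that behaviour (the locale names "English", "German" and "de_DE" are just examples and depend on the platform):

    #include <locale.h>
    #include <stdio.h>

    int main(void)
    {
        /* Standard C locale: printf/scanf always use '.' as decimal separator. */
        setlocale(LC_ALL, "C");
        printf("C locale:      %f\n", 1.34234);   /* prints 1.342340 */

        /* Changing only LC_CTYPE does not touch the numeric formatting. */
        setlocale(LC_CTYPE, "English");
        printf("LC_CTYPE only: %f\n", 1.34234);   /* still 1.342340 */

        /* A full German locale switches the decimal separator and would
           break '.'-based ascii IO. */
        if (setlocale(LC_ALL, "German") || setlocale(LC_ALL, "de_DE"))
            printf("German:        %f\n", 1.34234);   /* prints 1,342340 */

        return 0;
    }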
The setting made by the OpenSG osgInit() function is actually sensible. The problem is that the locale initialization is often performed once at application start-up, and that point in time may be before the OpenSG initialization. It is really problematic that every dynamic link library is able to change the locale settings of the application which uses it. Lastly, I think it is a documentation issue: it should be stated absolutely clearly that the osgInit function changes the locale settings. An application programmer then knows that he has to take action.

> And why is there a difference between Debug and Release, which seems
> weird.

Ok, that is a question I really can't answer. Maybe there are more than two protagonists involved, and then neither OpenSG nor the application is to blame. You can test the current locale settings at runtime by calling:

    char* test = setlocale(LC_ALL, NULL);

The variable test then contains the settings for each category, separated by semicolons. With a sensible distribution of such calls over the application code it should be easy to find the offender, for example with a small helper like the one sketched below.
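A minimal sketch of such a helper (dumpLocale and the probe points are just an illustration, not part of OpenSG or any other library):

    #include <locale.h>
    #include <stdio.h>

    /* Print the current locale settings together with a label, without
       changing anything (passing NULL to setlocale only queries). */
    static void dumpLocale(const char* where)
    {
        char* current = setlocale(LC_ALL, NULL);
        printf("[%s] locale: %s\n", where, current ? current : "(unknown)");
    }

    int main(void)
    {
        dumpLocale("application start");

        setlocale(LC_ALL, "C");
        setlocale(LC_CTYPE, "English");
        dumpLocale("after our own locale setup");

        /* ... library initialization, e.g. osgInit, happens here ... */
        dumpLocale("after library initialization");

        return 0;
    }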
Best,
Johannes