> However, one thing on my TODO list is investigating this: when I break
> the execution somewhere and inspect the variables (like FloatVarImp or
> IntVarImp), the 'home' pointer inside all variables is set to
> '0xBAADF00D', which is a little scary. You know, everything seems to
> work, but
Cheers
Christian
-----Original Message-----
From: Filip Konvička [mailto:[EMAIL PROTECTED]
Sent: Monday, August 25, 2008 5:26 PM
To: Christian Schulte
Cc: [EMAIL PROTECTED]
Subject: Re: [gecode-users] How to use Gecode binaries from within MSVC's IDE
> First, the "fix" I just sent does not fix the problem... It does for me but
> not for Jan.
>
> The crashes are totally bizarre; as far as the debugger tells, the crash
> occurs right in the heart of copying... That's close to impossible, as Gecode
> does not use char there...
>
Well, if someone
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf
Of Filip Konvicka
Sent: Monday, August 25, 2008 5:03 PM
To: [EMAIL PROTECTED]
Subject: Re: [gecode-users] How to use Gecode binaries from within MSVC's
IDE
Hi Christian,
> The IDE defines the macros UNICODE and _UNICODE: when compiling with these
> macros set, things crash.
Strange - I'm using UNICODE all the time without problems.
IMHO it does only one thing: all Windows headers switch to the Unicode
APIs (and TCHAR is defined as wchar_t etc.). Yo
Dear all,
thanks to the help from Jan Kelbel, we found a solution (well, workaround)
for using the precompiled binaries we ship within Microsoft Visual Studio's
Integrated Development Environment (IDE).
The IDE defines the macros UNICODE and _UNICODE: when compiling with these
macros set, things crash.