I was able to resolve the problem. Although I ran into another, eh,
interesting issue.
The problem was that in my Makefile.PL I had:
'CCFLAGS' => '/MT /W3', # Debug-flag: '/Zi',
'OPTIMIZE' => '/O2',
That is, I entirely overrode whatever came with $Config. I got away with
it for a long time, but finally time caught up with me. I changed this to:
my $ccflags = $Config{'ccflags'};
my $optimize = $Config{'optimize'};
$ccflags =~ s/-MD\b/-MT/;
$optimize =~ s/-MD\b/-MT/;
$ccflags =~ s/-O1\b/-O2/;
$optimize =~ s/-O1\b/-O2/;
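For reference, a minimal sketch of how those substitutions might sit in a complete Makefile.PL (the module name here is a placeholder, not my actual module; the regexes are the ones above):

```perl
# Sketch of a Makefile.PL that starts from $Config's flags and then
# swaps -MD for -MT and -O1 for -O2, rather than overriding the
# flags wholesale. Module name is hypothetical.
use strict;
use warnings;
use Config;
use ExtUtils::MakeMaker;

my $ccflags  = $Config{'ccflags'};
my $optimize = $Config{'optimize'};

# Link against the static CRT instead of the DLL runtime.
$ccflags  =~ s/-MD\b/-MT/;
$optimize =~ s/-MD\b/-MT/;

# Prefer "optimize for speed" over "minimize size".
$ccflags  =~ s/-O1\b/-O2/;
$optimize =~ s/-O1\b/-O2/;

WriteMakefile(
    NAME     => 'My::Module',    # placeholder
    CCFLAGS  => $ccflags,
    OPTIMIZE => $optimize,
);
```

This way, whatever else Perl was built with in ccflags is preserved, and only the runtime-library and optimization switches are changed.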
And everything works fine. Exactly what in $Config{'ccflags'} was
crucial, I did not investigate. (But I don't use time_t.)
So why do I insist on /MT and /O2? Originally, I distributed the MSVCRT file
in my binary distribution and put it in the same directory as my own DLL. But
this did not work when you used my module from IIS. I asked some C++ MVPs
for advice, and they suggested /MT.
On the other hand, there is no particular thought behind /O2, as I recall.
"Optimize for speed" sounded good, I thought some time long ago. So why not
stick with the default then? This is where we come to the interesting part.
One of my test scripts fails when I use /O1 on AS1401 and x86 (and again only
that build). But the failure is very strange. I have an XS routine with
this signature:
int
executebatch(sqlsrv, rows_affected = NULL)
SV * sqlsrv;
SV * rows_affected;
In one place in one test script, this XS routine returns undef. This is a
core routine, so it is called well over a thousand times in the test suite.
And in one specific case it returns undef. I've traced it so far that the
underlying C++ function does really execute to the end and returns TRUE.
Apparently something goes wrong in the part that comes after. Why? Maybe
something was clobbered, so maybe I have a bad pointer somewhere. But bad
pointers usually cause a lot more problems than a single miss in 1000
executions.
Anyway, that's why I stick with -O2.
--
Erland Sommarskog, Stockholm, [email protected]