So, if I've got a UDF which is declared as

DECLARE EXTERNAL FUNCTION xxx
   INTEGER, INTEGER, INTEGER, INTEGER, INTEGER,
   INTEGER, INTEGER, INTEGER, INTEGER, INTEGER
   RETURNS INTEGER BY VALUE
   ENTRY_POINT 'xxx'
   MODULE_NAME 'xxx_udf';

these INTEGERs are all 32 bits, right, on both 32-bit and 64-bit
versions of Firebird?
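
(For reference, my understanding is that INTEGER maps to the 32-bit ISC_LONG
type from ibase.h on both builds, so the prototype the engine actually expects
would presumably be something along these lines - just a sketch, reusing the
placeholder names from the declaration above:)

#include <ibase.h>   /* ISC_LONG is defined as a 32-bit signed integer on every platform */

/* Sketch of what I'd expect the engine to call: ten 32-bit parameters
   passed by reference (the UDF default) and a 32-bit result by value. */
ISC_LONG xxx(ISC_LONG *elem1, ISC_LONG *elem2, ISC_LONG *elem3,
             ISC_LONG *elem4, ISC_LONG *elem5, ISC_LONG *elem6,
             ISC_LONG *elem7, ISC_LONG *elem8, ISC_LONG *elem9,
             ISC_LONG *elem10);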

And if it turns out that the function is declared in C as

long xxx(long *elem1, long *elem2, long *elem3, long *elem4, long *elem5,
         long *elem6, long *elem7, long *elem8, long *elem9, long *elem10)

we might expect long to be 32 bits in a 32-bit build and 64 bits in a
64-bit build, yes?
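
(A quick way to confirm that on any particular build is below. On LP64
platforms such as 64-bit Linux, long comes out as 8 bytes; on 32-bit builds,
and on 64-bit Windows with its LLP64 model, it stays at 4.)

#include <stdio.h>

int main(void)
{
    /* ILP32 (32-bit builds):      int = 4, long = 4
       LP64  (64-bit Linux/Unix):  int = 4, long = 8
       LLP64 (64-bit Windows):     int = 4, long = 4 */
    printf("sizeof(int)  = %zu\n", sizeof(int));
    printf("sizeof(long) = %zu\n", sizeof(long));
    return 0;
}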

Which means that on a 64-bit build there's a mismatch between the parameter
and result sizes the engine expects and the sizes the function actually
uses, yes?

Does anyone have the remotest clue what will happen? This is crypto stuff
(it would be), so, as always with crypto stuff, the function churns away
and returns some number, and you can't tell just by looking at it whether
it's right or not.
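
(If it does turn out to be a genuine mismatch, I assume the fix would be to
declare the parameters and result with a fixed-width 32-bit type rather than
long - just a sketch, assuming the function body is happy working with
32-bit values:)

#include <stdint.h>

/* Sketch of a width-stable prototype: int32_t matches Firebird's INTEGER
   on both 32-bit and 64-bit builds, so the parameter and result sizes no
   longer depend on the platform's definition of long. */
int32_t xxx(int32_t *elem1, int32_t *elem2, int32_t *elem3, int32_t *elem4,
            int32_t *elem5, int32_t *elem6, int32_t *elem7, int32_t *elem8,
            int32_t *elem9, int32_t *elem10);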

-- 
Tim Ward
