Hi guys,

I am using a #define to do a calculation at compile time and load
the result into an (8-bit) UART register so that I can set the
baud rate.
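
For context, the computed value ends up in the Timer 1 reload
register, roughly like this (a sketch of my setup, assuming the
usual Timer 1 mode 2 arrangement with SMOD set and the standard
SFR names from SDCC's <8051.h>; the P89V664 specifics may differ):

#include <8051.h>  /* standard 8051 SFR declarations shipped with SDCC */

/* bd4800 is #defined in my header file, as shown below */
void serial_init(void)
{
    SCON  = 0x50;                  /* UART mode 1, receiver enabled                  */
    PCON |= 0x80;                  /* SMOD = 1: baud = crystal / (192 * (256 - TH1)) */
    TMOD  = (TMOD & 0x0F) | 0x20;  /* Timer 1 in mode 2, 8-bit auto-reload           */
    TH1   = bd4800;                /* load the compile-time calculated value         */
    TR1   = 1;                     /* start Timer 1                                  */
}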

I am using the following in a header file:

#define crystal 11059200
#define bd4800 (256 - crystal / 192 * 4800)

Here the result is 0xef854100 (that is, -276479744 when read as a
signed 32-bit number).

#define crystal 11059200
#define bd4800 (256 - (crystal / 921600))

And here the result is the correct 0xf4
(11059200 / 921600 = 12, and 256 - 12 = 244 = 0xf4).
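
As an aside, a compile-time check along these lines can pin down
what value the compiler actually computes, without flashing the
chip (the typedef name is arbitrary; a negative array size forces
a compile error):

/* the build fails on this line if and only if bd4800 != 0xf4 */
typedef char bd4800_check[(bd4800 == 0xf4) ? 1 : -1];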

Why does the multiplication in the first #define fail?
I have tried brackets "()" in numerous places, added typecasts,
and split the #define into multiple parts, but everything failed
on the multiplication.

It seems that whenever I use a multiplication, the result of the
#define expression is limited to a signed 16-bit number.
Is this assumption correct?
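
A minimal test along these lines should show whether a plain
constant multiplication really wraps at 16 bits (the variable
names are just for the test; I would look at the initial values
in the generated assembler rather than run it):

unsigned long plain  = 192 * 4800;    /* both operands have type int                  */
unsigned long forced = 192L * 4800;   /* the L suffix forces long (32-bit) arithmetic */
/* if int is really 16 bits here, these two should differ */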

Update: I have done some further tests, and the assumption seems
correct for multiplication. If I do the following division
(div100 is just a test name)

#define div100 (crystal / 100)

the answer is 0x0001b000, which is correct.

I am using SDCC: mcs51 3.2.1 #8413 (8 Feb 2013) (Linux)

The command line is:
sdcc -D_89v664 -c --model-small --code-size 65536 --xram-size 2048
serial.c

Thanks.

roelof


