Hello,

On Mon, 22 Apr 2013 21:57:16 +0200
David Brown <david.br...@hesbynett.no> wrote:

> On 22/04/13 19:32, Paul Sokolovsky wrote:
> > Hello,
> >
> > On Mon, 22 Apr 2013 17:39:27 +0200
> > David Brown <da...@westcontrol.com> wrote:
> >
> >> Hi,
> >>
> >> When I compile the code you gave below (using gcc 4.6.3 from
> >> 20120406), I get:
> >>
> >> warning: left shift count >= width of type [enabled by default]
> >>
> >> That is even without any sort of warning flags - and you should
> >> normally enable lots of warnings.
> >>
> >> So here you have undefined code, and the compiler tells you of the
> >> problem.  You really cannot get better than that by trying to
> >> invent your own ideas about how you want to define this undefined
> >> behaviour.
> >
> > Yeah, but you can get better by leveraging generally accepted ideas
> > (the mathematical definition of shift) and the ideas put into
> > similar cases by more than just "you" (other well-known gcc ports,
> > like x86) ;-).
> >
> 
> Defining
> 
>       (x << n) = (x * 2^(n % 16)) & 0xffff
> 
> is just as good a mathematical definition as your desired
> 
>       (x << n) = (x * 2^n) & 0xffff
> 
> (for 16-bit ints)
> 
> 
> Neither definition is "normal" mathematics.

Ok, let's get the terminology and foundations straight. As I gather it,
the most accurate classification of shift is an "arithmetic-logic
bitwise operation", and it's defined by an iterative algorithm on the
value represented as a vector of digits. The algorithm itself is of
course obvious. So I meant *that* kind of mathematics - the kind that
deals with algorithms and such - not the kind taught in primary school.
Proof that the former exists: http://en.wikipedia.org/wiki/Algorithm
(though nowadays everyone laughs at Wikipedia).

> 
> 
> And note that gcc on the x86 has exactly the same effect - it's just 
> that with 32-bit ints, you get it with shifts of 32-bits or more.

Exactly the same - as in which behavior? The one I propose, yes. It's
easy enough to try - did you? I quoted
http://stackoverflow.com/questions/7214263/unexpected-behavior-of-bitwise-shifting-using-gcc
which describes the x86 behavior (compile time and run time differ,
and the compile-time result matches the algorithmic definition), and
which I verified with a few gcc versions before sending the original
mail.

Granted, I didn't try other targets to see how common one behavior is
versus the other; I considered x86 a good enough reference (I'd guess
its maintainers have definitely had this discussion before).

> 
> 
> > Thanks for all the flame, guys! ;-).
> >
> 
> You're welcome :-)  It's important to think about these things and
> get them correct.

Yep, but given that there's an on-by-default warning, as you
mentioned, it's not *that* important.


-- 
Best regards,
 Paul                          mailto:pmis...@gmail.com

_______________________________________________
Mspgcc-users mailing list
Mspgcc-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/mspgcc-users
