On Sun, May 3, 2009 at 2:35 AM, Walter Bright <[email protected]> wrote:
> Don wrote:
>>
>> I don't think anyone expects to be able to divide an integer by an
>> imaginary, and then assign it to an integer. I was astonished that the
>> compiler accepted it.
>
> There actually is a reason - completeness. Mathematically, there is a
> definite answer to it, so why not fill in all the entries for all the
> combinations?
But the result of an int divided by an imaginary is purely imaginary, so I
think Don's point was that you shouldn't be able to assign that back to an
int.

--bb
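A quick sanity check of the arithmetic, sketched in Python since its built-in
complex type makes the point easy to see (D's idouble would behave
analogously, this is just an illustration, not D code):

```python
# Dividing a real (integer) value by a pure imaginary one:
#   6 / 2i = 6 * (-i / 2) = -3i
# The real part vanishes entirely, so the quotient is purely imaginary
# and assigning it to an int would silently discard all of its value.
z = 6 / 2j

print(z)       # -3j
print(z.real)  # 0.0 -- nothing meaningful left to store in an int
print(z.imag)  # -3.0
```

This is the general pattern: real / imaginary always lands on the imaginary
axis, which is why allowing the implicit conversion back to int is surprising.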
