You could take an alternate view: anything with a sign is an expression. 
So, the decimal value of 2**32-1 could be treated as a self-defining term. 

The same issue sometimes arises when people discuss the difference 
between X'FFFF' and 65535 -- if System/360 had been a 16-bit machine, 
would the maximum value of a decimal self-defining term be 32767? 

John Ehrman 
---------------------------------------------------------------------- 
>Date: Fri, 10 Nov 2017 23:14:40 -0500 
>From: Sudershan Ravi <[email protected]> 
>Subject: Hex and Decimal 

>The maximum value of a decimal self-defining term is 2**31-1, while the maximum
>value of a binary or hexadecimal self-defining term is 2**32-1. Why are they
>different?
------------------------------ 
>Date: Fri, 10 Nov 2017 23:21:00 -0500 
>From: Robert Netzlof <[email protected]> 
>Subject: Re: Hex and Decimal 

>The decimal term is signed; binary and hexadecimal terms are unsigned.
>Therefore, the decimal term has only 31 bits available to record the
>magnitude, while binary and hex can use all 32 bits since there is no sign.
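
For illustration, a minimal sketch in HLASM-style assembler of the points
above (the labels are invented; the values follow from the limits quoted in
the original question):

MAXDEC   EQU   2147483647       largest decimal self-defining term, 2**31-1
MAXHEX   EQU   X'FFFFFFFF'      hex term, all 32 bits available, 2**32-1
MAXBIN   EQU   B'11111111111111111111111111111111'   same value as MAXHEX
*OVERDEC EQU   4294967295       would exceed the decimal-term limit
MINUS1   EQU   -1               the leading sign makes this an expression

The first three statements show the two maxima; the commented-out line would
exceed the decimal-term limit, and the last one restates the point that a
value written with a sign is an expression rather than a self-defining term.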
