On Nov 28 2021, Paul Smith wrote:
>> The C standard defines the largest unsigned long long value
>> as 18446744073709551615, the largest signed long long value
>> as 9223372036854775807, and the smallest signed long long value as
>> -9223372036854775808.  So, the definition cannot be wrong in any
>> standards-conforming implementation of C.

Andreas Schwab (6 December 2021 12:30) replied:
> This is wrong.  These are *minimum* limits.

For reference: the number of bytes needed for the ASCII decimal
representation (including the terminating '\0') of an integral type is
bounded above (fairly tightly) by

  53 * sizeof(type) / 22 + (3 if type is signed else 2)

When I need a compile-time constant for an array buffer size, this is
what I use.  Fuller explanation here:
https://github.com/ediosyncratic/study.py/blob/master/maths/buffersize.py
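
As a minimal sketch of how that bound can be packaged as an integer
constant expression (the name DECIMAL_BUFSIZE and the (type)-1 < 0
signedness test are just illustrative, and it assumes 8-bit bytes,
i.e. CHAR_BIT == 8):

  #include <stdio.h>

  /* Upper bound on the bytes needed for the decimal text of an
   * integral type, including any '-' sign and the terminating '\0'.
   * ((type)-1 < 0) is a constant expression that is true exactly for
   * signed types, so the whole thing is usable as an array bound. */
  #define DECIMAL_BUFSIZE(type) \
          (53 * sizeof(type) / 22 + (((type)-1 < 0) ? 3 : 2))

  int main(void)
  {
      char buf[DECIMAL_BUFSIZE(long long)];  /* 22 when sizeof is 8 */
      snprintf(buf, sizeof buf, "%lld", -9223372036854775807LL - 1);
      printf("%s fits in %zu bytes\n", buf, sizeof buf);
      return 0;
  }

For long long with sizeof 8 that gives 19 + 3 = 22, enough for the
20 characters of -9223372036854775808 plus its '\0' with one to spare.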

        Eddy.
