On Tuesday, 2 February 2021 at 22:27:53 UTC, Tim wrote:
I have to serialize an array like [0.0, 0.0, 0.0] to a Json object. During this process, the serializer creates a string of the array, but it creates "[0, 0, 0]", dropping the decimal. How can I stop this?
This depends on the library you are using for serialization, but 0 and 0.0 are the same number in JSON anyway, so it shouldn't matter for data interchange.
Which lib are you using?
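
Either way, here's a quick round-trip sketch with std.json (just assuming Phobos here, since I don't know which library you actually have) showing why the text form doesn't matter: even if the serializer drops the decimal point, the parsed values still compare equal to the original doubles.

import std.json;
import std.stdio;

void main()
{
    double[] values = [0.0, 0.0, 0.0];

    // Serialize; the text may well come out as "[0,0,0]".
    JSONValue j = JSONValue(values);
    string text = j.toString();
    writeln(text);

    // Parse it back and compare numerically.
    JSONValue back = parseJSON(text);
    foreach (i, v; back.array)
    {
        // The parser may give an integer node for "0"; either way
        // the numeric value is the same as the original double.
        double d = v.type == JSONType.integer ? cast(double) v.integer
                                              : v.floating;
        assert(d == values[i]);
    }
}

If whatever is reading the JSON on the other end treats 0 and 0.0 the same (any conforming parser will), you don't need to force the trailing ".0" at all.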