Hi Evtim,
I just want to give some context for this.
This is because `JSONEncoder` and `JSONDecoder` are currently built on
top of `JSONSerialization`: when you decode some JSON data, the data is
first deserialized with `JSONSerialization`, and then decoded into your
types by `JSONDecoder`. At the `JSONSerialization` level, however,
there is no way to know whether a given numeric value is meant to be
interpreted as a `Double` or as a `Decimal`.
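To make that concrete, here is a minimal sketch of the precision loss in question. The type name `Wrapper` and the key `value` are just placeholders for illustration, and the exact digits you get back can vary by platform and Swift version:

```swift
import Foundation

// Hypothetical wrapper type, just for illustration.
struct Wrapper: Codable {
    let value: Decimal
}

// This literal has more significant digits than a Double can represent exactly.
let json = "{\"value\": 2.71828182845904523536028747135266249775}".data(using: .utf8)!

let decoded = try JSONDecoder().decode(Wrapper.self, from: json)

// Because the value passes through a Double on the way in, the Decimal
// you get back reflects the Double approximation rather than the full
// literal (something like 2.718281828459045, not all of the digits above).
print(decoded.value)
```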
There are subtle differences between decoding as one or the other, so
no single behavior can satisfy all use cases. `JSONSerialization` has
to make a decision, and if a number can be represented losslessly in a
`Double`, it prefers `Double` over `Decimal`. This guarantees precise
round-tripping of all `Double` values, at the cost of different
behavior when decoding a `Decimal`.
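Here is a quick sketch of that round-tripping guarantee. The array is just a convenient top-level container, since JSON needs one here:

```swift
import Foundation

// 0.1 + 0.2 is not exactly 0.3 in binary floating point, so this is a
// value whose exact bits matter.
let original = 0.1 + 0.2

// Encode the value and decode it again.
let data = try JSONEncoder().encode([original])
let restored = try JSONDecoder().decode([Double].self, from: data)

// Per the guarantee described above, the value survives bit-for-bit.
print(restored[0] == original)                       // => true
print(restored[0].bitPattern == original.bitPattern) // => true
```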
In practice, this might not really matter, depending on how you use the
number (e.g., the loss of precision can be so minute as to be
insignificant). What is your use case here? And can you give some
numeric values for which this is problematic for you?
As others have mentioned, one way to guarantee that a numeric value is
decoded a specific way is to encode and decode it as a `String`, then
convert it into a `Decimal` where you need it, e.g.
```swift
import Foundation

struct Foo: Codable {
    var number: Decimal

    public init(number: Decimal) {
        self.number = number
    }

    private enum CodingKeys: String, CodingKey {
        case number
    }

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)

        // Decode the value as a String, then convert it to a Decimal ourselves
        // so JSONSerialization never gets a chance to parse it as a Double.
        let stringValue = try container.decode(String.self, forKey: .number)
        guard let decimal = Decimal(string: stringValue) else {
            throw DecodingError.dataCorruptedError(forKey: .number,
                                                   in: container,
                                                   debugDescription: "Invalid numeric value.")
        }

        self.number = decimal
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)

        // Encode the Decimal's full textual representation as a String.
        try container.encode(self.number.description, forKey: .number)
    }
}

let foo = Foo(number: Decimal(string: "2.71828182845904523536028747135266249775")!)
print(foo) // => Foo(number: 2.71828182845904523536028747135266249775)

let encoder = JSONEncoder()
let data = try encoder.encode(foo)
print(String(data: data, encoding: .utf8)!) // => {"number":"2.71828182845904523536028747135266249775"}

let decoder = JSONDecoder()
let decoded = try decoder.decode(Foo.self, from: data)
print(decoded) // => Foo(number: 2.71828182845904523536028747135266249775)
print(decoded.number == foo.number) // => true
```
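One tradeoff worth noting with this approach: the value appears in the JSON payload as a string rather than a number, so anyone else consuming the payload has to agree on that convention.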
— Itai
On 28 Oct 2017, at 11:23, Evtim Papushev via swift-users wrote:
Hello :)
I am trying to find a way to parse a number as Decimal without losing
the number's precision.
It seems that the JSON decoder parses it as a Double, then converts it
to a Decimal, which introduces errors in the parsing. That behavior is,
in fact, incorrect.
Does anyone know if there is a way to obtain the raw data for this
specific field so I can write the conversion code?
Thanks,
Evtim
_______________________________________________
swift-users mailing list
swift-users@swift.org
https://lists.swift.org/mailman/listinfo/swift-users