[swift-users] Parsing Decimal values from JSON

Itai Ferber iferber at apple.com
Tue Oct 31 12:28:10 CDT 2017


I can’t speak to any resistance you’ve seen from the Swift team (nor 
to any performance issues you’ve encountered with 
`JSONSerialization`), but just keep in mind that
a) `JSONSerialization` is maintained by the Foundation team, and
b) maintaining a separate JSON parsing implementation just for Swift is 
a great way to introduce new, separate, likely incompatible bugs.

That being said, we’ve considered it, and continue to consider it — 
there is just a cost-benefit analysis that goes into prioritizing 
developer time.

On 31 Oct 2017, at 10:18, Jon Shier wrote:

> The appropriate solution here would be for Swift to have its own 
> native JSON parser that allows direct decoding into generic types 
> without the intermediary of JSONSerialization. For whatever reason 
> there seems to be resistance to this from the Swift team, but until we 
> have that ability, these types of issues will keep coming up, and the 
> performance overhead of JSONSerialization with JSONDecoder on top of 
> it will continue to leave Swift without a very performant JSON 
> solution.
> 	That said, I appreciate the support given to Codable on this list.
>
>
>
> Jon
>
>> On Oct 31, 2017, at 1:07 PM, Itai Ferber via swift-users 
>> <swift-users at swift.org> wrote:
>>
>> Hi Evtim,
>>
>> Just want to give some context for this.
>> This is due to the fact that JSONEncoder and JSONDecoder are 
>> currently based on JSONSerialization: when you go to decode some JSON 
>> data, the data is deserialized using JSONSerialization, and then 
>> decoded into your types by JSONDecoder. At the JSONSerialization 
>> level, however, there is no way to know whether a given numeric value 
>> is meant to be interpreted as a Double or as a Decimal.
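>>
>> You can see the two stages directly. This is just a sketch, and the 
>> key name is illustrative:
>>
>> import Foundation
>>
>> let data = Data("{\"number\": 0.1}".utf8)
>> // JSONSerialization runs first; JSONDecoder only ever sees the
>> // already-deserialized object it produces.
>> let object = try JSONSerialization.jsonObject(with: data) as! [String: Any]
>> print(object["number"] ?? "nil") // an NSNumber chosen by JSONSerialization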
>>
>> There are subtle differences between decoding as one or the other, so 
>> there is no single behavior that could satisfy all use cases. 
>> JSONSerialization has to make a decision: if the number can fit 
>> losslessly in a Double, it will prefer that to a Decimal. This allows 
>> guaranteed precise round-tripping of all Double values, at the cost 
>> of different behavior when decoding a Decimal.
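>>
>> As a minimal sketch of that effect (the `Payload` type here is 
>> illustrative, not from your code):
>>
>> import Foundation
>>
>> struct Payload : Codable {
>>     var number: Decimal
>> }
>>
>> let json = Data("{\"number\": 2.71828182845904523536028747135266249775}".utf8)
>> // Per the behavior described above, the value goes through Double on
>> // the way in, so the printed result will likely differ from the JSON
>> // literal beyond Double's ~15-17 significant digits.
>> let payload = try JSONDecoder().decode(Payload.self, from: json)
>> print(payload.number)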
>>
>> In practice, this might not matter, depending on how you use the 
>> number (e.g. the loss in precision can be so minute as to be 
>> insignificant). What is your use case here? And can you give some 
>> numeric values for which this is problematic?
>>
>> As others have mentioned, one way to guarantee that a numeric value 
>> decodes in a specific way is to encode and decode it as a String, 
>> then convert it into a Decimal where you need it, e.g.
>>
>> import Foundation
>>
>> struct Foo : Codable {
>>     var number: Decimal
>>
>>     public init(number: Decimal) {
>>         self.number = number
>>     }
>>
>>     private enum CodingKeys : String, CodingKey {
>>         case number
>>     }
>>
>>     public init(from decoder: Decoder) throws {
>>         let container = try decoder.container(keyedBy: CodingKeys.self)
>>         // Decode the value as a String so it never passes through Double.
>>         let stringValue = try container.decode(String.self, forKey: .number)
>>         guard let decimal = Decimal(string: stringValue) else {
>>             throw DecodingError.dataCorruptedError(forKey: .number, in: container, debugDescription: "Invalid numeric value.")
>>         }
>>
>>         self.number = decimal
>>     }
>>
>>     public func encode(to encoder: Encoder) throws {
>>         var container = encoder.container(keyedBy: CodingKeys.self)
>>         // Encode the Decimal's full-precision description as a String.
>>         try container.encode(self.number.description, forKey: .number)
>>     }
>> }
>>
>>
>> let foo = Foo(number: Decimal(string: "2.71828182845904523536028747135266249775")!)
>> print(foo) // => Foo(number: 2.71828182845904523536028747135266249775)
>>
>> let encoder = JSONEncoder()
>> let data = try encoder.encode(foo)
>> print(String(data: data, encoding: .utf8)!) // => {"number":"2.71828182845904523536028747135266249775"}
>>
>> let decoder = JSONDecoder()
>> let decoded = try decoder.decode(Foo.self, from: data)
>> print(decoded) // => Foo(number: 2.71828182845904523536028747135266249775)
>>
>> print(decoded.number == foo.number) // => true
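>>
>> If you don't control the payload and the number arrives unquoted, one 
>> illustrative workaround is to quote the field yourself before handing 
>> the data to JSONDecoder. This is only a sketch: the helper name and 
>> regular expression are mine, and they assume a flat payload whose only 
>> "number" key is the one of interest.
>>
>> func quotingNumberField(in json: String) -> String {
>>     // Wrap the bare numeric value of "number" in quotes so the
>>     // String-based init(from:) above can decode it losslessly.
>>     return json.replacingOccurrences(
>>         of: "\"number\"\\s*:\\s*(-?[0-9.eE+-]+)",
>>         with: "\"number\":\"$1\"",
>>         options: .regularExpression
>>     )
>> }
>>
>> let raw = "{\"number\":2.71828182845904523536028747135266249775}"
>> let requoted = Data(quotingNumberField(in: raw).utf8)
>> print(try decoder.decode(Foo.self, from: requoted).number) // full precision preserved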
>> — Itai
>>
>> On 28 Oct 2017, at 11:23, Evtim Papushev via swift-users wrote:
>>
>> Hello :)
>>
>> I am trying to find a way to parse a number as Decimal without losing 
>> the number's precision.
>>
>> It seems that the JSON decoder parses it as a Double and then 
>> converts it to a Decimal, which introduces errors in the parsed 
>> value. That behavior is, in fact, incorrect.
>>
>> Does anyone know if there is a way to obtain the raw data for this 
>> specific field so I can write the conversion code?
>>
>> Thanks,
>> Evtim
>>

