[swift-users] Decimal imported as NSDecimal not NSDecimalNumber in Swift 3 to Objective C

Chris Anderson christopher.anderson at gmail.com
Fri Nov 11 15:46:21 CST 2016

I'm having problems with the type conversion between a Swift `Decimal` and
an Objective C `NSDecimalNumber`.

If I have the Swift class:

    @objc class Exam: NSObject {
        var grade: Decimal = 90.0
    }

And try to use that Swift class in Objective C,

    Exam *exam = [[Exam alloc] init];
    NSDecimalNumber *result = [[NSDecimalNumber zero] decimalNumberByAdding:exam.grade];

I get the error:

    Sending 'NSDecimal' to parameter of incompatible type 'NSDecimalNumber *'

as it seems `grade` is being treated as an `NSDecimal`, not an
`NSDecimalNumber`. This seems incorrect, because the `NSDecimalNumber`
documentation at
https://developer.apple.com/reference/foundation/nsdecimalnumber says:

"The Swift overlay to the Foundation framework provides the Decimal
structure, which bridges to the NSDecimalNumber class. The Decimal value
type offers the same functionality as the NSDecimalNumber reference type,
and the two can be used interchangeably in Swift code that interacts with
Objective-C APIs. This behavior is similar to how Swift bridges standard
string, numeric, and collection types to their corresponding Foundation
classes."

So I'm not sure whether 1) I'm doing something wrong, 2) there's an error
in the documentation, or 3) this is a Swift bug. Number 1 on that list is
definitely the most likely, but I wanted to see what I'm missing here.

I don't want to explicitly make the values in my Swift class
`NSDecimalNumber` because then I cannot do simple arithmetic operations
such as `+` without doing the whole ugly `decimalNumberByAdding` dance.
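One workaround (just a sketch, and the property name `gradeNumber` is my own invention, not anything from the original class) is to keep the stored property as `Decimal` so Swift-side arithmetic stays natural, and expose a computed `NSDecimalNumber` alongside it for Objective-C callers:

```swift
import Foundation

@objc class Exam: NSObject {
    // Stored as Decimal so Swift code can use plain operators (+, -, *, /).
    var grade: Decimal = 90.0

    // Computed bridge for Objective-C callers, which see this as
    // NSDecimalNumber * rather than the NSDecimal struct.
    @objc var gradeNumber: NSDecimalNumber {
        return NSDecimalNumber(decimal: grade)
    }
}
```

From Objective-C you would then pass `exam.gradeNumber` wherever an `NSDecimalNumber *` is expected, while Swift code keeps using `grade` directly without the `decimalNumberByAdding` dance.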

Thanks for the help!

Chris Anderson
