[swift-evolution] [Pitch] Add the DefaultConstructible protocol to the standard library

Tony Allevato tony.allevato at gmail.com
Mon Dec 26 15:25:57 CST 2016


On Mon, Dec 26, 2016 at 12:53 PM Adam Nemecek <adamnemecek at gmail.com> wrote:

> > The fact that certain bit-patterns correspond to zero and that zero has
> been a default in other languages is an artifact of their hardware
> representation, but not necessarily one that should motivate higher level
> design.
>
> Zero isn't just a bit pattern. It's the philosophical idea of nothingness,
> void, nihil. The fact that we found a common pattern between addition and
> multiplication doesn't mean that nothingness has lost its meaning.
>

This shows that zero is a special value with special meanings in certain
contexts. That still doesn't imply that it should be a "default" value for
that type.


>
> > It's a circular argument to use that as a reason for why default
> constructibility should exist as a consequence of that.
>
> Can you provide an example where a non-zero value for a value-like type
> makes more sense than all zeros? Also, it's not circular; I need to work
> with types that have a placeholder value. Most types have only one such
> value.
>

I'm not trying to make the argument that there would be a better default
value than zero. I'm arguing that there should not be a "default" value at
all.


> >  (One example: an ORM where objects have to be associated with an
> originating "context"; either the type needs to have an initializer that
> takes a context, or the context is a factory that returns instances.)
>
> In ActiveRecord, the Rails ORM, I can call User.new on a User model,
> correct? I'm relatively sure most other ORMs have similar functionality.
>

It's just an example for illustration—don't focus specifically on the ORM
part, but on the part where it's reasonable for a type to have a notion of
a "default" value that is dependent on external state/context, and
therefore cannot be initialized with init().
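
To illustrate, here's a rough, hypothetical sketch (none of these type names
are real APIs):

    // The protocol being pitched, roughly:
    protocol DefaultConstructible {
        init()
    }

    // Placeholder for the external state an instance depends on.
    final class DatabaseContext {}

    // An ORM-style managed object whose "blank" value only makes sense
    // relative to a context. Its only sensible initializer requires that
    // context, so it cannot provide init() and cannot conform above.
    final class ManagedRecord {
        let context: DatabaseContext

        init(context: DatabaseContext) {
            self.context = context
        }
    }

An algorithm constrained to DefaultConstructible could never accept a type
like ManagedRecord, even though it has a perfectly reasonable notion of a
"blank" value.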



>
> On Mon, Dec 26, 2016 at 12:43 PM, Tony Allevato <tony.allevato at gmail.com>
> wrote:
>
> On Mon, Dec 26, 2016 at 12:15 PM Adam Nemecek <adamnemecek at gmail.com>
> wrote:
>
> > The all-zero bit pattern represents the integer zero—that's not the same
> as whether it represents the best "default" value for an integer as a
> higher-level concept, or whether such a default should exist in the first
> place.
>
> It represents a sensible value to initialize an int to when I want to
> initialize an array of ints to a certain size. There is a reason why you
> zero out memory but you don't "one out" memory.
>
>
> It was previously mentioned in this thread, but Swift explicitly made the
> choice to *not* initialize values to zero by default. If you want to
> initialize an array of ints, you can just as easily pass in the default value
> that you want. If you want to do this more generically, you can still push
> the responsibility to the caller, which is arguably better because it
> doesn't restrict your algorithm to only those data types that conform to a
> particular protocol.
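>
> To make that concrete, here's a rough sketch (the names are hypothetical) of
> what I mean by pushing that choice to the caller:
>
>     func makeBuffer<T>(count: Int, repeating value: T) -> [T] {
>         // No DefaultConstructible constraint; the caller decides what
>         // "default" means for their use case.
>         return Array(repeating: value, count: count)
>     }
>
>     let zeros = makeBuffer(count: 8, repeating: 0)  // zero if you want it
>     let ones = makeBuffer(count: 8, repeating: 1)   // or any other value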
>
> The fact that certain bit-patterns correspond to zero and that zero has
> been a default in other languages is an artifact of their hardware
> representation, but not necessarily one that should motivate higher level
> design.
>
>
> > That doesn't explain why the additive identity is more special than the
> multiplicative one. It just argues that it's more convenient for a
> particular use case.
>
> Because the identity associated with the default initializer, as it stands, is
> the additive identity, which means the default operation is addition.
>
>
> I should have phrased more carefully: it doesn't explain why the additive
> identity *should be* more special.
>
> There is no "default operation" over integers. The fact that the additive
> identity also happens to be the all zero bit-pattern which also happens to
> be the default initialized state isn't an argument for why it *should* be
> that way—it's just explaining what the current state of things is. It's a
> circular argument to use that as a reason for why default constructibility
> should exist as a consequence of that.
>
> > Because encapsulation. There's a reason NSObject has a default
> initializer.
>
> That's not an encapsulation that works everywhere. Consider this: there
> exist types that might have reasonable "default" values but for which the
> default initializer isn't an appropriate or possible way to express it.
> (One example: an ORM where objects have to be associated with an
> originating "context"; either the type needs to have an initializer that
> takes a context, or the context is a factory that returns instances.) By
> writing an algorithm that relies on DefaultConstructible, you prohibit
> those types from being used there. If the algorithm pushes the
> responsibility to the caller, as Array(repeating:count:) does, then it can
> work for both types. It's *more* general and more powerful than the
> DefaultConstructible version.
>
> The reason NSObject has a default initializer is that every object
> inherits from it and it must have a sensible default initializer to end the
> call chain, and there's nothing meaningful that needs to be parameterized
> for NSObject. If you call [[NSObject alloc] init] by itself, you get back
> an object that doesn't really do much. But I'm not sure what that has to do
> with generic algorithms—what about all the types that extend NSObject that
> don't have default initializers?
>
>
>
>
>
> On Mon, Dec 26, 2016 at 11:57 AM, Tony Allevato <tony.allevato at gmail.com>
> wrote:
>
> On Mon, Dec 26, 2016 at 11:43 AM Adam Nemecek via swift-evolution <
> swift-evolution at swift.org> wrote:
>
> > For integers, 0 is an additive identity. Is there a reason it should be
> given special treatment over 1, the multiplicative identity?
>
> E.g. for statistical reasons. When I have a collection of users with an age,
> etc., it makes sense to ask what the combined age of the collection is. What
> is the semantic meaning of multiplying their ages?
>
>
> That doesn't explain why the additive identity is more special than the
> multiplicative one. It just argues that it's more convenient for a
> particular use case.
>
> I would turn your example around—if you're interested enough in thorough
> type design that you feel that a DefaultConstructible protocol would be
> useful here, then I offer that a better and safer design would be to create
> an "Age" value type (or, more generally, measurement types with concepts of
> units) if you want compile-time safety and to limit the supported
> operations to only those that make sense. "Int" is arguably too wide a
> type to represent an age in a public API because it would allow two ages to
> be multiplied together, as you said.
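>
> A rough sketch of the shape such a type could take (purely hypothetical,
> just to illustrate the idea):
>
>     struct Age {
>         var years: Int
>
>         // Adding ages is meaningful, so we define it...
>         static func + (lhs: Age, rhs: Age) -> Age {
>             return Age(years: lhs.years + rhs.years)
>         }
>         // ...but no * is defined, so multiplying two ages doesn't compile.
>     }
>
>     let combined = Age(years: 34) + Age(years: 29)    // fine
>     // let nonsense = Age(years: 34) * Age(years: 29) // error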
>
>
>
> > Mathematically, identities are associated with (type, operation) pairs,
> not types alone.
>
> Correct; however, we aren't talking about mathematics. We are talking about
> the implementation of a language that runs on very concrete architectures
> where very concrete bit patterns mean very concrete things that are
> unlikely to change any time soon.
>
>
> The all-zero bit pattern represents the integer zero—that's not the same
> as whether it represents the best "default" value for an integer as a
> higher-level concept, or whether such a default should exist in the first
> place.
>
>
>
> On Mon, Dec 26, 2016 at 11:35 AM, Tony Allevato <allevato at google.com>
> wrote:
>
> For integers, 0 is an additive identity. Is there a reason it should be
> given special treatment over 1, the multiplicative identity? Historically
> the only reason is that it has the all-clear bit pattern.
>
> Mathematically, identities are associated with (type, operation) pairs,
> not types alone.
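>
> To illustrate, here's a hypothetical sketch in which each identity belongs
> to an operation rather than to the bare type:
>
>     protocol Monoid {
>         static var identity: Self { get }
>         static func combine(_ lhs: Self, _ rhs: Self) -> Self
>     }
>
>     // Two different monoids over the same underlying Int:
>     struct Sum: Monoid {
>         var value: Int
>         static var identity: Sum { return Sum(value: 0) }
>         static func combine(_ lhs: Sum, _ rhs: Sum) -> Sum {
>             return Sum(value: lhs.value + rhs.value)
>         }
>     }
>
>     struct Product: Monoid {
>         var value: Int
>         static var identity: Product { return Product(value: 1) }
>         static func combine(_ lhs: Product, _ rhs: Product) -> Product {
>             return Product(value: lhs.value * rhs.value)
>         }
>     }
>
> Neither 0 nor 1 is privileged there; each identity comes from its operation.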
>
> This conversation has put me in the column of "numeric types shouldn't
> have default initializers at all", personally.
> On Mon, Dec 26, 2016 at 11:27 AM Adam Nemecek via swift-evolution <
> swift-evolution at swift.org> wrote:
>
> The elements already have an identity: the one that you get when you
> invoke the default constructor. It's 0 for Int, "" for String.
>
> On Mon, Dec 26, 2016 at 11:24 AM, David Sweeris <davesweeris at mac.com>
> wrote:
>
>
> On Dec 26, 2016, at 11:12, Tino Heth via swift-evolution <
> swift-evolution at swift.org> wrote:
>
> There is an older discussion that is somewhat linked to this topic:
> "Removing the empty initialiser requirement from
> RangeReplaceableCollection"
>
> https://lists.swift.org/pipermail/swift-evolution/Week-of-Mon-20160704/023642.html
>
> Imho "DefaultConstructible" types can be very handy, but so far, it seems
> no one has presented a single use case that is important enough to justify
> the inclusion in the stdlib.
> On the other hand, I'm quite sure that there's much functionality in the
> stdlib that many people consider as superfluous…
>
> I guess adding the protocol wouldn't have a big impact on size, so for
> me, the question is "Does this protocol confuse users of Swift?", which I'd
> answer with "yes, possibly" (unless someone comes up with a name that is
> more intuitive).
>
>
> "Identity", but, at least for many numeric types, you'd need a mechanism
> for specifying which one you mean.
>
> - Dave Sweeris
>
>