[swift-server-dev] Next HTTP API meeting

Brent Royal-Gordon brent at architechies.com
Tue Mar 28 06:21:41 CDT 2017


> On Mar 27, 2017, at 5:13 AM, Logan Wright <logan at qutheory.io> wrote:
> 
> I disagree with the premise that protocols inherently create bugs

To be clear, I am *not* saying that they inherently create bugs. What I'm saying is that they *often* create bugs *if* they contain mutating members *and* do not specify whether they expect value or reference semantics.

Here's an example: `RangeReplaceableCollection` does not specify whether conforming types should have value or reference semantics. As a result, there's a heart-stopping FIXME in the middle of the `+` operator's implementation:

	public func +<
	  RRC1 : RangeReplaceableCollection,
	  RRC2 : RangeReplaceableCollection
	>(lhs: RRC1, rhs: RRC2) -> RRC1
	  where RRC1.Iterator.Element == RRC2.Iterator.Element {
	
	  var lhs = lhs
	  // FIXME: what if lhs is a reference type?  This will mutate it.
	  lhs.reserveCapacity(lhs.count + numericCast(rhs.count))
	  lhs.append(contentsOf: rhs)
	  return lhs
	}

*This is the `+` operator!* It ships with Swift! And it will do the wrong thing if you use it with a reference type, because it's very difficult to write a generic algorithm with mutating operations which does the *right* thing when it's not sure whether the type provides value or reference semantics. It's possible, but tricky, to do it when you create the instances and completely control them right until the moment when you finally hand them off to the caller—you basically just have to pretend it's a move-only type. It's virtually impossible when you have to mutate something somebody passed to you.
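
To make the failure concrete, here's a minimal sketch. `RefList` is a hypothetical class invented purely for illustration (it is not part of the standard library), and the conformance is written in the same Swift 3-era style as the code quoted above:

	// Hypothetical reference type conforming to RangeReplaceableCollection,
	// wrapping an array for storage.
	final class RefList<T> : RangeReplaceableCollection {
	  private var storage: [T] = []
	
	  // Collection requirements, forwarded to the backing array.
	  var startIndex: Int { return storage.startIndex }
	  var endIndex: Int { return storage.endIndex }
	  subscript(position: Int) -> T { return storage[position] }
	  func index(after i: Int) -> Int { return storage.index(after: i) }
	
	  // RangeReplaceableCollection requirements; everything else comes
	  // from default implementations.
	  init() {}
	  func replaceSubrange<C : Collection>(_ subrange: Range<Int>, with newElements: C)
	    where C.Iterator.Element == T {
	    storage.replaceSubrange(subrange, with: newElements)
	  }
	}
	
	let list = RefList([1, 2])
	let sum = list + [3, 4]
	print(Array(list))    // [1, 2, 3, 4] -- `list` itself was mutated
	print(sum === list)   // true -- no new collection was ever created

The `var lhs = lhs` at the top of `+` copies only the *reference*, so the "copy" and the caller's original are the same object.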

That's why this:

> If we create protocols, this allows frameworks to implement concrete models with whatever semantics they see fit. Any subsequent interaction from the framework would have a concrete type to rely on.

Doesn't really work. People can't reliably use your type if they don't know what semantics to expect.

> If people feel extremely strong that there needs to be a concrete type, then I'd like to push for reference type as much as possible. As far as reference vs value type, I haven't really heard an argument for value types beyond what feels like a reaction to value types being the hip new hotness. While yes, they're great in Swift, and there's tons of places that should absolutely be modeled with value semantics, a request/response interaction represents a single request and should definitely be a reference based interaction.

Here's the argument for value semantics:

1. A good web framework is going to have several different, decoupled components written by different teams. If you have shared mutable state, then a mutation by one component will be seen by the other components, which may not be expecting it.

2. A good web framework will almost certainly be concurrent. If you have shared, mutable state *with concurrency*, then you have to synchronize access to that state or your users will spend the rest of their lives squashing Heisenbugs. (Thread confinement is a synchronization strategy, of course.) The sketch after this list shows the kind of guarding I mean.
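
Concretely, a reference-type request shared across threads has to guard every access to its own state. This is a hypothetical sketch using a serial DispatchQueue, which is one common strategy; a value-type request crossing a thread boundary is simply copied, so there is nothing to guard:

	import Dispatch
	
	// Hypothetical reference-type request: all reads and writes must be
	// funneled through a serial queue to stay thread-safe.
	final class SharedRequest {
	  private let queue = DispatchQueue(label: "request.state")
	  private var _headers: [String: String] = [:]
	
	  var headers: [String: String] {
	    return queue.sync { _headers }
	  }
	
	  func setHeader(_ value: String, forName name: String) {
	    queue.sync { _headers[name] = value }
	  }
	}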

With reference types, careful programmers can avoid the bugs that come from mutating shared state; with value types, that safety is automatic, and there's simply no reason to worry about this class of bug.

The request and response are probably going to be very widely distributed. That means there's a *lot* of code which could screw things up for everybody else. If the request and response are mutable, giving them value semantics is likely to reduce bugs by removing a vector for two pieces of code to interact poorly.
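
Here's what that vector looks like in practice. (Both `Request` types below are hypothetical, purely for illustration.)

	// Reference semantics: one middleware's mutation leaks to every
	// other holder of the request.
	final class RefRequest {
	  var headers: [String: String] = [:]
	}
	
	func annotate(_ request: RefRequest) {
	  request.headers["X-Trace-ID"] = "abc123"  // mutates the shared object
	}
	
	let shared = RefRequest()
	annotate(shared)
	print(shared.headers)    // ["X-Trace-ID": "abc123"] -- seen by everyone
	
	// Value semantics: the middleware mutates its own copy; the caller's
	// request is untouched unless the copy is explicitly handed back.
	struct ValueRequest {
	  var headers: [String: String] = [:]
	}
	
	func annotate(_ request: ValueRequest) {
	  var request = request
	  request.headers["X-Trace-ID"] = "abc123"  // local copy only
	}
	
	let original = ValueRequest()
	annotate(original)
	print(original.headers)  // [:] -- no action at a distance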

> In practice, we went through several model iterations and the value type created very high levels of bugs and confusion for end users. The three biggest problems we had were as follows:
> 
> - Greatly increased bug levels and confusion related to unexpected mutation

Can you explain the sort of bugs you saw in more detail? I have an easy time understanding how the "action-at-a-distance" behavior of a reference type would cause bugs. I have a much harder time understanding how mutations *not* being seen elsewhere would cause bugs, other than "nothing really misbehaved, I just didn't know how this works".

> - Unnecessary code requirements added to every single passive access (ie: middleware) increasing code bloat unnecessarily

What does "unnecessary code requirements" mean here?

> - Extreme performance loss due to massive copy overhead

Did you attempt to reduce copying overhead by making your value types COW wrappers? If so, what was your experience with that approach?
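
For anyone following along, the pattern I'm referring to looks roughly like this; the `Body` type here is hypothetical:

	// A value type whose payload lives in a privately-held class instance,
	// copied lazily on mutation (copy-on-write).
	final class BodyStorage {
	  var bytes: [UInt8]
	  init(bytes: [UInt8]) { self.bytes = bytes }
	}
	
	struct Body {
	  private var storage: BodyStorage
	
	  init(bytes: [UInt8] = []) {
	    storage = BodyStorage(bytes: bytes)
	  }
	
	  var bytes: [UInt8] {
	    get { return storage.bytes }
	    set {
	      // Duplicate the storage only when it's actually shared.
	      if !isKnownUniquelyReferenced(&storage) {
	        storage = BodyStorage(bytes: storage.bytes)
	      }
	      storage.bytes = newValue
	    }
	  }
	}

Copying a `Body` is O(1); the payload is duplicated only when a shared value is actually mutated, so value semantics don't have to imply copying the whole payload on every handoff.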

-- 
Brent Royal-Gordon
Architechies


