[swift-evolution] [Pitch] Remove destructive consumption from Sequence

Brent Royal-Gordon brent at architechies.com
Mon Jun 27 20:42:07 CDT 2016


> [Just to complicate things... I wonder if finiteness is really
> meaningful.  It's easy to create a finite sequence that's so long that
> it's “effectively infinite.”]

And similarly, operations like `prefix(while:)` on an infinite sequence might or might not produce an infinite sequence, depending on properties of the sequence and predicate that aren't expressed in the type system.
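
For instance, a sketch using the `sequence(first:next:)` function (`naturals` and the predicates here are invented for illustration):

	let naturals = sequence(first: 1, next: { $0 + 1 })	// next never returns nil, so this is infinite
	
	// Finite result: the predicate eventually fails, so this terminates.
	let small = naturals.prefix(while: { $0 < 100 })
	
	// The predicate below always holds: a greedy prefix(while:) would
	// never return, and a lazy one would yield another infinite sequence.
	// let all = naturals.prefix(while: { $0 > 0 })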

I think there are really three cases:

* Known finite
* Known infinite
* Unknown

The most obvious way to model this is as two subtypes:

		Possibly infinite
		/			\
	Infinite		Not infinite

A known-infinite iterator protocol might refine the unknown-infinite `IteratorProtocol` like this:

	protocol InfiniteIteratorProtocol: IteratorProtocol {
		mutating func next() -> Element	// Note that the return is non-Optional!
	}
	
	extension InfiniteIteratorProtocol {
		// Satisfies IteratorProtocol's Optional-returning requirement
		// in terms of the non-Optional next() above.
		mutating func next() -> Self.Element? { return .some(next()) }
	}
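
For illustration only (`CountingIterator` is an invented name), a conforming iterator can promise an element on every call:

	struct CountingIterator: InfiniteIteratorProtocol {
		var value = 0
		// Element is inferred as Int; exhaustion is impossible by construction.
		mutating func next() -> Int {
			defer { value += 1 }
			return value
		}
	}
	
	var counter = CountingIterator()
	let first: Int = counter.next()	// 0, statically non-Optional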

Refining unknown-infinite collections to provide known-infinite collections is trickier. I think we would want something like this:

	// We don't use Optional because its Comparable conformance puts `.none` 
	// before, not after, `.some`.
	enum InfiniteCollectionIndex<ConcreteIndex: Comparable>: Comparable {
		case element(ConcreteIndex)
		case end
		
		var concreteIndex: ConcreteIndex {
			get {
				switch self {
				case .element(let i):
					return i
				case .end:
					preconditionFailure("Attempted to access the end of an infinite collection")
				}
			}
			set {
				self = .element(newValue)
			}
		}
	}
	func < <ConcreteIndex: Comparable>(lhs: InfiniteCollectionIndex<ConcreteIndex>, rhs: InfiniteCollectionIndex<ConcreteIndex>) -> Bool {
		switch (lhs, rhs) {
		case let (.element(lhsIndex), .element(rhsIndex)):
			return lhsIndex < rhsIndex
		case (.element, .end):
			return true
		default:
			return false
		}
	}
	// Comparable also requires ==, so we supply it as well.
	func == <ConcreteIndex: Comparable>(lhs: InfiniteCollectionIndex<ConcreteIndex>, rhs: InfiniteCollectionIndex<ConcreteIndex>) -> Bool {
		switch (lhs, rhs) {
		case let (.element(lhsIndex), .element(rhsIndex)):
			return lhsIndex == rhsIndex
		case (.end, .end):
			return true
		default:
			return false
		}
	}
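	
	// A quick illustrative check of the ordering (assuming the sketch
	// compiles as written): every concrete index sorts strictly before
	// .end, so .end behaves like a past-the-end index.
	let third: InfiniteCollectionIndex<Int> = .element(3)
	let pastEnd: InfiniteCollectionIndex<Int> = .end
	_ = third < pastEnd	// true: elements precede .end
	_ = pastEnd < pastEnd	// false, as a strict ordering requires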
	
	extension IndexingIterator: InfiniteIteratorProtocol where Elements: InfiniteCollection {}
		
	protocol InfiniteCollection: Collection where Index == InfiniteCollectionIndex<ConcreteIndex>, Iterator: InfiniteIteratorProtocol {
		associatedtype ConcreteIndex: Comparable
		
		var startConcreteIndex: ConcreteIndex { get }
		func index(after i: ConcreteIndex) -> ConcreteIndex
		func formIndex(after i: inout ConcreteIndex)
		
		subscript(i: ConcreteIndex) -> Iterator.Element { get }
	}
	extension InfiniteCollection {
		var startIndex: InfiniteCollectionIndex<ConcreteIndex> {
			return .element(startConcreteIndex)
		}
		var endIndex: InfiniteCollectionIndex<ConcreteIndex> {
			return .end
		}
		func index(after i: InfiniteCollectionIndex<ConcreteIndex>) -> InfiniteCollectionIndex<ConcreteIndex> {
			return .element(index(after: i.concreteIndex))
		}
		func formIndex(after i: inout InfiniteCollectionIndex<ConcreteIndex>) {
			formIndex(after: &i.concreteIndex)
		}
		subscript(i: InfiniteCollectionIndex<ConcreteIndex>) -> Iterator.Element {
			return self[i.concreteIndex]
		}
	}
	// With corresponding InfiniteBidirectionalCollection and InfiniteRandomAccessCollection types
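
With that machinery, a known-infinite collection only has to supply the concrete-index operations. A hypothetical conformance, assuming the sketch above compiles as written (`Naturals` is an invented name):

	struct Naturals: InfiniteCollection {
		var startConcreteIndex: Int { return 0 }
		func index(after i: Int) -> Int { return i + 1 }
		func formIndex(after i: inout Int) { i += 1 }
		subscript(i: Int) -> Int { return i }
	}
	
	let naturals = Naturals()
	naturals[.element(5)]	// 5
	// naturals[.end]	// traps with the precondition failure above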

But the problem with this is that, for all its complications, it probably doesn't actually provide much additional safety. "Possibly infinite" still needs to carry all the operations we ought to forbid on an infinite sequence or collection. All the "infinite" types can really do is provide implementations that trap immediately at runtime; the type system can't prevent the calls.

The alternative is to have some kind of wrapper type you put around "possibly infinite" sequences/collections to essentially assert they're finite:

	// Forbidden:
	infiniteSequence.prefix(while: { $0 < 100 }).map { … }
	// Instead, write:
	infiniteSequence.prefix(while: { $0 < 100 }).assumingFinite.map { … }
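
A minimal sketch of what that wrapper might look like (`AssumedFiniteSequence` and `assumingFinite` are placeholder names, not proposed API):

	struct AssumedFiniteSequence<Base: Sequence>: Sequence {
		let base: Base
		func makeIterator() -> Base.Iterator { return base.makeIterator() }
	}
	
	extension Sequence {
		// The caller asserts termination; in this design, the greedy
		// operations would live here instead of on possibly-infinite types.
		var assumingFinite: AssumedFiniteSequence<Self> {
			return AssumedFiniteSequence(base: self)
		}
	}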

But that once again brings us to the question: How important *is* it that we prevent greedy calls on known-infinite sequences? There aren't actually any known-infinite sequences in the standard library; even the `sequence` function can be terminated by returning `nil`. Known-infinite sequences are certainly a coherent concept, but are they something we need to model? And if not, is requiring people to say `assumingFinite` on calls where, as far as the type system is concerned, there *is* a possibility of a `nil` return really worth it?
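
For example, a sequence built with `sequence(first:next:)` stops as soon as its closure returns `nil`, so it is only ever possibly infinite:

	let oneToThree = sequence(first: 1, next: { $0 < 3 ? $0 + 1 : nil })
	Array(oneToThree)	// [1, 2, 3]: the nil return ends the sequence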

(It *is* kind of a shame that we didn't keep lazy-by-default `map` and `filter`, because the lazy algorithms are usable on infinite sequences. But I understand why that wasn't possible.)
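
To illustrate, lazy `map` computes elements on demand, so it composes safely with an infinite source (a sketch; `evens` is invented here):

	let evens = sequence(first: 0, next: { $0 + 2 })
	let doubled = evens.lazy.map { $0 * 2 }	// fine: nothing is traversed yet
	Array(doubled.prefix(3))	// [0, 4, 8]
	// let eager = evens.map { $0 * 2 }	// greedy: would never terminate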

-- 
Brent Royal-Gordon
Architechies


