I defined a simple type-level function to compute the sum of two Nats. Oddly, in the match type, the scrutinee (_0, _1)
seems to match the pattern (_, _0)
…
Here is the code (or in Scastie):
sealed trait NatT
final case class Zero() extends NatT
final case class Succ[N <: NatT](n: N) extends NatT

type Sum[X <: NatT, Y <: NatT] = (X, Y) match
  case (_, _0) => X
  case (_0, _) => Y
  case (Succ[x], Succ[y]) => Succ[Succ[Sum[x, y]]]

@main def main(): Unit =
  _1: Sum[_1, _0]
  _1: Sum[_0, _1]

type _0 = Zero
type _1 = Succ[_0]
val _0: _0 = Zero()
val _1: _1 = Succ(_0)
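My current suspicion is that _0 in pattern position is being parsed as a fresh type-variable binding rather than a reference to the alias (Scala's lexical syntax classifies a leading underscore as a lowercase letter, and lowercase names in type patterns are binders). A variant I sketched that uses explicit lowercase binders and refers to the class Zero directly does reduce in both directions:

```scala
sealed trait NatT
final case class Zero() extends NatT
final case class Succ[N <: NatT](n: N) extends NatT

// Patterns refer to the (uppercase) class Zero directly, and the
// lowercase names x and y are deliberate type-variable bindings,
// so nothing in pattern position is accidentally a binder.
type Sum[X <: NatT, Y <: NatT] = (X, Y) match
  case (x, Zero) => x
  case (Zero, y) => y
  case (Succ[x], Succ[y]) => Succ[Succ[Sum[x, y]]]

type _0 = Zero
type _1 = Succ[_0]

// Both ascriptions now typecheck: Sum[_0, _1] reduces to _1, not _0.
val oneLeft: Sum[_1, _0] = Succ(Zero())
val oneRight: Sum[_0, _1] = Succ(Zero())
```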
The expanded compiler error:
12 | _1: Sum[_0, _1]
| ^^
| Found: (_1 : _1)
| Required: Sum[_0, _1]
|----------------------------------------------------------------------------
| Explanation (enabled by `-explain`)
|- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
|
| Tree: _1
| I tried to show that
| (_1 : _1)
| conforms to
| Sum[_0, _1]
| but the comparison trace ended with `false`:
|
| ==> (_1 : _1) <: Sum[_0, _1]
| ==> (_1 : _1) <: _0
| ==> (_1 : _1) <: Zero
| ==> _1 <: Zero (left is approximated)
| ==> Succ[_0] <: Zero (left is approximated)
| <== Succ[_0] <: Zero (left is approximated) = false
| <== _1 <: Zero (left is approximated) = false
| <== (_1 : _1) <: Zero = false
| <== (_1 : _1) <: _0 = false
| <== (_1 : _1) <: Sum[_0, _1] = false
|
| The tests were made under the empty constraint
----------------------------------------------------------------------------
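To isolate what the trace suggests, here is a minimal check I sketched of the binder hypothesis: if an underscore-leading name in a match-type pattern is a fresh type variable rather than a reference to the alias, then a pattern written _0 should match every Nat, not just Zero:

```scala
sealed trait NatT
final case class Zero() extends NatT
final case class Succ[N <: NatT](n: N) extends NatT

type _0 = Zero

// If `_0` in the pattern referred to the alias above, IsZero[Succ[Zero]]
// would not reduce to `true`. If it is a fresh binder, it matches anything.
type IsZero[X <: NatT] = X match
  case _0 => true

// This compiles, i.e. the case matched Succ[Zero]: `_0` acted as a binder.
val surprising: IsZero[Succ[Zero]] = true
```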