I’m writing a parser that needs a way to convert strings into tokens:
```scala
def parse1[A](line: String)(using conv: String => A): A = conv(line)
```
(The actual parser does a little more…)
I can use it as `parse1("0")(using _.toInt)`, but this also works:
```scala
val a = parse1("a")
```

because `identity` is implicitly available as `String => String`.
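This can be checked directly (a minimal self-contained sketch, repeating `parse1` from above):

```scala
def parse1[A](line: String)(using conv: String => A): A = conv(line)

// No conversion is passed and no type argument is given, yet this
// compiles: the compiler summons a String => String that behaves
// like identity, so A is inferred as String.
val a = parse1("a")
assert(a == "a")
```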
So far, so good, but one little issue with this approach is implicit functions popping out of nowhere. For instance, inside Scalatest, a conversion from `String` to `Equalizer` got in the way. As a result, using `parse(...)` instead of `parse[SomeType](...)` wasn’t caught at compile time and led to weird test failures.
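The hazard can be reproduced with a hypothetical ambient conversion (here `toLen`, an illustrative stand-in for Scalatest’s `String => Equalizer`, not the actual one):

```scala
def parse1[A](line: String)(using conv: String => A): A = conv(line)

// Hypothetical ambient conversion, standing in for a library-provided
// String => SomethingElse. A locally defined given wins implicit search.
given toLen: (String => Int) = _.length

// No type argument: A is silently inferred as Int via toLen,
// and this compiles without complaint.
val x = parse1("123")
assert(x == 3) // "123".length, not the intended parse to 123
```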
In an attempt to improve robustness, I rewrote the parser as:

```scala
def parse2[A](line: String)(using Conversion[String, A]): A = line.convert
```

`parse2("0")(using _.toInt)` still works, but I was surprised that this didn’t:
```scala
val b = parse2[String]("b")
```
Isn’t there a `given` somewhere that can play the part of `Conversion[String, String]` (or `Conversion[A, A]`, for that matter)?
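As far as I can tell, the standard library deliberately ships no universal `Conversion[A, A]` given (a `Conversion` instance is also an implicit-conversion trigger, so a blanket one would be risky). A sketch of supplying one locally, with `parse2` rewritten to apply a named instance (`idConv` is an illustrative name):

```scala
// Sketch: no Conversion[A, A] given exists in the standard library,
// but an identity instance can be defined by hand.
given idConv[A]: Conversion[A, A] = a => a

def parse2[A](line: String)(using conv: Conversion[String, A]): A =
  conv(line)

val b = parse2[String]("b") // now compiles, resolved via idConv
assert(b == "b")
```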
(Side question: are `String => A` and `Conversion[String, A]` equally wrong as a design choice in this case? Should a specific type be introduced instead?)
https://scastie.scala-lang.org/7OyKIrP6TOSXsouh1HlRJQ
EDIT: I’ve answered the second question myself: it’s way too dangerous to have a `Conversion[String, Int]` floating around… I’m still curious about the failure to summon a `Conversion[A, A]` when needed, though.
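For the “specific type” option, one possible sketch is a small dedicated type class (names here are illustrative, not from the post). Because its instances are only found when a `Parse[A]` is explicitly requested, ambient `String => A` functions and `Conversion`s can no longer sneak in:

```scala
// A dedicated type class: only values declared as Parse[A] participate
// in resolution, so no unrelated String => A can interfere.
trait Parse[A]:
  def apply(line: String): A

object Parse:
  given Parse[Int]    = _.toInt
  given Parse[String] = line => line

def parse3[A](line: String)(using p: Parse[A]): A = p(line)

val n = parse3[Int]("42")
val s = parse3[String]("s")
assert(n == 42 && s == "s")
```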