Hi, I’m wondering if somebody could shed light on a kind of overload resolution error…
Background: I have an API that is intended for use by both Java and Scala client code bases. The API is meant to work with the Java SAM functional interfaces - e.g. Function, Consumer, Predicate, those from RX and so on - as well as with the Scala lambda representation via ‘Function1’. The intent is to support as much variance as possible in both worlds: on the Java side the API uses bounded wildcards, and on the Scala side plain Scala function types with their own contra- and co-variance on argument and result types.
I started off with overloading the methods - map and filter in the reduced code below, although there are others, notably flatMap (more on that later).
This is feasible and plays well for Scala clients, but as Java only has SAM conversion to handle lambdas, Java can’t choose between the Java functional interface overload and the Scala Function1 overload, leading to overload resolution failures.
I tried extracting a Java-only API with just the Java overloads into its own superinterface, so that Java clients would only see the useful method signatures, but this doesn’t play well with flatMap - it’s a long story, but it boils down to types that support flatMap not playing well with subtyping of the actual monad type itself. I attempted to work around this with self-types, nasty runtime casts, a free monad with an interpreter, F-bounds… all led to a mess and didn’t solve the problem and / or resulted in brittle, unreadable code.
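To make that flatMap obstacle concrete, here is a minimal sketch - the names are made up for illustration, this is not the real API - of why a Java-only superinterface falls apart once flatMap is involved:

```scala
import java.util.function.{Function => JavaFunction}

// Hypothetical Java-only superinterface carrying flatMap.
trait JavaOnlyTrials[+Case] {
  // flatMap has to mention the monad type itself, both in the step
  // function's result type and in its own return type...
  def flatMap[T](
      step: JavaFunction[_ >: Case, _ <: JavaOnlyTrials[T]]): JavaOnlyTrials[T]
}

trait FullTrials[+Case] extends JavaOnlyTrials[Case] {
  // ...so while the richer subinterface can covariantly narrow the return
  // type, the steps it accepts are still typed against the superinterface,
  // and nothing guarantees that a step actually yields a FullTrials.
  override def flatMap[T](
      step: JavaFunction[_ >: Case, _ <: JavaOnlyTrials[T]]): FullTrials[T]
}

// A toy implementation is then pushed towards exactly the runtime casts
// mentioned above: the step is free to return any JavaOnlyTrials[T].
final case class Only[+Case](value: Case) extends FullTrials[Case] {
  override def flatMap[T](
      step: JavaFunction[_ >: Case, _ <: JavaOnlyTrials[T]]): FullTrials[T] =
    step(value).asInstanceOf[FullTrials[T]] // brittle!
}
```

The cast is the tell: once the monad type appears in flatMap’s signature, every subtype either re-declares the whole signature or smuggles the refinement back in at runtime.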
The latest wheeze is to use a syntax enhancement so that the core Java API stays in the interface and there is an implicit syntax class that decorates it with the Scala methods, as a kind of pseudo-overloading.
Annoyingly, this sometimes works and sometimes doesn’t - I’ve tried to reduce the difference but can’t quite pin down what makes it break.
Any pointers as to what is going wrong?
Below are the reduced API and a test to illustrate the problem: map doesn’t work, but filter does…
I’m using Scala 2.12.11.
Regards,
Gerard
Exhibit A - an API for Java and Scala:
package com.sageserpent.americium

import _root_.java.util.function.{Predicate, Function => JavaFunction}

object Trials {

  /**
    * This is a syntax enhancement for {@link Trials}. It is required to avoid
    * placing the enhancing methods as competing overloads alongside the
    * Java-facing methods already defined in {@link Trials}: while that
    * alternative would be perfectly OK for Scala client code, it confuses the
    * Java compiler and would force Java client code to specify lots of
    * disambiguating casts of lambda values to select the correct overload.
    */
  implicit class ScalaApiSyntax[Case](trials: Trials[Case]) {
    def map[TransformedCase](
        transform: Case => TransformedCase): Trials[TransformedCase] =
      trials.map(transform(_))

    def filter(predicate: Case => Boolean): Trials[Case] =
      trials.filter(predicate(_))
  }
}

trait Trials[+Case] {
  // Java API ...
  def map[TransformedCase](transform: JavaFunction[_ >: Case, TransformedCase])
    : Trials[TransformedCase]

  def filter(predicate: Predicate[_ >: Case]): Trials[Case]
}
Exhibit B - some syntax testing:
package com.sageserpent.americium

import org.scalamock.scalatest.MockFactory
import org.scalatest.{FlatSpec, Matchers}

import _root_.java.util.function.{Predicate, Function => JavaFunction}

class TrialsSpec extends FlatSpec with Matchers with MockFactory {
  val sut: Trials[Int] = stub[Trials[Int]]

  // PASSES
  "mapping using a Java function" should "compile" in {
    assertCompiles("sut.map((_ + 1): JavaFunction[Int, Int])")
  }

  // FAILS
  "mapping using a Scala function" should "compile" in {
    assertCompiles("sut.map((_ + 1): Int => Int)")
  }

  // PASSES
  "filtering using a Java function" should "compile" in {
    assertCompiles("sut.filter((1 == _): Predicate[Int])")
  }

  // PASSES
  "filtering using a Scala function" should "compile" in {
    assertCompiles("sut.filter((1 == _): Int => Boolean)")
  }
}
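For what it’s worth, a self-contained reduction outside ScalaTest behaves the same way for me, and applying the wrapper class by hand does compile - which seems to point at the implicit-view fallback rather than at the enhancement method itself. (This mirrors Exhibit A; the names are the same, the probe method is just for illustration.)

```scala
import java.util.function.{Predicate, Function => JavaFunction}

// A self-contained reduction mirroring Exhibit A.
trait Trials[+Case] {
  def map[TransformedCase](
      transform: JavaFunction[_ >: Case, TransformedCase]): Trials[TransformedCase]
  def filter(predicate: Predicate[_ >: Case]): Trials[Case]
}

object Trials {
  implicit class ScalaApiSyntax[Case](trials: Trials[Case]) {
    def map[TransformedCase](
        transform: Case => TransformedCase): Trials[TransformedCase] =
      trials.map(transform(_))
    def filter(predicate: Case => Boolean): Trials[Case] =
      trials.filter(predicate(_))
  }
}

object Reduction {
  def probe(sut: Trials[Int]): Trials[Int] = {
    // Does NOT compile on 2.12.11, as in Exhibit B:
    // sut.map((_ + 1): Int => Int)

    // DOES compile when the implicit conversion is written out explicitly:
    new Trials.ScalaApiSyntax(sut).map((_ + 1): Int => Int)
  }
}
```

So the wrapper’s map typechecks fine against a Scala function; it is only the implicit search that declines to kick in for the polymorphic case.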