# Confusion about type conversion from Seq[Int] to Seq[Double]

Hi All.

I am a bit unclear on where and how what appears to be a type conversion takes place in the following example. Let's say I define a function (all shown in the Ammonite shell):

```scala
@ def f(x: Seq[Double]): Double = x.sum
defined function f
```

Then the following works:

```scala
@ f(Seq(1, 2, 3))
res4: Double = 6.0
```

But this one does not:

```scala
@ val s = Seq(1, 2, 3)
s: Seq[Int] = List(1, 2, 3)

@ f(s)
cmd11.sc:1: type mismatch;
 found   : Seq[Int]
 required: Seq[Double]
val s = Seq(1, 2, 3); val res11_1 = f(s)
                                      ^
Compilation Failed
```

So the question is: where is this type conversion defined, why does it not work in the second case, and how can it be made to work there?

Inference goes both ways: if a method takes `Seq[Double]` and you give it a `Seq` without an explicit type, the compiler infers `Seq[Double]` and converts the numbers to `Double`s. In the second case you've separated the steps, so when inferring the type of `val s` the compiler doesn't yet take into consideration that it will be passed to `f`. Therefore it infers `Seq[Int]`, and that can't be passed to `f`.

The solution could be explicit typing, i.e.:

```scala
val s = Seq[Double](1, 2, 3)
f(s) // works
```

Alternatives:

```scala
val s = Seq[Double](1, 2, 3)
val s = Seq(1.0, 2.0, 3.0)
val s: Seq[Double] = Seq(1, 2, 3)
val s = Seq(1, 2, 3): Seq[Double]
```

All of them result in `Seq[Double]`.


It should be noted that while the compiler does some backwards inference, it is very limited. For example, while `f(Seq(1, 2, 3))` works, `f(Seq(1, 2, 3) :+ 4)` already fails.
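For instance (a sketch assuming Scala 2, as in the thread): the receiver `Seq(1, 2, 3)` is typed as `Seq[Int]` before the expected type `Seq[Double]` can reach its literals, so an explicit conversion such as `.map(_.toDouble)` is one way around it:

```scala
def f(x: Seq[Double]): Double = x.sum

// f(Seq(1, 2, 3) :+ 4)  // fails: the receiver Seq(1, 2, 3) is typed
//                       // as Seq[Int] before the expected type applies

val widened = (Seq(1, 2, 3) :+ 4).map(_.toDouble) // convert explicitly
val total = f(widened)
```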

Numeric literals are processed in the context of an expected type. If no particular type is expected, then e.g. `1` is inferred to be an `Int`. But if a `Double` is expected, `1` is treated as if you had written `1D`.
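A minimal illustration of the literal rule, with the expected type supplied by an explicit annotation:

```scala
val a = 1          // no expected type: the literal is inferred as Int
val b: Double = 1  // expected type Double: the literal is read as 1.0
```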

In `val s = Seq(1, 2, 3)` no particular type is expected, so first `1`, `2`, and `3` are assumed to be `Int`, and that causes the expression as a whole to have `Seq[Int]`.

But in `f(Seq(1, 2, 3))`, the compiler knows that `f` expects a `Seq[Double]`, so `Seq(1, 2, 3)` is expected to be that type, and that in turn causes `1` and `2` and `3` to have an expected type of `Double`.

This is a really good pair of examples for understanding the bidirectional nature of type inference. Expected types flow inward; inferred types flow outward.
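The two directions side by side, as a small sketch:

```scala
def f(x: Seq[Double]): Double = x.sum

// Expected types flow inward: f's parameter type reaches the literals,
// so 1, 2, 3 are read as Doubles.
val inward = f(Seq(1, 2, 3))

// Inferred types flow outward: with no expected type, the literals are
// Ints, so the expression as a whole is inferred as Seq[Int].
val outward = Seq(1, 2, 3)
```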


@tarsa's explanation is correct, but one additional detail is worth noting: this isn't conventional type conversion, it's a very special edge case called "numeric widening"; you can find the technical definition in the Scala spec.

It's very precise but a little ad hoc, one of those things that got added to the language because people intuitively tend to want it, but less elegant than we might wish. There have been occasional proposals to make this something more general and well-defined, but they haven't gone anywhere. As a result, Scala 3 will probably restrict the idea to just the common case you show here: allowing integer literals to be used as other types where that makes sense, but nothing else.
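As a rough illustration of how narrow the rule is (assuming Scala 2): widening applies to a bare `Int` where a `Double` is expected, but it does not lift through a type constructor like `Seq`:

```scala
def g(d: Double): Double = d

val i: Int = 5
val direct = g(i)  // compiles: the Int value is widened to Double

val s: Seq[Int] = Seq(1, 2, 3)
// def f(x: Seq[Double]) = x.sum
// f(s)  // still a type mismatch: widening does not apply inside Seq
val converted = s.map(_.toDouble)  // explicit conversion works
```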


Thank you all, it is all clear now. Nevertheless, it does feel somewhat unsatisfying, but I guess there is not much we can do about it.