I’d like a sorted set of items (so maybe TreeSet) - however, I’d like to define my own toString() method. There need not be any other changes to the Scala collection class.
I would like to redefine how, say, a List is shown - possibly as
"[" + elems.mkString(",") + "]".
Now, going the Cats route: if my list were inside a Set or another collection, would I need to redefine how each of those containers shows its items? Otherwise they would use their default toString, e.g. a Set containing a List would still render as Set(List(1,2,3))?
You’ll have to forget about #toString() and go all in on Show. The tricky part with multiple/mixed implicit declarations for the same type is getting the scoping right (if “right” is even possible).
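To make that concrete, here is a minimal, dependency-free sketch; the Show trait and the .show extension below are hand-rolled stand-ins for cats.Show and cats.syntax.show._, and customListShow implements the formatting from your question:

```scala
// hand-rolled stand-ins for cats.Show and cats.syntax.show._
trait Show[A] { def show(a: A): String }

implicit class ShowOps[A](val a: A) {
  def show(implicit sa: Show[A]): String = sa.show(a)
}

// the custom List rendering from the question
implicit def customListShow[A]: Show[List[A]] =
  new Show[List[A]] {
    def show(xs: List[A]): String = "[" + xs.mkString(",") + "]"
  }

val rendered = List(1, 2, 3).show  // "[1,2,3]" instead of "List(1, 2, 3)"
```

Only the explicit .show call picks this up - plain toString, and anything built on it, still prints List(1, 2, 3).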
Thanks - so if you redefined Show for a Set and put some showable lists in it, would you have to recursively call show on the items? I seem to need to, but this wipes the type and converts them to Strings…
In the REPL, you’ll have to invoke #show() explicitly, unless your REPL allows overriding the standard rendering based on #toString(). The standard Scala REPL probably doesn’t support this; no idea whether Ammonite does.
You already do, kind of - customListShow has precedence over catsStdShowForList for implicit resolution.
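On the recursion question: yes, the outer instance calls show on its elements, but through their Show instances, not via toString, so nothing is lost at definition time. A self-contained sketch (with a hand-rolled Show standing in for cats.Show):

```scala
trait Show[A] { def show(a: A): String }

implicit class ShowOps[A](val a: A) {
  def show(implicit sa: Show[A]): String = sa.show(a)
}

implicit val intShow: Show[Int] =
  new Show[Int] { def show(i: Int): String = i.toString }

implicit def customListShow[A](implicit sa: Show[A]): Show[List[A]] =
  new Show[List[A]] {
    // delegate each element to its own Show instance
    def show(xs: List[A]): String = xs.map(sa.show).mkString("[", ",", "]")
  }

implicit def customSetShow[A](implicit sa: Show[A]): Show[Set[A]] =
  new Show[Set[A]] {
    def show(xs: Set[A]): String = xs.map(sa.show).mkString("{", ",", "}")
  }

// the Set instance picks up the custom List format for its elements
val nested = Set(List(1, 2), List(3)).show
```

So a Set of Lists renders as "{[1,2],[3]}" rather than falling back to Set(List(1, 2), List(3)).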
So would it be correct to say that since built-in types already have a toString method, an implicit Show wouldn’t work (say, if you wanted to redefine Option formatting) - unless it’s called from another type whose instance calls show on its elements?
import cats.Show
import cats.syntax.show._  // for a.show

implicit def myShowForOption[A: Show]: Show[Option[A]] =
  new Show[Option[A]] {
    def show(opt: Option[A]): String = opt match {
      case Some(a) => "newformatSome(" + a.show + ")"
      case None    => "newformatNone"
    }
  }
Every reference type, built-in or custom, already has #toString. #toString() and Show are simply separate worlds.
You can add myShowForOption to CustomShowImplicits, and it will work just like the List and Set instances, “shadowing” the default instance provided by cats.
The catch is just that #show() needs to be used to trigger the mechanism. That usually shouldn’t be a big deal in your own code, but it can be cumbersome when working with “frameworks” that rely on #toString() for rendering and cannot be configured otherwise - the REPL or logging, for example.
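For example, with an Option instance like the one above in scope (sketched here with a hand-rolled Show so the snippet is dependency-free), toString and .show give different results:

```scala
trait Show[A] { def show(a: A): String }

implicit class ShowOps[A](val a: A) {
  def show(implicit sa: Show[A]): String = sa.show(a)
}

implicit val intShow: Show[Int] =
  new Show[Int] { def show(i: Int): String = i.toString }

implicit def myShowForOption[A](implicit sa: Show[A]): Show[Option[A]] =
  new Show[Option[A]] {
    def show(opt: Option[A]): String = opt match {
      case Some(a) => "newformatSome(" + sa.show(a) + ")"
      case None    => "newformatNone"
    }
  }

val opt: Option[Int] = Some(1)
val viaToString = opt.toString  // "Some(1)" - toString never consults the Show instance
val viaShow     = opt.show      // "newformatSome(1)" - only explicit .show triggers it
```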
i) As a style question: would you still use == on items you know are safe? E.g. if you were defining an Eq for a custom class and were comparing strings, would you use == rather than ===?
ii) I’ve implemented an Eq for a custom class BigClass:
i) It depends, but mostly I’d try to stick to the Eq domain. So far I have no experience with using Eq consistently across a code base, though.
ii) Just providing an Eq instance for your custom class should work fine. Eq[List] should be pulled from catsKernelStdEqForList. If this doesn’t work, maybe provide a complete code example here - preferably with a rather SmallClass and without the leading >.
If I have a List of type A, with some A objects and some B objects (where class B extends A), when show is called over the list, only the show method from A's companion object is called (using cats Show typeclass). Is there any way, using the existing cats library, that if a subclass has a more specific (implicit) show method, then it will be called?
Well, the answer is that there’s “the tension between subtyping and ad-hoc polymorphism” - i.e. the two concepts don’t go together well.
Potential resolutions:
Make A and B children of a shared super type, provide Show instances for both of them (but not the super type). [EDIT: This obviously is not that useful, since you’d have to pattern match during traversal to get the proper instance for each element.]
Differentiate between A and B in A's Show instance via pattern matching. (Again, it might be nicer to make A and B children of a shared super type and provide the Show instance for the super type.)
Implement A's Show instance in terms of #toString() or a similar subtype polymorphic method. (But then obviously introducing Show won’t buy you much…)
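A sketch of the second option - a single instance on the super type that pattern matches on the subtype (class definitions are made up, and the Show trait is a hand-rolled stand-in for cats.Show to keep the snippet self-contained):

```scala
trait Show[T] { def show(t: T): String }

implicit class ShowOps[T](val t: T) {
  def show(implicit st: Show[T]): String = st.show(t)
}

class A(val n: Int)
class B(n: Int, val extra: String) extends A(n)

// one instance for the super type, differentiating subtypes by pattern match
implicit val showA: Show[A] = new Show[A] {
  def show(a: A): String = a match {
    case b: B => s"B(${b.n}, ${b.extra})"
    case _    => s"A(${a.n})"
  }
}

val xs: List[A] = List(new A(1), new B(2, "x"))
val shown = xs.map(_.show)  // B elements get the more specific rendering
```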
because it said somewhere these were the basic ones.
However, including the cats.implicits._ gives the following:
could not find implicit value for evidence parameter of type cats.Show[List[Int]]
But removing it makes everything work.
Something in the cats implicits must be taking priority and breaking your custom List and Set Show instances.
(I’ve also found that if you have two implicit defs for a class’s Show, it says show is not defined for that class, rather than reporting two conflicting definitions.)
extends syntax.AllSyntax with instances.AllInstances
in my example. The difference is that this way, the declarations in CustomShowImplicits have precedence over the Cats defaults. Importing cats.implicits._ will put the defaults on equal footing with the custom import and thus introduce clashes.
Implicit scoping rules are somewhat non-intuitive - see e.g. here and here for a discussion. This can make “overriding” existing implicits with custom declarations complicated. The usual remedies are selective imports (so the existing default implicits don’t enter scope at all) and/or “layering”, e.g. through inheritance (as in my example).
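A dependency-free illustration of the layering idea - DefaultInstances below plays the role of the Cats defaults, and the object extending it plays the role of CustomShowImplicits; when both instances are in scope, the one declared in the subclass wins during implicit resolution:

```scala
trait Show[A] { def show(a: A): String }

// the "defaults" layer (stand-in for instances.AllInstances)
trait DefaultInstances {
  implicit def defaultListShow[A]: Show[List[A]] =
    new Show[List[A]] { def show(xs: List[A]): String = xs.toString }
}

// custom layer: declarations here shadow the inherited defaults,
// because implicits defined in a derived class take precedence
object CustomShowImplicits extends DefaultInstances {
  implicit def customListShow[A]: Show[List[A]] =
    new Show[List[A]] { def show(xs: List[A]): String = xs.mkString("[", ",", "]") }
}

import CustomShowImplicits._

val viaLayering = implicitly[Show[List[Int]]].show(List(1, 2, 3))  // custom instance wins
```

This is the same trick the standard library uses (e.g. Predef extending LowPriorityImplicits).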
Thanks very much - I will take a look. In the meantime, could I just ask this one on a similar theme? Fixing this should allow the basic Show methods in my project to work:
This is the subtyping issue again. The type class concept simply doesn’t blend in well with specialization on subtypes. (Note that the concept originates in Haskell, where there is no subtyping.) If you really need to differentiate between subtypes, you’ll need to do it via pattern matching in the type class instance for the super type.
Just to illustrate selective imports vs layering: Instead of having your App extend AllInstances/AllSyntax, you could selectively import only the stuff you really require, bringing the default Show instance for Int in scope, but excluding the default instance for Set:
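A dependency-free mock-up of what that looks like - IntInstances and SetInstances below stand in for cats.instances.int and cats.instances.set; only the Int instance is imported, so the default Set instance never enters scope and cannot clash with the custom one:

```scala
trait Show[A] { def show(a: A): String }

implicit class ShowOps[A](val a: A) {
  def show(implicit sa: Show[A]): String = sa.show(a)
}

// stand-ins for cats.instances.int._ and cats.instances.set._
object IntInstances {
  implicit val defaultIntShow: Show[Int] =
    new Show[Int] { def show(i: Int): String = i.toString }
}
object SetInstances {
  implicit def defaultSetShow[A]: Show[Set[A]] =
    new Show[Set[A]] { def show(s: Set[A]): String = s.toString }
}

// selective import: the default Set instance is deliberately left out
import IntInstances._

implicit def customSetShow[A](implicit sa: Show[A]): Show[Set[A]] =
  new Show[Set[A]] { def show(s: Set[A]): String = s.map(sa.show).mkString("{", ",", "}") }

val selective = Set(1, 2, 3).show  // custom Set instance + default Int instance
```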
I’ve been making my package object extend syntax.AllSyntax with instances.AllInstances which I think may have meant I didn’t need to do many imports in each file.
I found the change mentioned above worked for show"…" but then broke the standard .show methods.
I’ve tried to get the below example compiling but to no avail…
Importing cats.implicits._ brings all Cats default instances (and all syntax extensions) into scope, including cats.instances.int._ and cats.syntax.show._ - and cats.instances.set._!
SortedSet(1,2,3).show doesn’t work because its type is SortedSet[Int], but you only have Show[Set[Int]] - try (SortedSet(1,2,3): Set[Int]).show.
The show interpolator works because it goes through Shown#mat(), which uses ContravariantShow. (Show extends ContravariantShow, but its generic parameter is invariant.)
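The variance difference can be reproduced without cats - the two traits below mimic the invariant cats.Show and the contravariant ContravariantShow the interpolator goes through:

```scala
import scala.collection.immutable.SortedSet

// invariant parameter, like cats.Show
trait Show[A] { def show(a: A): String }
// contravariant parameter, like cats.ContravariantShow (used by show"...")
trait ContravariantShow[-A] { def show(a: A): String }

implicit val setIntShow: Show[Set[Int]] =
  new Show[Set[Int]] { def show(s: Set[Int]): String = s.mkString("{", ",", "}") }
implicit val setIntCShow: ContravariantShow[Set[Int]] =
  new ContravariantShow[Set[Int]] { def show(s: Set[Int]): String = s.mkString("{", ",", "}") }

implicit class ShowOps[A](val a: A) {
  def show(implicit sa: Show[A]): String = sa.show(a)
}

def render[A](a: A)(implicit cs: ContravariantShow[A]): String = cs.show(a)

// SortedSet(1, 2, 3).show            // does not compile: Show is invariant
val upcast = (SortedSet(1, 2, 3): Set[Int]).show  // works after the upcast
val contra = render(SortedSet(1, 2, 3))           // works: contravariance accepts the subtype
```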
The whole type class concept is really different from OO style. Probably it’s best to approach it in a guided fashion. I’d recommend the Cats book - I haven’t worked through it myself, but it looks pretty decent at first glance.