Run all tests in IntelliJ

Does anyone understand what the menu item labeled “Run 'ScalaTests in ‘scala…’” is intended to do? In my case it runs some of the tests, but not all of them.

It runs the tests defined in some files of that directory, but not the tests defined in others. Nevertheless, I can run the missing tests in those files directly.

What does IntelliJ consider to be a valid ScalaTest suite? Is this just another mysterious thing about the IntelliJ Scala plugin?


BTW, IntelliJ displays the different files with different leading icons, and I don’t understand this distinction either.


It runs some of the test suites, skipping the others. I don’t see a correlation between the various icons and the tests it thinks it needs to run.


Are all your test classes extending ScalaTest test suites, e.g. org.scalatest.FlatSpec? If not, then they won’t be run by the “run all ScalaTest tests” action.
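
For reference, a minimal sketch of a suite the “run all” action should pick up – FlatSpec is just one of several ScalaTest styles, the point being that the class extends a ScalaTest suite trait:

import org.scalatest.FlatSpec

// Extends a ScalaTest suite trait, so the test runner can discover it
class ExampleSpec extends FlatSpec {
  "The runner" should "discover this suite" in {
    assert(1 + 1 == 2)
  }
}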

The icons in IntelliJ work more or less as follows (there’s a small example after the list):

  • blue circle with C - Xxx.scala contains class Xxx
  • yellow circle with O - Xxx.scala contains object Xxx
  • half blue, half yellow circle - Xxx.scala contains both class Xxx and object Xxx
  • gray sheet - Xxx.scala contains some top level classes or objects not named Xxx
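
For example, a hypothetical Thing.scala containing both a class and its companion object would get the half-blue, half-yellow icon:

// Thing.scala – class Thing plus companion object Thing in one file
class Thing(val n: Int)

object Thing {
  def apply(n: Int): Thing = new Thing(n)
}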

All of the test suite files import org.scalatest._, and they all define a class which extends FunSuite.

I figured out what the problem is/was. In some of the test suite files I declare a package, and in some I don’t. When I removed the package declarations, all the tests run.

BTW, I noticed that in some of the files I declare the test class like this, and in other places without the @RunWith... annotation. I got this from one of the Coursera examples. Is it better to use this annotation or not?


import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

@RunWith(classOf[JUnitRunner])
class DimacsSuite extends FunSuite {
  test("sample test") {
    assert(1 == 1, "sample assertion")
  }
}

Yeah, you can’t just define packages casually in files – they have to match the directory structure. (This may be a ScalaTest-specific restriction; I don’t recall offhand.)

I’d recommend without – I’m surprised the Coursera course includes it. That changes the Runner, which I believe can fundamentally alter the behavior of ScalaTest. I’ve never used that annotation, and I wouldn’t use it without close study of what the heck it does – it’s extremely unusual…
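
Without the annotation, the suite from the snippet above is just a plain ScalaTest class, and ScalaTest’s own runner handles it directly:

import org.scalatest.FunSuite

// No @RunWith – discovered and run by ScalaTest itself
class DimacsSuite extends FunSuite {
  test("sample test") {
    assert(1 == 1, "sample assertion")
  }
}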

The idea of a package needing to follow the directory structure is pretty weird to me, though apparently it is a common Scala programming convention. I often use packages to control the visibility of code. Making changes in directory structure simply because of changes in package names seems weird. Admittedly that’s motivated by how packages are used in Lisp; granted, the culture is different in Scala.

I’d strongly recommend consistently using proper packages everywhere and never relying on the default package. Some reasons for using packages are listed here, and there are more: you cannot refer to classes in the default package from classes inside a proper package, some tooling expects you to use proper packages,…
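
To illustrate the first point (hypothetical file and class names):

// Util.scala – no package clause, so Util lands in the default package
class Util

// app/Main.scala
package app

class Main {
  val u = new Util // does not compile: classes in the default package
                   // are not visible from inside a named package
}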


As mentioned earlier, using a package declaration in the test suite files causes the tests to be invisible to IntelliJ.

This is not a Scala convention, but a Java/JVM mechanism that is relevant at the runtime/bytecode level, too.

It’s just like having to rename a source file because of a change in class name – and just as in that case, the IDE supports you in keeping the file system and source code declarations in sync.

I also don’t think it’s that weird or unique to Java/Scala – it’s similar in Haskell (where you usually have to maintain your module list inside a cabal file in addition), and it’s a common convention, though not enforced, in Ruby, IIRC.


As mentioned by @jducoeur, package declarations need to match the file system structure. 🙂 If it doesn’t work for you, you’ll have to share more details. I’m not aware of any problems with tests in packages – to the contrary, I wouldn’t have been terribly surprised if the test runner refused to acknowledge tests in the default package.


When I declare packages in the test suite files, and refactor my directory structure to match the packages, it seems IntelliJ can indeed find all the tests.

However, the IntelliJ auto-refactoring removes some necessary imports which I had to add back to make the tests compile properly.

This also necessitates refactoring some tests which use resource data, as the resource data is no longer at the same relative path from the test suite’s class file.

  test("read dimacs benchmark files") {
    val base = this.getClass().getResource(".").toString.drop(5) // skip "file:" 5 characters
    val rel = "../../../../../../data"
    // bench marks
    List("aim-100-1_6-no-1.cnf",
         "aim-50-1_6-yes1-4.cnf",
         "bf0432-007.cnf",
         "dtba-sat.cnf",
         "dubois20.cnf",
         "dubois21.cnf",
         "dubois22.cnf",
         "hole6.cnf",
         "par8-1-c.cnf",
         "quinn.cnf",
         "sat-33ZzxW.cnf",

         "sat-dMt1DH.cnf", // too big
         "simple_v3_c2.cnf",
         "zebra_v155_c1135.cnf"
         ).foreach { fname =>

      dimacsConvertFile(fname,
                        fname => base + rel + "/" + fname,
                        fname => s"/tmp/reduced-$fname")
    }

This approach is brittle, overly complicated and error-prone anyway – test resources should not be accessed relative to class files, and classpath resources are not necessarily present on the file system; they might reside inside jars, on some URI across the network, or wherever…

I’d recommend using classpath resources all the way: put your files e.g. in $PROJECT_ROOT/src/test/resources/testfiles, then do

fileNames.map(fn => getClass.getResourceAsStream(s"/testfiles/$fn"))

and pass the resulting InputStreams to your code (optionally wrapping them into scala.io.Source or similar).
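
For instance, assuming fileNames is a List[String]:

import scala.io.Source

// One Source per classpath resource; note that getResourceAsStream
// returns null for a missing resource, so real code should check that
val sources = fileNames.map { fn =>
  Source.fromInputStream(getClass.getResourceAsStream(s"/testfiles/$fn"))
}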

Alternatively, put them in $PROJECT_ROOT/testfiles and access them e.g. via

fileNames.map(fn => Source.fromFile(s"testfiles/$fn"))

The current working directory that is used for resolving relative file paths for test runs can be configured in IDE launch configurations and is $PROJECT_ROOT by default.

(Of course you should make sure that resources (streams/sources/whatever) are properly closed after use…)
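
On Scala 2.13+ one way to do that is scala.util.Using, which closes the resource even if the body throws – a sketch, using the file layout from above:

import scala.io.Source
import scala.util.Using

// Using closes the Source when the block exits, normally or not,
// and returns a Try holding either the result or the failure
def readTestFile(fname: String): List[String] =
  Using(Source.fromFile(s"testfiles/$fname")) { source =>
    source.getLines().toList
  }.get // .get rethrows any failure, which is fine in a test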


It sounds really unlikely to me that the JVM has knowledge about the directory structure and whether it coincides with package names. Are you sure about that?

I doubt it is the JVM making this assumption; rather, IntelliJ’s code to find all the tests probably makes the brittle assumption about this annoying redundancy between file system directory names and Scala package names.

I haven’t quite wrapped my head around it yet.

AFAIK, packages are resolved into directories by the ClassLoader. This applies to class files by necessity and, for Java, to source files by convention. Scala allows you to organize source files differently, but class files need to be where the package name says they are.
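
For example, scalac doesn’t care where this source file sits in the tree, but the class file it emits has to end up under com/example/util/ (a hypothetical example):

// src/scratch/Whatever.scala – the source location is arbitrary...
package com.example.util

class Helper

// ...but `scalac -d target src/scratch/Whatever.scala` writes
// target/com/example/util/Helper.class, which is exactly where
// the class loader will look for it.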


At least for Java, the compiler won’t find source files on the sourcepath whose directory path differs from the one indicated by their package – it uses the package name to locate them.

Following Java’s lead, the convention of directory <-> package equivalence has also been adopted by the vast majority of other JVM-based development tooling; it is possible to buck this convention in Scala, but it’s typically not worth the trouble.

The JVM as such probably doesn’t care - what I meant is that this requirement applies to most/many JVM-based languages. As @curoli already wrote, it’s the class loaders that are concerned with this aspect, and the default class loader(s) will expect declared package name and file system location (or location in jar,…) to be in sync.

$> javac -sourcepath src -d target src/de/sangamon/packpath/PackagePath.java 
$> java -classpath target de.sangamon.packpath.PackagePath
Hello
$> mv target/de target/xy
$> java -classpath target de.sangamon.packpath.PackagePath
Error: Could not find or load main class de.sangamon.packpath.PackagePath
$> java -classpath target xy.sangamon.packpath.PackagePath
Error: Could not find or load main class xy.sangamon.packpath.PackagePath

You’re going to need to get used to this one – while there are edge cases, packages and directories line up 99.9% of the time in the Scala world, and a lot of tooling assumes that structure. It is quite rare to violate that (I don’t recall ever seeing an exception in real production code), and you’re just going to get into trouble trying to avoid it…


Packages are represented as folders, both before packaging into JARs and after. JARs are ZIP archives, and ZIP archives can contain (nested) directories and files under those directories. Resources from package a.b.c and class files from package a.b.c will end up in the same directory a/b/c in the JAR file.

What the JVM does not support is a package hierarchy, i.e. the package-private modifier in Java doesn’t make a particular thing visible in subpackages, as the JVM treats different packages as independent things, no matter whether and how they are nested.

@sangamon, one reason I’m using a relative path is that the test case files are not in the project. The project is source controlled several directories above the top level of the Scala project, and the test data lives outside the project so that other (non-Scala) applications can be sure to access the same data. However, I really do like your idea of referencing the top-level directory of the project via Source.fromFile rather than the directory the class happens to be defined in, which, as I’ve recently learned, moves around in the file system every time the package name changes.

BTW, what is Source in your example code?

fileNames.map(fn => Source.fromFile(s"testfiles/$fn"))