All of the test suite files import org.scalatest._, and they all define a class that extends FunSuite.
I figured out what the problem is/was. In some of the test suite files I declare a package, and in some I don’t. When I removed the package declarations, all the tests run.
BTW, I noticed that in some of the files I declare the test class like this, and in other places without the @RunWith... annotation. I got this from one of the Coursera examples. Is it better to use this annotation or not?
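For anyone following along, here is a minimal sketch of the two styles side by side (the class and test names are made up; the annotated form is the one the old Coursera handouts used with older ScalaTest versions, where org.scalatest.junit.JUnitRunner runs the suite through JUnit):

```scala
import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

// Coursera-style: run this ScalaTest suite via the JUnit runner.
@RunWith(classOf[JUnitRunner])
class AnnotatedSuite extends FunSuite {
  test("one plus one") {
    assert(1 + 1 === 2)
  }
}

// Plain ScalaTest style, no annotation: ScalaTest's own runner discovers it.
class PlainSuite extends FunSuite {
  test("one plus one") {
    assert(1 + 1 === 2)
  }
}
```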
Yeah, you can’t just define packages casually in files – they have to match the directory structure. (This may be a ScalaTest-specific restriction; I don’t recall offhand.)
I’d recommend without – I’m surprised the Coursera course includes it. That changes the Runner, which I believe can fundamentally alter the behavior of ScalaTest. I’ve never used that annotation, and I wouldn’t use it without close study of what the heck it does – it’s extremely unusual…
The idea of packages needing to follow directory structure is pretty weird to me. Apparently it is common among Scala programming conventions. I often use packages to control visibility of code. Making changes in directory structure simply because of changes in package names is weird. Admittedly that’s motivated by how packages are used in Lisp; granted, the culture is different in Scala.
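For what it’s worth, Scala does support that visibility-control use of packages quite directly, via package-qualified private. A small sketch (package and member names are made up):

```scala
package com.example.app {

  package internal {
    object Secrets {
      // Visible anywhere under com.example.app, but not outside it.
      private[app] def apiKey: String = "not-really-secret"
    }
  }

  object Main {
    def main(args: Array[String]): Unit = {
      // Allowed: Main lives inside com.example.app.
      println(internal.Secrets.apiKey)
    }
  }
}
```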
I’d strongly recommend consistently using proper packages everywhere and never relying on the default package. Some reasons for using packages are listed here, and there are more: you cannot refer to classes in the default package from classes inside a proper package, some tooling expects you to use proper packages, …
This is not a Scala convention, but a Java/JVM mechanism that is relevant at runtime/byte code level, too.
It’s just like having to rename a source file because of a change in class name – and, just as in that case, the IDE supports you in keeping the file system and the source code declarations in sync.
I also don’t think it’s that weird or unique to Java/Scala - it’s similar in Haskell (where you usually have to maintain your module list inside a cabal file in addition) and a common convention, though not enforced, in Ruby, IIRC.
As mentioned by @jducoeur, package declarations need to match the file system structure. If it doesn’t work for you, you’ll have to share more details. I’m not aware of any problems with tests in packages - to the contrary, I wouldn’t have been terribly surprised if the test runner refused to acknowledge tests in the default package.
This approach is brittle, overly complicated and error-prone, anyway - test resources should not be accessed relative to class files, and classpath resources are not necessarily present on the file system, they might reside inside jars, on some URI across the network or wherever…
I’d recommend using classpath resources all the way: put your files e.g. in $PROJECT_ROOT/src/test/resources/testfiles, then do
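something along these lines – a sketch assuming a file named input.txt under src/test/resources/testfiles and Scala 2.12+, where Source.fromResource was added:

```scala
import scala.io.Source

// Looked up on the classpath, so it works the same whether the resource
// sits in target/, inside a jar, or anywhere else the class loader can see.
val lines: List[String] =
  Source.fromResource("testfiles/input.txt").getLines().toList
```

On older Scala versions, Source.fromInputStream(getClass.getResourceAsStream("/testfiles/input.txt")) achieves the same.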
I doubt it is the JVM making this assumption; rather, IntelliJ’s code for finding all the tests probably makes the brittle assumption about this annoying redundancy between file system directory names and Scala package names.
AFAIK, packages are resolved into directories by the ClassLoader. This applies to class files by necessity and, for Java, to source files by convention. Scala allows you to organize source files differently, but class files need to be where the package name says they are.
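You can observe that mapping directly from code: a class’s binary name, with dots replaced by slashes, is exactly the path the class loader looks up. A quick sketch, using a class from the standard library:

```scala
// The class loader locates a class file at a path derived from its package:
// dots in the binary name become directory separators.
val cls = classOf[scala.math.BigInt]
val resourcePath = cls.getName.replace('.', '/') + ".class"
println(resourcePath) // scala/math/BigInt.class

// The same string works as a classpath resource lookup
// (here it should resolve into scala-library.jar).
println(cls.getClassLoader.getResource(resourcePath))
```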
At least for Java, you cannot compile source files that have a directory path different than that indicated by their package.
Since then, the convention of directory <-> package equivalence has also been followed by the vast majority of other JVM-based development tooling; it is possible to buck this convention in Scala, but it’s typically not worth the trouble.
The JVM as such probably doesn’t care - what I meant is that this requirement applies to most/many JVM-based languages. As @curoli already wrote, it’s the class loaders that are concerned with this aspect, and the default class loader(s) will expect declared package name and file system location (or location in jar,…) to be in sync.
$> javac -sourcepath src -d target src/de/sangamon/packpath/PackagePath.java
$> java -classpath target de.sangamon.packpath.PackagePath
$> mv target/de target/xy
$> java -classpath target de.sangamon.packpath.PackagePath
Error: Could not find or load main class de.sangamon.packpath.PackagePath
$> java -classpath target xy.sangamon.packpath.PackagePath
Error: Could not find or load main class xy.sangamon.packpath.PackagePath
You’re going to need to get used to this one – while there are edge cases, packages and directories line up 99.9% of the time in the Scala world, and a lot of tooling assumes that structure. It is quite rare to violate that (I don’t recall ever seeing an exception in real production code), and you’re just going to get into trouble trying to avoid it…
Packages are represented as folders, both before packaging into JARs and after. JARs are ZIP archives, and ZIP archives can contain (nested) directories and files under those directories. Resources from package a.b.c and class files from package a.b.c will end up in the same directory a/b/c in the JAR file.
What the JVM does not support is package hierarchy: the package-private modifier in Java doesn’t make a particular thing visible in subpackages, as the JVM treats different packages as independent, no matter whether and how they are nested.
@sangamon, one reason I’m using a relative path is because the test case files are not in the project. The project is being source controlled several directories above the top level of the Scala project. The test data is outside the project so that other (non-Scala) applications can be sure to access the same data. However, I really do like your idea of referencing files relative to the top of the project with Source.fromFile rather than the directory the class happens to be defined in, which, as I’ve recently learned, moves around in the file system every time the package name changes.
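In that situation, a sketch of the project-root-relative approach might look like this (the ../.. depth and file names are made up; the point is to anchor on the working directory – which sbt sets to the project root when running tests – rather than on the class’s location):

```scala
import java.nio.file.Paths
import scala.io.Source

// Resolve the shared test data relative to the working directory
// (the project root under sbt), not relative to the class file's package path.
val dataDir = Paths.get("..", "..", "shared-testdata").normalize()
val source  = Source.fromFile(dataDir.resolve("input.txt").toFile)
try source.getLines().foreach(println)
finally source.close()
```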