I’d like to run several hundred sbt processes on a compute cluster. I notice that when sbt first starts up, it updates some files in my home directory, and only then compiles the Scala code. If I run multiple sbt processes, even on different hosts, that all share my home directory, is this safe? Is sbt smart enough to use some sort of cooperative locking for this update, or do I need to force every sbt invocation to use a different resource directory?
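To make the "different resource directory" part concrete, here is a hypothetical sketch of what I imagine: each job points sbt's global, boot, and Ivy directories at a per-job location via the standard `sbt.global.base`, `sbt.boot.directory`, and `sbt.ivy.home` JVM properties. The `SLURM_JOB_ID` variable and the `~/sbt-isolated` layout are placeholders for whatever my scheduler provides.

```shell
# Placeholder job ID from the scheduler (assumption: SLURM-like environment).
JOB_ID="${SLURM_JOB_ID:-local}"
SBT_BASE="$HOME/sbt-isolated/$JOB_ID"
mkdir -p "$SBT_BASE/global" "$SBT_BASE/boot" "$SBT_BASE/ivy"

# sbt.global.base, sbt.boot.directory, and sbt.ivy.home are standard sbt
# JVM properties; the guard makes this sketch a no-op where sbt is absent.
if command -v sbt >/dev/null 2>&1; then
  sbt -Dsbt.global.base="$SBT_BASE/global" \
      -Dsbt.boot.directory="$SBT_BASE/boot" \
      -Dsbt.ivy.home="$SBT_BASE/ivy" \
      "run dimacsParse"
fi
```

But doing this for hundreds of jobs duplicates the boot and Ivy caches hundreds of times, which is exactly what I'd like to avoid if sbt already locks these directories safely.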
What about the compilation? What happens if two different sbt processes try to recompile the same files? Is it my responsibility to prevent this by ensuring that nothing needs to be recompiled before the jobs start? If so, can I tell sbt to exit with an error if it decides it needs to recompile something?
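The workaround I'm considering, sketched below, is to compile exactly once on a single host and have every cluster job only run the already-built code, so no job ever triggers a recompile. The jar path and the use of the sbt-assembly plugin are assumptions; they would need to be adapted to my build.

```shell
# One-time build on a single host (assumption: the project uses sbt-assembly
# to produce a self-contained jar). Guarded so the sketch is a no-op
# where sbt is not installed.
if command -v sbt >/dev/null 2>&1; then
  sbt assembly
fi

# Each cluster job would then bypass sbt entirely and run plain java.
# The jar path below is a placeholder, not my actual build output.
APP_JAR="target/scala-2.12/cl-robdd-scala-assembly.jar"
echo "each job would run: java -cp $APP_JAR <main-class> dimacsParse"
```

This avoids the question entirely, but it forces an extra packaging step, so I'd still like to know whether concurrent sbt compiles over a shared home directory are actually unsafe.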
[johan:cl-robdd/src/cl-robdd-scala] jnewton% sbt run dimacsParse
[info] Loading settings for project global-plugins from idea.sbt,plugins.sbt ...
[info] Loading global plugins from /Users/jnewton/.sbt/1.0/plugins
[info] Updating ProjectRef(uri("file:/Users/jnewton/.sbt/1.0/plugins/"), "global-plugins")...
[info] Done updating.
[info] Loading project definition from /Users/jnewton/sw/regular-type-expression/cl-robdd/src/cl-robdd-scala/project
[info] Updating ProjectRef(uri("file:/Users/jnewton/sw/regular-type-expression/cl-robdd/src/cl-robdd-scala/project/"), "cl-robdd-scala-build")...
[info] Done updating.
[info] Loading settings for project cl-robdd-scala from build.sbt ...
[info] Set current project to cl-robdd-scala (in build file:/Users/jnewton/sw/regular-type-expression/cl-robdd/src/cl-robdd-scala/)