How do I write a Spark/Scala script that will react to file changes?

Input
Here is my use case:

Process the CRM file to derive categories.

Use case
Each Telco's CRM is highly specific, so a configuration file should contain, for each relevant column, the values that map to categories. The CRM processor will be invoked with one of these configurations (a rough sketch of what I mean is in the first snippet below).
CRM files are prone to change on the Telco side without notification, so I also have to monitor unparseable fields in the CRM file to detect when the file format changes (second snippet below).
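
Roughly, this is how I picture the configuration-driven categorisation; the config shape, column names, values, and class names below are just placeholders I made up, not a real design:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical config shape: for each CRM column, which raw values map to which category.
case class ColumnRule(column: String, valueToCategory: Map[String, String])

object CrmCategoriser {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("crm-categoriser").getOrCreate()

    // In reality these rules would be loaded from the per-Telco configuration file.
    val rules = Seq(
      ColumnRule("plan_code", Map("P01" -> "prepaid", "C01" -> "contract")),
      ColumnRule("region_id", Map("7" -> "north", "9" -> "south"))
    )

    val crm = spark.read.option("header", "true").csv(args(0))

    // For each configured column, add a <column>_category column via a value lookup.
    val categorised = rules.foldLeft(crm) { (df, rule) =>
      val lookup = rule.valueToCategory.foldLeft(lit(null).cast("string")) {
        case (acc, (raw, cat)) => when(col(rule.column) === raw, lit(cat)).otherwise(acc)
      }
      df.withColumn(s"${rule.column}_category", lookup)
    }

    categorised.write.mode("overwrite").parquet(args(1))
    spark.stop()
  }
}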
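
And for the monitoring part, something along these lines, reusing the ColumnRule and categorised names from the sketch above (again, only a placeholder):

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._

// For every configured column, count rows whose value is not covered by any rule
// (value present but no category assigned). A jump in these counts would be the
// signal that the Telco changed the file format.
def unmappedCounts(categorised: DataFrame, rules: Seq[ColumnRule]): Map[String, Long] =
  rules.map { rule =>
    val unmapped = categorised
      .filter(col(rule.column).isNotNull && col(s"${rule.column}_category").isNull)
      .count()
    rule.column -> unmapped
  }.toMap

// These counts would then be written to a metrics table or pushed to alerting.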

I am trying to understand this script:

import java.nio.file._
import scala.collection.JavaConversions._
import scala.sys.process._

// args(0): directory to watch (WatchService registration only works on directories)
// args(1): shell command to (re)run whenever something in that directory changes
val file = Paths.get(args(0))
val cmd = args(1)
val watcher = FileSystems.getDefault.newWatchService

// Register for create/modify/delete events on the watched directory.
file.register(
  watcher,
  StandardWatchEventKinds.ENTRY_CREATE,
  StandardWatchEventKinds.ENTRY_MODIFY,
  StandardWatchEventKinds.ENTRY_DELETE
)

// Start the command asynchronously; `run true` connects the child's stdin to ours.
def exec = cmd run true

@scala.annotation.tailrec
def watch(proc: Process): Unit = {
  // Block until the next batch of file-system events arrives.
  val key = watcher.take
  val events = key.pollEvents

  // If anything changed, kill the running command and start a fresh one.
  val newProc =
    if (!events.isEmpty) {
      proc.destroy()
      exec
    } else proc

  // reset() re-arms the key; if it fails, the directory can no longer be watched.
  if (key.reset) watch(newProc)
  else println("aborted")
}

watch(exec)
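
Separately, would Spark Structured Streaming's file source be a better fit than a hand-rolled watcher for reacting to new CRM files? A minimal sketch of what I have in mind (schema and paths are placeholders):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types._

val spark = SparkSession.builder.appName("crm-stream").getOrCreate()

// Placeholder schema; the real one would come from the per-Telco configuration.
val crmSchema = new StructType()
  .add("msisdn", StringType)
  .add("plan_code", StringType)
  .add("region_id", StringType)

// The file source picks up every new CSV dropped into the watched directory.
val crm = spark.readStream
  .schema(crmSchema)
  .option("header", "true")
  .csv("/data/crm/incoming")                        // placeholder path

val query = crm.writeStream
  .format("parquet")
  .option("path", "/data/crm/categorised")          // placeholder path
  .option("checkpointLocation", "/data/crm/_chk")   // placeholder path
  .start()

query.awaitTermination()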

Please advise on how to design a solution for this use case.