Dynamically created thread execution in Scala 2

I have a use case to implement in Scala 2. When the application starts, I have a list of tasks to execute, 3 at a time. However, at the end of each executed task/Future, I need to rebuild the list of tasks and feed my thread pool with it again.

import java.util.concurrent.Executors
import scala.concurrent.{ExecutionContext, ExecutionContextExecutorService, Future}

implicit val executionContext: ExecutionContextExecutorService =
  ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(3))

var tasks: List[Int] = getCurrentTasks()

val filterFutures: List[Future[Boolean]] =
  for ((task, idx) <- tasks.zipWithIndex)
    yield Future {
      // Based on task (an integer ID), some long-running job is executed here
      tasks = getCurrentTasks() // update the list of tasks
      true
    }

def getCurrentTasks(): List[Int] = {
  // dynamically fetched from the DB
}

It can be done very easily with akka-stream / zio-stream / fs2.
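For instance, here is a minimal fs2 sketch (assuming fs2 3.x on cats-effect 3; getCurrentTasks and runTask are hypothetical stand-ins for the DB fetch and the long-running job):

    import cats.effect.{IO, IOApp}
    import fs2.Stream

    object TaskRunner extends IOApp.Simple {

      def getCurrentTasks: IO[List[Int]] = IO(List(1, 2, 3)) // stand-in for the DB query
      def runTask(id: Int): IO[Boolean]  = IO(true)          // stand-in for the long-running job

      def run: IO[Unit] =
        Stream.repeatEval(getCurrentTasks)     // keep re-fetching the task list
          .flatMap(ids => Stream.emits(ids))   // flatten each batch into individual task IDs
          .parEvalMap(3)(runTask)              // run at most 3 tasks concurrently
          .compile
          .drain
    }

In a real application you would probably add a delay or a stop condition between rounds, so an empty task list does not poll in a tight loop.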

As others have said, it's easier to use a third-party library to implement this.

In case you just want to study, here is some code from the "Parallel Programming" course to introduce ForkJoinPool:

  import java.util.concurrent._
  import scala.util.DynamicVariable

  object Test {

    // Shared pool with 3 worker threads, matching the desired parallelism.
    val forkJoinPool = new ForkJoinPool(3)

    abstract class TaskScheduler {
      def schedule[T](body: => T): ForkJoinTask[T]

      // Schedules taskB on the pool while taskA runs on the calling thread,
      // then waits for both results.
      def parallel[A, B](taskA: => A, taskB: => B): (A, B) = {
        val right = task {
          taskB
        }
        val left = taskA
        (left, right.join())
      }
    }

    class DefaultTaskScheduler extends TaskScheduler {
      def schedule[T](body: => T): ForkJoinTask[T] = {
        val t = new RecursiveTask[T] {
          def compute = body
        }
        Thread.currentThread match {
          case _: ForkJoinWorkerThread =>
            // Already inside the pool: fork onto the current worker's queue.
            t.fork()
          case _ =>
            // Called from outside the pool: submit the task to the pool.
            forkJoinPool.execute(t)
            t
        }
      }
    }

    val scheduler =
      new DynamicVariable[TaskScheduler](new DefaultTaskScheduler())

    def task[T](body: => T): ForkJoinTask[T] =
      scheduler.value.schedule(body)

    def parallel[A, B](taskA: => A, taskB: => B): (A, B) =
      scheduler.value.parallel(taskA, taskB)
  }
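
For example, here is a minimal usage sketch (the Demo object and the sums are just illustrative): the second argument to parallel is forked onto the 3-thread pool while the first runs on the calling thread, and parallel returns once both are done.

    import Test._

    object Demo extends App {
      val (left, right) = parallel(
        (1 to 1000).map(_ * 2).sum,   // runs on the calling thread
        (1 to 1000).map(_ * 3).sum    // forked onto the ForkJoinPool
      )
      println(s"left=$left, right=$right")
    }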