zionrubin edited a comment on pull request #55:
URL: https://github.com/apache/incubator-liminal/pull/55#issuecomment-877330859


   This PR adds support for running Spark tasks with the k8s executor by default.
   A Spark task can be run by two types of executors: EMR and k8s.
   To support different executors for a single task, I had to change the way we register tasks to the DAG.
   The old flow in ```register_dags```:
   ```
   for each task:
       call task.apply_task_to_dag(..., executor)
           which calls executor.apply_task_to_dag(task, ...)
   ```
   The new flow:
   ```
   for each task:
       call executor.apply_task_to_dag(..., task)
           which calls task.apply_task_to_dag(...) plus any executor-specific tasks
   ```
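
   To make the change concrete, here is a minimal, self-contained sketch of the new flow. The names (```Task```, ```K8sExecutor```) are simplified stand-ins for illustration, not the actual Liminal classes:
   ```python
   # Illustrative sketch of the control inversion -- simplified stand-ins,
   # not the real Liminal code.

   class Task:
       def __init__(self, name):
           self.name = name

       def apply_task_to_dag(self, dag):
           # In Liminal this would create the task's Airflow operator(s);
           # here a plain list stands in for the DAG.
           dag.append(self.name)


   class K8sExecutor:
       # New flow: the executor drives registration, so it can add its own
       # executor-specific steps around the operators the task creates.
       def apply_task_to_dag(self, dag, task):
           dag.append(f"{task.name}.k8s_setup")
           task.apply_task_to_dag(dag)
           dag.append(f"{task.name}.k8s_teardown")


   def register_dags(dag, tasks, executor):
       for task in tasks:
           # Old flow: task.apply_task_to_dag(dag, executor)
           # New flow: the executor is asked to apply the task.
           executor.apply_task_to_dag(dag, task)
   ```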
   
   In addition, I have added a ```default``` executor for bookkeeping tasks such as ```job_start``` and ```job_end```, as shown in the sketch below.
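
   For such tasks the default executor can simply delegate to the task itself; roughly (again an illustrative sketch in the same spirit, not the actual implementation):
   ```python
   class DefaultExecutor:
       # Bookkeeping tasks such as job_start / job_end need no
       # executor-specific steps, so just delegate to the task.
       def apply_task_to_dag(self, dag, task):
           task.apply_task_to_dag(dag)
   ```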
   
   Added unit tests for the new logic
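
   A test in that spirit, written against the sketch classes above rather than the real Liminal test suite, might look like:
   ```python
   def test_k8s_executor_wraps_task():
       dag = []
       register_dags(dag, [Task("spark_app")], K8sExecutor())
       assert dag == ["spark_app.k8s_setup", "spark_app", "spark_app.k8s_teardown"]


   def test_default_executor_delegates():
       dag = []
       register_dags(dag, [Task("job_start")], DefaultExecutor())
       assert dag == ["job_start"]
   ```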

