William-GuoWei commented on issue #12367:
URL: 
https://github.com/apache/dolphinscheduler/issues/12367#issuecomment-1285031809

   Hi, I think there are a few possible solutions for you:
   1. The DS worker is not like Airflow: it supports installing the Kettle client directly on the worker, but not running a Kettle Docker image with a DS client inside it. So you need a Kettle Docker container alongside the worker server (the DS Worker can run on the same server as the DS Master).
   2. You can easily use an HTTP task in DS to trigger a Kettle job with an HTTP request.
   3. I am thinking about adding a Kettle task type to DS, which would make it easier to use Kettle in Docker to run ETL jobs and shut the container down after the ETL finishes.
   
   For now, if your ETL job is in the cloud, I think you may need to create three tasks for Kettle:
   1. A shell or Python task that starts a Docker container with the Kettle server
   2. An HTTP task or shell task that triggers the Kettle job with an HTTP request
   3. A shell or Python task that stops the Docker container with the Kettle server
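   The three tasks above might look roughly like this as shell commands. The image, container, host, repository and job names are all placeholders, and `cluster`/`cluster` are only Carte's default credentials -- a sketch under those assumptions, not a definitive setup:

```shell
#!/usr/bin/env sh
# Sketch of the three DS tasks as shell commands; names are placeholders.

KETTLE_IMAGE="my-registry/kettle-carte:latest"   # hypothetical image name
CONTAINER_NAME="kettle-etl"
CARTE_ENDPOINT="http://kettle-host:8080/kettle/executeJob/?rep=my_repo&job=my_etl_job"

# Task 1: shell task that starts the Kettle (Carte) container
start_kettle() {
  docker run -d --name "$CONTAINER_NAME" -p 8080:8080 "$KETTLE_IMAGE"
}

# Task 2: HTTP/shell task that triggers the Kettle job
# (cluster/cluster are Carte's defaults -- change them in production)
trigger_job() {
  curl -s -u cluster:cluster "$CARTE_ENDPOINT"
}

# Task 3: shell task that stops and removes the container
stop_kettle() {
  docker stop "$CONTAINER_NAME" && docker rm "$CONTAINER_NAME"
}

# Each DS shell task would run exactly one of the functions above.
```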
   
   I hope that helps.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
