potiuk commented on issue #48622:
URL: https://github.com/apache/airflow/issues/48622#issuecomment-2769816257

   I do not know JPype well, but from the description it looks like JPype starts its own JVM when it is initialized, and that happens when you import jpype (i.e. in the DagFileProcessor), not when you run tasks. That is why the output from that JVM is sent to the Dag File Processor / Scheduler logs.
   
   There is not much you can do if you still want to run that single JVM and communicate with it. JPype probably has some mechanism to communicate back to the launching client (read the docs), so maybe you can send the information in a different way that gets logged by the client that launched it.
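   For example, one way to make the JVM's output available to the launching process is to redirect `java.lang.System.out` / `err` through JPype. This is only a sketch, assuming JPype's standard `JClass` API and the stock `java.io` classes; the log path is a placeholder, and the JVM must already be started before calling it:

   ```python
   # Sketch: redirect the embedded JVM's stdout/stderr to a file that the
   # launching Python process (e.g. a task) can read and log itself.
   # Assumes jpype.JClass and the standard java.io classes; the path below
   # is a placeholder for your environment.

   def redirect_jvm_output(log_path="/tmp/jvm-output.log"):
       import jpype  # deferred so this module can be imported without a JVM

       # java.lang.System.setOut/setErr replace the JVM-wide output streams.
       System = jpype.JClass("java.lang.System")
       PrintStream = jpype.JClass("java.io.PrintStream")
       FileOutputStream = jpype.JClass("java.io.FileOutputStream")
       stream = PrintStream(FileOutputStream(log_path, True))  # True = append
       System.setOut(stream)
       System.setErr(stream)
   ```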
   
   You can also move the JPype import to be local inside the task rather than at the top level. Assuming JPype is not imported elsewhere, this means the JVM will be started separately for each task that runs it. That is likely more overhead and does not reuse a single process across multiple tasks, but maybe that is what you really need.
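   If you go the local-import route, a minimal sketch could look like the following. It assumes the standard `jpype.startJVM` / `jpype.isJVMStarted` API; the jar path is a placeholder, and in a real DAG the function would be wrapped with Airflow's `@task` decorator:

   ```python
   # Sketch: defer the jpype import (and thus JVM startup) into the task
   # callable. The jar path is a placeholder; wrap with @task in a real DAG.

   def run_java_job():
       # Importing jpype here, not at module top level, means the JVM is
       # started inside the task's worker process, so its stdout/stderr
       # land in the task log instead of the DagFileProcessor/Scheduler log.
       import jpype

       if not jpype.isJVMStarted():
           jpype.startJVM(classpath=["/path/to/your.jar"])  # placeholder path
       # ... call into Java here via jpype.JClass(...)
   ```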
   
   Generally, I think you should look deeply into how the process and communication model works for JPype; you likely need to understand it better.
   
   Converting this to a discussion if more is needed.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
