Reamer commented on pull request #4097:
URL: https://github.com/apache/zeppelin/pull/4097#issuecomment-824591596


   > Regarding your experience, would building a conda env into the docker image work for users?
   
   It works, but it's not as flexible as your approach with YARN. At the moment it's just an image with a Conda environment containing a fixed set of Python libraries and a fixed Python version.
   Updating the image with a different Python version or different libraries is not really feasible, because you don't know which other notebooks might break afterwards.
   
   The same problem exists in PySpark, so I have been thinking for quite a long time about how we could enable similar functionality there.
   I found this blog post, which is very interesting:
   
https://databricks.com/de/blog/2020/12/22/how-to-manage-python-dependencies-in-pyspark.html


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

