MonkeyCanCode commented on code in PR #896:
URL: https://github.com/apache/polaris/pull/896#discussion_r1932170609


##########
regtests/Dockerfile:
##########
@@ -17,14 +17,14 @@
 # under the License.
 #
 
-FROM docker.io/apache/spark:3.5.4-python3
+FROM docker.io/apache/spark:3.5.4-java17-python3
 ARG POLARIS_HOST=polaris
 ENV POLARIS_HOST=$POLARIS_HOST
 ENV SPARK_HOME=/opt/spark
 
 USER root
 RUN apt update
-RUN apt-get install -y diffutils wget curl python3.8-venv
+RUN apt-get install -y diffutils wget curl python3.10-venv

Review Comment:
   It will. Locally I am using Python 3.13. However, if we want to use the 
official Spark image with a different version of Python, we will need to 
compile Python from source. In my previous PR reworking the test cases to 
pytest (paused for now; I will pick it up again soon), I used a Python base 
image and built our own Spark image on top of it. In that case I am not 
locked to whatever Python the Spark image ships, and there is no need to 
compile from source: setting up Spark is just installing another piece of 
software. Both approaches work.
   
   It really comes down to whether we want to use the official Spark image 
and avoid compiling software; if so, we are stuck with the specific Python 
version it ships. (For example, CentOS 7, which is also EOL, defaults to 
Python 2, with `python3` referring to 3.8, though a different Python 3 can be 
set up there via another repo or compiled from source.) In this case, the 
JDK 11 base image used by Spark defaults to Python 3.8, and the JDK 17 image 
defaults to Python 3.10.
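   The alternative described above (a Python base image with Spark installed 
on top) could be sketched roughly as below. The image tag, Spark download 
URL, and version pins are illustrative assumptions, not necessarily what the 
paused pytest PR uses:

   ```dockerfile
   # Hypothetical sketch: pin Python via the base image, then install Spark
   # as ordinary software instead of inheriting whatever Python the official
   # Spark image bundles.
   FROM docker.io/library/python:3.13-slim

   # Versions are illustrative; pick whatever the regtests need.
   ENV SPARK_VERSION=3.5.4
   ENV SPARK_HOME=/opt/spark

   RUN apt-get update && \
       apt-get install -y --no-install-recommends \
           openjdk-17-jre-headless wget && \
       wget -q "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3.tgz" && \
       tar -xzf "spark-${SPARK_VERSION}-bin-hadoop3.tgz" -C /opt && \
       mv "/opt/spark-${SPARK_VERSION}-bin-hadoop3" "${SPARK_HOME}" && \
       rm "spark-${SPARK_VERSION}-bin-hadoop3.tgz"
   ```

   With this layout, bumping Python means changing only the `FROM` tag; the 
Spark tarball is independent of the interpreter version.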



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@polaris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
