yantzu commented on issue #406:
URL: https://github.com/apache/incubator-livy/issues/406#issuecomment-1565789060

   Since you set SPARK_HOME, find_spark_home.py is never executed, which is why it works. With Spark 3 it is best to use Python 3.
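   For reference, a minimal livy-env.sh sketch. The SPARK_HOME and SPARK_CONF_DIR values come from the quoted message below; the PYSPARK_PYTHON line is an assumption (it presumes a `python3` binary is on the PATH) added because Spark 3 no longer supports Python 2:

   ```shell
   # livy-env.sh — sketch based on the paths quoted below
   export SPARK_HOME=/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7
   export SPARK_CONF_DIR=/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7/conf

   # Assumption: point PySpark at a Python 3 interpreter, since Spark 3
   # dropped Python 2 support; adjust the path to your environment.
   export PYSPARK_PYTHON=python3
   ```

   With SPARK_HOME exported here, Livy passes it straight to the Spark launcher, so PySpark's find_spark_home.py fallback (which may pick up the system Python 2) is never invoked.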
   
   cocdkl ***@***.***> wrote on Monday, May 22, 2023 at 08:51:
   
   > Thank you for taking the time to answer my question.
   > I set the following in livy-env.sh:
   > SPARK_HOME=/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7
   > SPARK_CONF_DIR=/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7/conf
   >
   > Running python -V reports version 2.7.16.
   >
   > I tried running the pyspark command directly through Spark, and it works fine.
   >
   > I also modified the /pyspark.zip/pyspark/find_spark_home.py script so that
   > its return value is hard-coded to my local SPARK_HOME directory, but other errors still occur.
   >

