AnishMahto commented on code in PR #52254:
URL: https://github.com/apache/spark/pull/52254#discussion_r2376441639


##########
bin/spark-pipelines:
##########
@@ -30,4 +30,11 @@ fi
 export PYTHONPATH="${SPARK_HOME}/python/:$PYTHONPATH"

Review Comment:
   Correct, yeah — to clarify terms: by "pyspark" I mean the standalone PyPI pyspark
package installation, and by "spark" the full tarball.
   
   > So a route that's consistent with that could be to copy cli.py
   
   Hmm, yeah, that sounds like a viable option. I'm not sure there's a
standardized practice here, fwiw — I took inspiration from how
`find-spark-home.sh` locates `python/pyspark/find_spark_home.py`. But copying
the file to `${SPARK_HOME}/python/pyspark/` during packaging would be nice: it
gives us a consistent location, and we wouldn't even need to spin up a Python
process to find it.
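
   For concreteness, here's a rough sketch of what the launcher's resolution logic could look like under that scheme. The `pipelines/cli.py` subpath and the packaging-time copy are assumptions for illustration, not what the PR currently does:

   ```shell
   #!/bin/sh
   # Sketch only: resolve the pipelines CLI script. The fixed path under
   # SPARK_HOME (tarball case) is assumed to be populated at packaging time.
   # For demonstration, fake a SPARK_HOME layout with the file in place.
   SPARK_HOME="$(mktemp -d)"
   mkdir -p "${SPARK_HOME}/python/pyspark/pipelines"
   touch "${SPARK_HOME}/python/pyspark/pipelines/cli.py"

   if [ -f "${SPARK_HOME}/python/pyspark/pipelines/cli.py" ]; then
     # Tarball case: cli.py was copied here during packaging, so no Python
     # process is needed to locate it.
     PIPELINES_CLI="${SPARK_HOME}/python/pyspark/pipelines/cli.py"
   else
     # PyPI case: fall back to asking Python where the installed pyspark
     # package lives (same spirit as find-spark-home.sh).
     PIPELINES_CLI="$("${PYSPARK_PYTHON:-python3}" -c \
       'import os, pyspark; print(os.path.join(os.path.dirname(pyspark.__file__), "pipelines", "cli.py"))')"
   fi
   echo "$PIPELINES_CLI"
   ```

   The upside of the first branch is exactly what's noted above: a consistent, known location, and the common case never forks a Python interpreter just to find the script.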



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

