[ https://issues.apache.org/jira/browse/SPARK-26685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin resolved SPARK-26685.
------------------------------------
    Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 23611
[https://github.com/apache/spark/pull/23611]

> Building Spark Images with latest Docker does not honour spark_uid build argument
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-26685
>                 URL: https://issues.apache.org/jira/browse/SPARK-26685
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>            Reporter: Rob Vesse
>            Assignee: Rob Vesse
>            Priority: Major
>             Fix For: 3.0.0
>
> Latest Docker releases are stricter in their interpretation of the scope of
> build arguments, meaning the location of the {{ARG spark_uid}} declaration
> puts it out of scope by the time the variable is consumed, resulting in the
> Python and R images still running as {{root}} regardless of what the user may
> have specified as the desired UID.
> e.g. Images built with {{-u 456}} provided to {{bin/docker-image-tool.sh}}:
> {noformat}
> > docker run -it --entrypoint /bin/bash rvesse/spark-py:uid456
> bash-4.4# whoami
> root
> bash-4.4# id -u
> 0
> bash-4.4# exit
> > docker run -it --entrypoint /bin/bash rvesse/spark:uid456
> bash-4.4$ id -u
> 456
> bash-4.4$ exit
> {noformat}
> Note that for the Python image the build argument was out of scope and
> ignored. For the base image the {{ARG}} declaration is in an in-scope
> location and so is honoured correctly.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
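
The scoping rule behind this issue can be sketched with a hypothetical minimal Dockerfile (not the actual Spark Dockerfiles or the change in the linked pull request): an {{ARG}} declared before the first {{FROM}} is only in scope for {{FROM}} lines themselves, and must be re-declared inside the build stage to be visible to later instructions.

{noformat}
# Hypothetical illustration of Docker's ARG scoping, assuming a generic
# base image; values and names are examples, not Spark's real Dockerfiles.
ARG spark_uid=185

FROM alpine

# Without this re-declaration inside the stage, ${spark_uid} below is out of
# scope and expands to nothing, so the image can end up running as root,
# which is the symptom reported in this issue.
ARG spark_uid

USER ${spark_uid}
{noformat}

Re-declaring {{ARG spark_uid}} after {{FROM}} pulls the value (or default) from the outer scope back into the stage, so a UID passed via {{--build-arg}} is honoured.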