[ 
https://issues.apache.org/jira/browse/SPARK-42787?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Max Rieger updated SPARK-42787:
-------------------------------
    Description: 
I tested this with 3.1.3 and 3.3.1 from 
[https://hub.docker.com/r/apache/spark-py/tags]

While the arrow keys work in pyspark, they do not in the Scala spark-shell.

This seems to be because the Scala REPL uses {{jline}} for input management.
 * creating a {{.inputrc}} file with mappings for the arrow keys did not work
 * building and running from {{dev/create-release/spark-rm/Dockerfile}} with jline installed as in the Dockerfile did work

This is likely not limited to the {{spark-py}} images.

I'd open a PR, but I'm unsure whether this is even the right Dockerfile to contribute 
to in order to fix the Docker Hub images...

{code:sh}
diff --git a/dev/create-release/spark-rm/Dockerfile b/dev/create-release/spark-rm/Dockerfile
--- dev/create-release/spark-rm/Dockerfile
+++ dev/create-release/spark-rm/Dockerfile
@@ -71,9 +71,9 @@
   $APT_INSTALL nodejs && \
  # Install needed python packages. Use pip for installing packages (for consistency).
   $APT_INSTALL python-is-python3 python3-pip python3-setuptools && \
   # qpdf is required for CRAN checks to pass.
-  $APT_INSTALL qpdf jq && \
+  $APT_INSTALL qpdf jq libjline-java && \
   pip3 install $PIP_PKGS && \
   # Install R packages and dependencies used when building.
   # R depends on pandoc*, libssl (which are installed above).
   # Note that PySpark doc generation also needs pandoc due to nbsphinx

{code}
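
Until the published images ship jline, a user-side workaround is to layer the package onto the image. A minimal, untested sketch (the image tag is an example; the package name and the UID of the unprivileged spark user are assumptions based on the official images):

{code:sh}
# Hypothetical workaround: extend the published spark-py image and install
# jline so spark-shell's Scala REPL gets working arrow-key line editing.
FROM apache/spark-py:v3.3.1
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends libjline-java && \
    rm -rf /var/lib/apt/lists/*
# drop back to the unprivileged spark user (UID 185 in the official images)
USER 185
{code}

Build with e.g. {{docker build -t spark-py-jline .}} and run {{spark-shell}} from the resulting image to verify line editing works.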


> in spark-py docker images, arrow keys do not work in (scala) spark-shell
> ------------------------------------------------------------------------
>
>                 Key: SPARK-42787
>                 URL: https://issues.apache.org/jira/browse/SPARK-42787
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>    Affects Versions: 3.1.3, 3.3.1
>         Environment: [https://hub.docker.com/r/apache/spark-py] 3.1.3 and 
> 3.3.1  in Docker on M1 MacBook pro OSX ventura
>            Reporter: Max Rieger
>            Priority: Minor
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)