[jira] [Commented] (SPARK-42787) in spark-py docker images, arrowkeys do not work in (scala) spark-shell

2023-03-29 Thread Yikun Jiang (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42787?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17706653#comment-17706653
 ] 

Yikun Jiang commented on SPARK-42787:
-

You could also paste the exact script you executed here, to help us 
understand and do a regression test.

BTW, there are two Dockerfiles:
1. dev/create-release/spark-rm/Dockerfile is mainly for the k8s e2e test and the docker 
image (in the future it will only be used for the k8s e2e test).
2. https://github.com/apache/spark-docker is the future docker image repo; images 
will be published there once https://github.com/docker-library/official-images/pull/13089 
is ready.

So, feel free to submit a PR adding libjline-java to 
https://github.com/apache/spark-docker .
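
In case it helps, here is a minimal sketch of what such a change might look like in a spark-docker Dockerfile. The file path and the surrounding apt invocation are assumptions, not the actual layout of the apache/spark-docker repo:

{code:sh}
# Hypothetical fragment for a Dockerfile in apache/spark-docker
# (e.g. an ubuntu-based image); path and neighbouring commands are guesses.
RUN set -ex && \
    apt-get update && \
    # libjline-java provides the readline-style line editing the scala
    # REPL needs for arrow-key navigation and history
    apt-get install -y --no-install-recommends libjline-java && \
    rm -rf /var/lib/apt/lists/*
{code}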

> in spark-py docker images, arrowkeys do not work in (scala) spark-shell
> ---
>
> Key: SPARK-42787
> URL: https://issues.apache.org/jira/browse/SPARK-42787
> Project: Spark
>  Issue Type: Improvement
>  Components: Deploy
>Affects Versions: 3.1.3, 3.3.1
> Environment: [https://hub.docker.com/r/apache/spark-py] 3.1.3 and 
> 3.3.1  in Docker on M1 MacBook pro OSX ventura
>Reporter: Max Rieger
>Priority: Minor
>
> i tested this for 3.1.3 and 3.3.1 from 
> [https://hub.docker.com/r/apache/spark-py/tags]
> while it works for pyspark, it does not for the scala spark-shell.
> it seems this is due to the scala REPL using {{jline}} for input management.
>  * creating a {{.inputrc}} file with mappings for the arrow keys didn't work
>  * finally, building and running from 
> {{dev/create-release/spark-rm/Dockerfile}} with jline installed via the 
> Dockerfile change below, things worked.
> this is likely not limited to the {{spark-py}} images.
> i'd do a PR, but am unsure whether this is even the right Dockerfile to 
> contribute to in order to fix the docker hub images...
> {code:sh}
> diff --git a/dev/create-release/spark-rm/Dockerfile 
> b/dev/create-release/spark-rm/Dockerfile
> --- dev/create-release/spark-rm/Dockerfile
> +++ dev/create-release/spark-rm/Dockerfile
> @@ -71,9 +71,9 @@
>$APT_INSTALL nodejs && \
># Install needed python packages. Use pip for installing packages (for 
> consistency).
>$APT_INSTALL python-is-python3 python3-pip python3-setuptools && \
># qpdf is required for CRAN checks to pass.
> -  $APT_INSTALL qpdf jq && \
> +  $APT_INSTALL qpdf jq libjline-java && \
>pip3 install $PIP_PKGS && \
># Install R packages and dependencies used when building.
># R depends on pandoc*, libssl (which are installed above).
># Note that PySpark doc generation also needs pandoc due to nbsphinx
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-42787) in spark-py docker images, arrowkeys do not work in (scala) spark-shell

2023-03-29 Thread Max Rieger (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42787?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17706426#comment-17706426
 ] 

Max Rieger commented on SPARK-42787:


[~yikunkero], sure

{code:sh}
$ docker run -it apache/spark-py:v3.1.3 bash
D$ export PATH=~/opt/spark/bin:$PATH
D$ spark-shell
D-S$ 
{code}
arrow keys will not work in the resulting scala shell, whereas
{code:sh}
$ docker run -it apache/spark-py:v3.1.3 bash
D$ export PATH=~/opt/spark/bin:$PATH
D$ pyspark
D-PS$ 
{code}
they will work in pyspark.

for adding the spark binaries to the PATH, which i do here beforehand, i already 
opened another ticket: https://issues.apache.org/jira/browse/SPARK-42788
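
One way to narrow this down (assuming the published images are Debian/Ubuntu based; the jar location below is a guess, not confirmed against the actual image) is to check inside the container whether any jline jar is present at all:

{code:sh}
$ docker run -it apache/spark-py:v3.1.3 bash
# is the Debian package installed?
D$ dpkg -l | grep -i jline
# does Spark ship its own jline jar? (path is an assumption)
D$ ls /opt/spark/jars | grep -i jline
{code}

If neither turns anything up, that would be consistent with the libjline-java fix in the diff from the issue description.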




[jira] [Commented] (SPARK-42787) in spark-py docker images, arrowkeys do not work in (scala) spark-shell

2023-03-19 Thread Yikun Jiang (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42787?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17702434#comment-17702434
 ] 

Yikun Jiang commented on SPARK-42787:
-

> while it works for pyspark, it does not for the scala spark-shell.

Could you give an example showing what command you executed in Docker, 
and what error you encountered?

Looks like the required `libjline-java` package is missing in some cases?




[jira] [Commented] (SPARK-42787) in spark-py docker images, arrowkeys do not work in (scala) spark-shell

2023-03-19 Thread Hyukjin Kwon (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42787?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17702393#comment-17702393
 ] 

Hyukjin Kwon commented on SPARK-42787:
--

cc [~yikun] FYI




[jira] [Commented] (SPARK-42787) in spark-py docker images, arrowkeys do not work in (scala) spark-shell

2023-03-14 Thread Jira


[ 
https://issues.apache.org/jira/browse/SPARK-42787?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17700322#comment-17700322
 ] 

Bjørn Jørgensen commented on SPARK-42787:
-

Have a look at https://github.com/apache/spark-docker 
