[
https://issues.apache.org/jira/browse/SPARK-2090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15151802#comment-15151802
]
Lantao Jin edited comment on SPARK-2090 at 2/18/16 6:36 AM:
------------------------------------------------------------
Richard is right; this is a permissions problem on the home directory.
In many cases the user does not have full permissions on their home directory. For
example, we log in through an LDAP system, but creating a user home directory is
forbidden. This causes the Spark REPL to show nothing for the input text. The
root cause in the source code is below:
Scala loads "user.home" from the system properties to create a default history
file:
{code:title=org.apache.spark.repl.SparkJLineReader.scala|borderStyle=solid}
/** Changes the default history file to not collide with the scala repl's. */
private[repl] class SparkJLineHistory extends JLineFileHistory {
  import Properties.userHome

  def defaultFileName = ".spark_history"
  override protected lazy val historyFile = File(Path(userHome) / defaultFileName)
}
{code}
And "userHome" is defined in scala.util.PropertiesTrait
{code:title=scala.util.PropertiesTrait.scala|borderStyle=solid}
def userHome = propOrEmpty("user.home")
{code}
It loads from the library.properties built into scala-library.jar; in most cases
it uses the system default value. If that default directory is not writable by
the logged-in user, the ".spark_history" file cannot be created and spark-shell
cannot show any input text in the REPL.
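To confirm this failure mode, here is a quick diagnostic sketch (not part of Spark; the object name is made up for illustration, and it only reads standard JVM properties) that checks whether the directory "user.home" resolves to is actually writable:
{code:title=HomeDirCheck.scala (illustrative)|borderStyle=solid}
import java.io.File

// Checks whether the directory SparkJLineHistory would write .spark_history into
// is writable; if not, the REPL falls back to SimpleReader and input is not echoed.
object HomeDirCheck {
  def main(args: Array[String]): Unit = {
    val home = sys.props.getOrElse("user.home", "")   // same property that Properties.userHome reads
    val dir  = new File(home)
    val writable = home.nonEmpty && dir.isDirectory && dir.canWrite
    println(s"user.home = '$home', writable = $writable")
    if (!writable) {
      println("Creating " + new File(dir, ".spark_history") + " will fail with 'Permission denied'.")
    }
  }
}
{code}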
So I took two steps to resolve this problem (our company uses LDAP accounts to log
in to machines and forbids creating LDAP user home directories):
{panel}
1. Add export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=/tmp/$USER"
to spark-env.sh
2. Add the VM argument -Duser.home=/tmp/$USER to the launch scripts such as
spark-submit or spark-shell
{panel}
Then it works.
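Once spark-shell is relaunched with the override, a quick check from inside the REPL (a hedged example; /tmp/$USER is just the value chosen above) shows whether the history file location is now writable:
{code}
scala> sys.props("user.home")                              // should print the /tmp/$USER path passed via -Duser.home
scala> new java.io.File(sys.props("user.home")).canWrite   // should return true, so .spark_history can be created
{code}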
Next I will review the Spark trunk code to find a way to resolve this gracefully.
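One possible direction for a graceful fix (just a sketch of the idea, modeled on the snippet above, not necessarily what will land in trunk): fall back to a writable location such as java.io.tmpdir when the home directory cannot be written, instead of failing and dropping to SimpleReader:
{code:title=fallback sketch (illustrative)|borderStyle=solid}
/** Sketch: choose a writable base directory for .spark_history instead of failing outright. */
private[repl] class SparkJLineHistory extends JLineFileHistory {
  def defaultFileName = ".spark_history"

  // Prefer user.home, but fall back to java.io.tmpdir when the home directory
  // is missing or not writable (e.g. LDAP logins with no home directory).
  private def writableBase: String = {
    val home = sys.props.getOrElse("user.home", "")
    val homeDir = new java.io.File(home)
    if (home.nonEmpty && homeDir.isDirectory && homeDir.canWrite) home
    else sys.props.getOrElse("java.io.tmpdir", "/tmp")
  }

  override protected lazy val historyFile = File(Path(writableBase) / defaultFileName)
}
{code}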
> spark-shell input text entry not showing on REPL
> ------------------------------------------------
>
> Key: SPARK-2090
> URL: https://issues.apache.org/jira/browse/SPARK-2090
> Project: Spark
> Issue Type: Bug
> Components: Input/Output, Spark Core
> Affects Versions: 1.0.0
> Environment: Ubuntu 14.04; Using Scala version 2.10.4 (Java
> HotSpot(TM) 64-Bit Server VM, Java 1.7.0_60)
> Reporter: Richard Conway
> Priority: Critical
> Labels: easyfix, patch
> Fix For: 1.0.0
>
> Original Estimate: 4h
> Remaining Estimate: 4h
>
> spark-shell doesn't allow text to be displayed on input
> Failed to created SparkJLineReader: java.io.IOException: Permission denied
> Falling back to SimpleReader.
> The driver has 2 workers on 2 virtual machines and is error free apart from the
> above line, so I think it may have something to do with the introduction of
> the new SecurityManager.
> The upshot is that when you type, nothing is displayed on the screen. For
> example, type "test" at the scala prompt and you won't see the input, but the
> output will show.
> scala> <console>:11: error: package test is not a value
> test
> ^