[ https://issues.apache.org/jira/browse/SPARK-5493?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14574359#comment-14574359 ]
Kaveen Raajan commented on SPARK-5493:
--------------------------------------

I'm using *SPARK-1.3.1* on a *Windows machine* whose username contains a space (current username: "kaveen raajan"). I tried to run the following command:
{code}
spark-shell --master yarn-client --proxy-user SYSTEM
{code}
This runs successfully for a username without a space (the application also runs as the SYSTEM user), but when I run it under the spaced username (kaveen raajan) it throws the following error:
{code}
15/06/05 16:52:48 INFO spark.SecurityManager: Changing view acls to: SYSTEM
15/06/05 16:52:48 INFO spark.SecurityManager: Changing modify acls to: SYSTEM
15/06/05 16:52:48 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(SYSTEM); users with modify permissions: Set(SYSTEM)
15/06/05 16:52:49 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/06/05 16:52:49 INFO Remoting: Starting remoting
15/06/05 16:52:49 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@synclapn3408.CONTOSO:52137]
15/06/05 16:52:49 INFO util.Utils: Successfully started service 'sparkDriver' on port 52137.
15/06/05 16:52:49 INFO spark.SparkEnv: Registering MapOutputTracker
15/06/05 16:52:49 INFO spark.SparkEnv: Registering BlockManagerMaster
15/06/05 16:52:49 INFO storage.DiskBlockManager: Created local directory at C:\Users\KAVEEN~1\AppData\Local\Temp\spark-d5b43891-274c-457d-aa3a-d79a536fd536\blockmgr-e980101b-4f93-455a-8a05-9185dcab9f8e
15/06/05 16:52:49 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
15/06/05 16:52:49 INFO spark.HttpFileServer: HTTP File server directory is C:\Users\KAVEEN~1\AppData\Local\Temp\spark-a35e3f17-641c-4ae3-90f2-51eac901b799\httpd-ecea93ad-c285-4c62-9222-01a9d6ff24e4
15/06/05 16:52:49 INFO spark.HttpServer: Starting HTTP Server
15/06/05 16:52:49 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/06/05 16:52:49 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:52138
15/06/05 16:52:49 INFO util.Utils: Successfully started service 'HTTP file server' on port 52138.
15/06/05 16:52:49 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/06/05 16:52:49 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/06/05 16:52:49 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/06/05 16:52:49 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/06/05 16:52:49 INFO ui.SparkUI: Started SparkUI at http://synclapn3408.CONTOSO:4040
15/06/05 16:52:49 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
java.lang.NullPointerException
        at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:145)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:49)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1027)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:150)
        at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:148)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:148)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

scala>
{code}
Please let me know: are any changes needed for a machine whose username contains a space?
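For illustration only, one behaviour consistent with this report (a hypothesis, not a confirmed root cause): an option string built from a home-directory path whose username contains a space falls apart under naive whitespace tokenization, which is why such paths must be quoted when they end up on a command line. A small Python sketch of the tokenization effect (this is not Spark's actual launcher code):

```python
# Hypothesis sketch only -- not Spark's actual launcher code.
# A JVM option built from a home directory whose username contains a space
# splits into two tokens if the command line is tokenized on whitespace.
path = r"C:\Users\kaveen raajan\AppData\Local\Temp"
opt = "-Djava.io.tmpdir=" + path

tokens = opt.split(" ")
print(len(tokens))  # 2 -- the space inside "kaveen raajan" breaks the option in two

# Quoting the value keeps it a single argument on the command line.
quoted = '"-Djava.io.tmpdir=' + path + '"'
print(quoted)
```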
> Support proxy users under kerberos
> ----------------------------------
>
>                 Key: SPARK-5493
>                 URL: https://issues.apache.org/jira/browse/SPARK-5493
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Brock Noland
>            Assignee: Marcelo Vanzin
>             Fix For: 1.3.0
>
>
> When using kerberos, services may want to use spark-submit to submit jobs as
> a separate user. For example a service like hive might want to submit jobs as
> a client user.

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
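The flow the issue describes (a kerberized service such as hive submitting on behalf of a client user) maps onto spark-submit's `--proxy-user` flag. A minimal sketch of that flow; the principal, keytab path, client username, class, and jar are all placeholders, none of them taken from this issue:

```shell
# Sketch of the proxy-user flow (all names below are placeholders).
# 1. The service authenticates as itself against the KDC with its own keytab.
kinit -kt /etc/security/keytabs/hive.keytab hive/gateway.example.com@EXAMPLE.COM

# 2. It then submits the job on behalf of the client user "alice";
#    HDFS and YARN see the job as alice's rather than the service's.
spark-submit \
  --master yarn-client \
  --proxy-user alice \
  --class com.example.MyApp \
  my-app.jar
```

For the impersonation to be permitted, Hadoop itself must whitelist the proxying principal, e.g. via `hadoop.proxyuser.hive.hosts` and `hadoop.proxyuser.hive.groups` in core-site.xml.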