Maxim Belousov created ZEPPELIN-3326:
----------------------------------------

             Summary: 0.8.0 - Spark doesn't start in local mode
                 Key: ZEPPELIN-3326
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-3326
             Project: Zeppelin
          Issue Type: Bug
          Components: Interpreters
    Affects Versions: 0.8.0
            Reporter: Maxim Belousov


All interpreter parameters are at their defaults.
Toggling zeppelin.spark.useNew has no effect on startup.

{code}
 INFO [2018-03-13 21:48:32,679] ({main} RemoteInterpreterServer.java[main]:260) - URL:file:/opt/zeppelin/product/zeppelin-interpreter/target/classes/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServer.class
 INFO [2018-03-13 21:48:32,745] ({main} RemoteInterpreterServer.java[<init>]:161) - Launching ThriftServer at 10.6.3.7:45808
 INFO [2018-03-13 21:48:32,755] ({main} RemoteInterpreterServer.java[<init>]:165) - Starting remote interpreter server on port 45808
 INFO [2018-03-13 21:48:32,757] ({Thread-0} RemoteInterpreterServer.java[run]:202) - Starting remote interpreter server on port 45808
 INFO [2018-03-13 21:48:32,765] ({Thread-1} RemoteInterpreterUtils.java[registerInterpreter]:165) - callbackHost: 10.6.3.7, callbackPort: 45695, callbackInfo: CallbackInfo(host:10.6.3.7, port:45808)
 INFO [2018-03-13 21:48:32,911] ({pool-1-thread-1} RemoteInterpreterServer.java[createInterpreter]:310) - Instantiate interpreter org.apache.zeppelin.spark.SparkInterpreter
 INFO [2018-03-13 21:48:32,915] ({pool-1-thread-1} RemoteInterpreterServer.java[createInterpreter]:310) - Instantiate interpreter org.apache.zeppelin.spark.SparkSqlInterpreter
 INFO [2018-03-13 21:48:32,923] ({pool-1-thread-1} RemoteInterpreterServer.java[createInterpreter]:310) - Instantiate interpreter org.apache.zeppelin.spark.DepInterpreter
 INFO [2018-03-13 21:48:32,938] ({pool-1-thread-1} RemoteInterpreterServer.java[createInterpreter]:310) - Instantiate interpreter org.apache.zeppelin.spark.PySparkInterpreter
 INFO [2018-03-13 21:48:32,943] ({pool-1-thread-1} RemoteInterpreterServer.java[createInterpreter]:310) - Instantiate interpreter org.apache.zeppelin.spark.IPySparkInterpreter
 INFO [2018-03-13 21:48:32,946] ({pool-1-thread-1} RemoteInterpreterServer.java[createInterpreter]:310) - Instantiate interpreter org.apache.zeppelin.spark.SparkRInterpreter
 INFO [2018-03-13 21:48:33,014] ({pool-2-thread-2} SchedulerFactory.java[jobStarted]:109) - Job 20180313-214506_1608528313 started by scheduler interpreter_1164251230
 INFO [2018-03-13 21:48:33,053] ({pool-2-thread-2} PySparkInterpreter.java[createPythonScript]:116) - File /tmp/zeppelin_pyspark-4096704272054277993.py created
 INFO [2018-03-13 21:48:33,073] ({pool-2-thread-2} PySparkInterpreter.java[createGatewayServerAndStartScript]:269) - pythonExec: python
 INFO [2018-03-13 21:48:33,076] ({pool-2-thread-2} NewSparkInterpreter.java[open]:89) - Using Scala Version: 2.11
 INFO [2018-03-13 21:48:40,344] ({pool-2-thread-2} Logging.scala[logInfo]:54) - Running Spark version 2.0.0
 WARN [2018-03-13 21:48:40,686] ({pool-2-thread-2} NativeCodeLoader.java[<clinit>]:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 INFO [2018-03-13 21:48:40,887] ({pool-2-thread-2} Logging.scala[logInfo]:54) - Changing view acls to: zeppelin
 INFO [2018-03-13 21:48:40,887] ({pool-2-thread-2} Logging.scala[logInfo]:54) - Changing modify acls to: zeppelin
 INFO [2018-03-13 21:48:40,888] ({pool-2-thread-2} Logging.scala[logInfo]:54) - Changing view acls groups to: 
 INFO [2018-03-13 21:48:40,888] ({pool-2-thread-2} Logging.scala[logInfo]:54) - Changing modify acls groups to: 
 INFO [2018-03-13 21:48:40,889] ({pool-2-thread-2} Logging.scala[logInfo]:54) - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(zeppelin); groups with view permissions: Set(); users  with modify permissions: Set(zeppelin); groups with modify permissions: Set()
 WARN [2018-03-13 21:48:41,320] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,325] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,328] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,332] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,336] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,340] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,343] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,345] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,348] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,351] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,353] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,356] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,358] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,361] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,363] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
 WARN [2018-03-13 21:48:41,366] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Service 'sparkDriver' could not bind on port 0. Attempting port 1.
ERROR [2018-03-13 21:48:41,373] ({pool-2-thread-2} Logging.scala[logError]:91) - Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:463)
        at sun.nio.ch.Net.bind(Net.java:455)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:748)
 INFO [2018-03-13 21:48:41,380] ({pool-2-thread-2} Logging.scala[logInfo]:54) - Successfully stopped SparkContext
ERROR [2018-03-13 21:48:41,440] ({pool-2-thread-2} NewSparkInterpreter.java[open]:130) - Fail to open SparkInterpreter
ERROR [2018-03-13 21:48:41,440] ({pool-2-thread-2} PySparkInterpreter.java[open]:203) - Error
org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
        at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:131)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:61)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.spark.PySparkInterpreter.getSparkInterpreter(PySparkInterpreter.java:665)
        at org.apache.zeppelin.spark.PySparkInterpreter.createGatewayServerAndStartScript(PySparkInterpreter.java:273)
        at org.apache.zeppelin.spark.PySparkInterpreter.open(PySparkInterpreter.java:201)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:617)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:186)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:473)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:204)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:126)
        at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:82)
        at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:108)
        ... 16 more
Caused by: java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:463)
        at sun.nio.ch.Net.bind(Net.java:455)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        ... 1 more
ERROR [2018-03-13 21:48:41,441] ({pool-2-thread-2} Job.java[run]:188) - Job failed
org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
        at org.apache.zeppelin.spark.PySparkInterpreter.open(PySparkInterpreter.java:204)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:617)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:186)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:473)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
        at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:131)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:61)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.spark.PySparkInterpreter.getSparkInterpreter(PySparkInterpreter.java:665)
        at org.apache.zeppelin.spark.PySparkInterpreter.createGatewayServerAndStartScript(PySparkInterpreter.java:273)
        at org.apache.zeppelin.spark.PySparkInterpreter.open(PySparkInterpreter.java:201)
        ... 11 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:204)
        at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:126)
        at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:82)
        at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:108)
        ... 16 more
Caused by: java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:463)
        at sun.nio.ch.Net.bind(Net.java:455)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        ... 1 more
 INFO [2018-03-13 21:48:41,450] ({pool-2-thread-2} SchedulerFactory.java[jobFinished]:115) - Job 20180313-214506_1608528313 finished by scheduler interpreter_1164251230

{code}
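
Note on the failure mode: the driver is not exhausting ports. Spark picks a random port when spark.driver.port is 0, and the log shows "could not bind on port 0 ... attempting port 1" sixteen times before "Cannot assign requested address", which points at the address Spark resolved for the driver host rather than at port conflicts. As a hedged workaround sketch (it does not fix whatever regressed in 0.8.0, and the value below is an assumption for this machine), pinning the driver's local IP in conf/zeppelin-env.sh before restarting the interpreter may let local mode start:

{code}
# conf/zeppelin-env.sh -- workaround sketch, not a fix for the regression.
# SPARK_LOCAL_IP is a standard Spark env var; the address is an assumption:
# use the host's reachable IP (10.6.3.7 in the log above) or 127.0.0.1
# for a purely local driver.
export SPARK_LOCAL_IP=127.0.0.1
{code}

If the hostname in /etc/hosts resolves to an address that no local interface carries, that would produce exactly this BindException regardless of the port chosen.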




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
