KamalGalrani commented on pull request #167:
URL: https://github.com/apache/incubator-livy/pull/167#issuecomment-698732347


   I was able to set up Livy using the Helm chart, but when I create a session, 
it fails. I am using the default configuration with Minikube.
   
   Create-session payload:
   ```
   {
       "kind": "pyspark",
       "name": "test-session1234",
       "conf": {
         "spark.kubernetes.namespace": "livy"
       }
   }
   ```
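   For reference, this is submitted as a POST to Livy's `/sessions` REST endpoint. A minimal sketch, assuming Livy is reachable at `localhost:8998` (the host/port are placeholders; adjust them for your Helm deployment):
   ```python
   import json

   # Hypothetical endpoint; adjust host/port to match your deployment.
   LIVY_SESSIONS_URL = "http://localhost:8998/sessions"

   # Same payload as above.
   payload = {
       "kind": "pyspark",
       "name": "test-session1234",
       "conf": {"spark.kubernetes.namespace": "livy"},
   }

   body = json.dumps(payload)
   print(body)

   # To actually submit (requires the `requests` package and a reachable Livy):
   # requests.post(LIVY_SESSIONS_URL, data=body,
   #               headers={"Content-Type": "application/json"})
   ```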
   ```
   20/09/25 04:00:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: Operation: [create]  for kind: [Pod]  with name: [null]  in namespace: [livy]  failed.
        at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:64)
        at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:72)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.create(BaseOperation.java:337)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.create(BaseOperation.java:330)
        at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$2.apply(KubernetesClientApplication.scala:141)
        at org.apache.spark.deploy.k8s.submit.Client$$anonfun$run$2.apply(KubernetesClientApplication.scala:140)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2543)
        at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:140)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication$$anonfun$run$5.apply(KubernetesClientApplication.scala:250)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication$$anonfun$run$5.apply(KubernetesClientApplication.scala:241)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2543)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:241)
        at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:204)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method)
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
        at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
        at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
        at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
        at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:894)
        at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:865)
        at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
        at okio.Okio$1.write(Okio.java:79)
        at okio.AsyncTimeout$1.write(AsyncTimeout.java:180)
        at okio.RealBufferedSink.flush(RealBufferedSink.java:224)
        at okhttp3.internal.http2.Http2Writer.settings(Http2Writer.java:203)
        at okhttp3.internal.http2.Http2Connection.start(Http2Connection.java:515)
        at okhttp3.internal.http2.Http2Connection.start(Http2Connection.java:505)
        at okhttp3.internal.connection.RealConnection.startHttp2(RealConnection.java:298)
        at okhttp3.internal.connection.RealConnection.establishProtocol(RealConnection.java:287)
        at okhttp3.internal.connection.RealConnection.connect(RealConnection.java:168)
        at okhttp3.internal.connection.StreamAllocation.findConnection(StreamAllocation.java:257)
        at okhttp3.internal.connection.StreamAllocation.findHealthyConnection(StreamAllocation.java:135)
        at okhttp3.internal.connection.StreamAllocation.newStream(StreamAllocation.java:114)
        at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:42)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
        at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:93)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
        at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:126)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
        at io.fabric8.kubernetes.client.utils.BackwardsCompatibilityInterceptor.intercept(BackwardsCompatibilityInterceptor.java:119)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
        at io.fabric8.kubernetes.client.utils.ImpersonatorInterceptor.intercept(ImpersonatorInterceptor.java:68)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
        at io.fabric8.kubernetes.client.utils.HttpClientUtils.lambda$createHttpClient$3(HttpClientUtils.java:110)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
        at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:254)
        at okhttp3.RealCall.execute(RealCall.java:92)
        at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:411)
        at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:372)
        at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleCreate(OperationSupport.java:241)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.handleCreate(BaseOperation.java:819)
        at io.fabric8.kubernetes.client.dsl.base.BaseOperation.create(BaseOperation.java:334)
        ... 17 more
   20/09/25 04:01:00 INFO ShutdownHookManager: Shutdown hook called
   20/09/25 04:01:00 INFO ShutdownHookManager: Deleting directory /tmp/spark-343d41df-d58c-4ed4-8a03-2eabbc21da1d
   
   Kubernetes Diagnostics: 
   Operation: [list]  for kind: [Pod]  with name: [null]  in namespace: [null]  failed.
   ```


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
