[
https://issues.apache.org/jira/browse/SPARK-27927?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16854474#comment-16854474
]
Edwin Biemond commented on SPARK-27927:
---------------------------------------
On the same hanging driver pod, when I do a manual spark-submit it works fine:
{noformat}
bash-4.2# /opt/spark/bin/spark-submit --deploy-mode client --class
org.apache.spark.deploy.PythonRunner /pyspark_min.py
[2019-06-03 09:22:04,121] [oehpcs-ui] [WARN] []
[org.apache.hadoop.util.NativeCodeLoader] ({main}
NativeCodeLoader.java[<clinit>]:62) - 'Unable to load native-hadoop library for
your platform... using builtin-java classes where applicable'
[2019-06-03 09:22:04,970] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkContext]
({Thread-5} Logging.scala[logInfo]:54) - 'Running Spark version 2.4.3'
[2019-06-03 09:22:04,997] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkContext]
({Thread-5} Logging.scala[logInfo]:54) - 'Submitted application: hello_world'
[2019-06-03 09:22:05,062] [oehpcs-ui] [INFO] []
[org.apache.spark.SecurityManager] ({Thread-5} Logging.scala[logInfo]:54) -
'Changing view acls to: root'
[2019-06-03 09:22:05,063] [oehpcs-ui] [INFO] []
[org.apache.spark.SecurityManager] ({Thread-5} Logging.scala[logInfo]:54) -
'Changing modify acls to: root'
[2019-06-03 09:22:05,063] [oehpcs-ui] [INFO] []
[org.apache.spark.SecurityManager] ({Thread-5} Logging.scala[logInfo]:54) -
'Changing view acls groups to: '
[2019-06-03 09:22:05,064] [oehpcs-ui] [INFO] []
[org.apache.spark.SecurityManager] ({Thread-5} Logging.scala[logInfo]:54) -
'Changing modify acls groups to: '
[2019-06-03 09:22:05,064] [oehpcs-ui] [INFO] []
[org.apache.spark.SecurityManager] ({Thread-5} Logging.scala[logInfo]:54) -
'SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(root); groups with view permissions: Set(); users with modify
permissions: Set(root); groups with modify permissions: Set()'
[2019-06-03 09:22:05,419] [oehpcs-ui] [INFO] [] [org.apache.spark.util.Utils]
({Thread-5} Logging.scala[logInfo]:54) - 'Successfully started service
'sparkDriver' on port 44002.'
[2019-06-03 09:22:05,448] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkEnv]
({Thread-5} Logging.scala[logInfo]:54) - 'Registering MapOutputTracker'
[2019-06-03 09:22:05,469] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkEnv]
({Thread-5} Logging.scala[logInfo]:54) - 'Registering BlockManagerMaster'
[2019-06-03 09:22:05,473] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManagerMasterEndpoint] ({Thread-5}
Logging.scala[logInfo]:54) - 'Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology information'
[2019-06-03 09:22:05,474] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManagerMasterEndpoint] ({Thread-5}
Logging.scala[logInfo]:54) - 'BlockManagerMasterEndpoint up'
[2019-06-03 09:22:05,485] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.DiskBlockManager] ({Thread-5}
Logging.scala[logInfo]:54) - 'Created local directory at
/var/data/spark-cd530621-c059-4268-8f1b-9092fdd3a53c/blockmgr-0b6b81b5-574a-41e0-8e75-f5441d4c8671'
[2019-06-03 09:22:05,506] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.memory.MemoryStore] ({Thread-5}
Logging.scala[logInfo]:54) - 'MemoryStore started with capacity 366.3 MB'
[2019-06-03 09:22:05,526] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkEnv]
({Thread-5} Logging.scala[logInfo]:54) - 'Registering OutputCommitCoordinator'
[2019-06-03 09:22:05,632] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.util.log] ({Thread-5} Log.java[initialized]:192) -
'Logging initialized @2973ms'
[2019-06-03 09:22:05,726] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.Server] ({Thread-5} Server.java[doStart]:351) -
'jetty-9.3.z-SNAPSHOT, build timestamp: 2017-11-21T21:27:37Z, git hash:
82b8fb23f757335bb3329d540ce37a2a2615f0a8'
[2019-06-03 09:22:05,751] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.Server] ({Thread-5} Server.java[doStart]:419) -
'Started @3094ms'
[2019-06-03 09:22:05,772] [oehpcs-ui] [WARN] [] [org.apache.spark.util.Utils]
({Thread-5} Logging.scala[logWarning]:66) - 'Service 'SparkUI' could not bind
on port 4040. Attempting port 4041.'
[2019-06-03 09:22:05,780] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.AbstractConnector] ({Thread-5}
AbstractConnector.java[doStart]:278) - 'Started
ServerConnector@4c394b20{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}'
[2019-06-03 09:22:05,782] [oehpcs-ui] [INFO] [] [org.apache.spark.util.Utils]
({Thread-5} Logging.scala[logInfo]:54) - 'Successfully started service
'SparkUI' on port 4041.'
[2019-06-03 09:22:05,817] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@42a13299{/jobs,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,818] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@48ce5908{/jobs/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,819] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@407a7f1e{/jobs/job,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,820] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@5fb991dd{/jobs/job/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,821] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@6505953b{/stages,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,822] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@7614940a{/stages/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,823] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@6bef407d{/stages/stage,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,824] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@63c89b20{/stages/stage/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,825] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@7235d38c{/stages/pool,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,826] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@52c3284d{/stages/pool/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,827] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@14c5d76b{/storage,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,828] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@33f35f48{/storage/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,829] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@6f864309{/storage/rdd,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,830] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@244c9ff4{/storage/rdd/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,831] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@1e087a99{/environment,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,832] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@37a87f0f{/environment/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,833] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@6aebecf8{/executors,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,834] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@2a987d6e{/executors/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,840] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@77991616{/executors/threadDump,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,841] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@3bd5fed{/executors/threadDump/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,851] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@3684ef4e{/static,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,852] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@11a6ac4f{/,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,853] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@46cf502e{/api,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,854] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@4b69bda{/jobs/job/kill,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,855] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@6915bcd9{/stages/stage/kill,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:05,858] [oehpcs-ui] [INFO] [] [org.apache.spark.ui.SparkUI]
({Thread-5} Logging.scala[logInfo]:54) - 'Bound SparkUI to 0.0.0.0, and started
at http://spark-046a786055254e78a532d01f986fb700-drv:4041'
[2019-06-03 09:22:05,979] [oehpcs-ui] [INFO] []
[org.apache.spark.executor.Executor] ({Thread-5} Logging.scala[logInfo]:54) -
'Starting executor ID driver on host localhost'
[2019-06-03 09:22:06,066] [oehpcs-ui] [INFO] [] [org.apache.spark.util.Utils]
({Thread-5} Logging.scala[logInfo]:54) - 'Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 52446.'
[2019-06-03 09:22:06,067] [oehpcs-ui] [INFO] []
[org.apache.spark.network.netty.NettyBlockTransferService] ({Thread-5}
Logging.scala[logInfo]:54) - 'Server created on
spark-046a786055254e78a532d01f986fb700-drv:52446'
[2019-06-03 09:22:06,069] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManager] ({Thread-5} Logging.scala[logInfo]:54)
- 'Using org.apache.spark.storage.RandomBlockReplicationPolicy for block
replication policy'
[2019-06-03 09:22:06,099] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManagerMaster] ({Thread-5}
Logging.scala[logInfo]:54) - 'Registering BlockManager BlockManagerId(driver,
spark-046a786055254e78a532d01f986fb700-drv, 52446, None)'
[2019-06-03 09:22:06,104] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManagerMasterEndpoint]
({dispatcher-event-loop-2} Logging.scala[logInfo]:54) - 'Registering block
manager spark-046a786055254e78a532d01f986fb700-drv:52446 with 366.3 MB RAM,
BlockManagerId(driver, spark-046a786055254e78a532d01f986fb700-drv, 52446, None)'
[2019-06-03 09:22:06,112] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManagerMaster] ({Thread-5}
Logging.scala[logInfo]:54) - 'Registered BlockManager BlockManagerId(driver,
spark-046a786055254e78a532d01f986fb700-drv, 52446, None)'
[2019-06-03 09:22:06,114] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManager] ({Thread-5} Logging.scala[logInfo]:54)
- 'Initialized BlockManager: BlockManagerId(driver,
spark-046a786055254e78a532d01f986fb700-drv, 52446, None)'
[2019-06-03 09:22:06,356] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@76a03d48{/metrics/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:06,604] [oehpcs-ui] [INFO] []
[org.apache.spark.sql.internal.SharedState] ({Thread-5}
Logging.scala[logInfo]:54) - 'Setting hive.metastore.warehouse.dir ('null') to
the value of spark.sql.warehouse.dir ('file:/logs/spark-warehouse').'
[2019-06-03 09:22:06,604] [oehpcs-ui] [INFO] []
[org.apache.spark.sql.internal.SharedState] ({Thread-5}
Logging.scala[logInfo]:54) - 'Warehouse path is 'file:/logs/spark-warehouse'.'
[2019-06-03 09:22:06,616] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@61f5aef9{/SQL,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:06,616] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@534092cd{/SQL/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:06,617] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@6247408b{/SQL/execution,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:06,618] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@153e0bd3{/SQL/execution/json,null,AVAILABLE,@Spark}'
[2019-06-03 09:22:06,620] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.handler.ContextHandler] ({Thread-5}
ContextHandler.java[doStart]:781) - 'Started
o.s.j.s.ServletContextHandler@df87eca{/static/sql,null,AVAILABLE,@Spark}'
-----------------------
[2019-06-03 09:22:07,266] [oehpcs-ui] [INFO] []
[org.apache.spark.sql.execution.streaming.state.StateStoreCoordinatorRef]
({Thread-5} Logging.scala[logInfo]:54) - 'Registered StateStoreCoordinator
endpoint'
Our Spark version is 2.4.3
Spark context information: <SparkContext master=local[*] appName=hello_world>
parallelism=4 python version=3.6
---------------------------------
[2019-06-03 09:22:07,319] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkContext]
({Thread-1} Logging.scala[logInfo]:54) - 'Invoking stop() from shutdown hook'
[2019-06-03 09:22:07,327] [oehpcs-ui] [INFO] []
[org.spark_project.jetty.server.AbstractConnector] ({Thread-1}
AbstractConnector.java[doStop]:318) - 'Stopped
Spark@4c394b20{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}'
[2019-06-03 09:22:07,330] [oehpcs-ui] [INFO] [] [org.apache.spark.ui.SparkUI]
({Thread-1} Logging.scala[logInfo]:54) - 'Stopped Spark web UI at
http://spark-046a786055254e78a532d01f986fb700-drv:4041'
[2019-06-03 09:22:07,344] [oehpcs-ui] [INFO] []
[org.apache.spark.MapOutputTrackerMasterEndpoint] ({dispatcher-event-loop-3}
Logging.scala[logInfo]:54) - 'MapOutputTrackerMasterEndpoint stopped!'
[2019-06-03 09:22:07,355] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.memory.MemoryStore] ({Thread-1}
Logging.scala[logInfo]:54) - 'MemoryStore cleared'
[2019-06-03 09:22:07,356] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManager] ({Thread-1} Logging.scala[logInfo]:54)
- 'BlockManager stopped'
[2019-06-03 09:22:07,364] [oehpcs-ui] [INFO] []
[org.apache.spark.storage.BlockManagerMaster] ({Thread-1}
Logging.scala[logInfo]:54) - 'BlockManagerMaster stopped'
[2019-06-03 09:22:07,368] [oehpcs-ui] [INFO] []
[org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint]
({dispatcher-event-loop-0} Logging.scala[logInfo]:54) -
'OutputCommitCoordinator stopped!'
[2019-06-03 09:22:07,374] [oehpcs-ui] [INFO] [] [org.apache.spark.SparkContext]
({Thread-1} Logging.scala[logInfo]:54) - 'Successfully stopped SparkContext'
[2019-06-03 09:22:07,375] [oehpcs-ui] [INFO] []
[org.apache.spark.util.ShutdownHookManager] ({Thread-1}
Logging.scala[logInfo]:54) - 'Shutdown hook called'
[2019-06-03 09:22:07,376] [oehpcs-ui] [INFO] []
[org.apache.spark.util.ShutdownHookManager] ({Thread-1}
Logging.scala[logInfo]:54) - 'Deleting directory
/tmp/spark-b0ec3bd2-c469-4e69-88c5-83e3aca018a6'
[2019-06-03 09:22:07,379] [oehpcs-ui] [INFO] []
[org.apache.spark.util.ShutdownHookManager] ({Thread-1}
Logging.scala[logInfo]:54) - 'Deleting directory
/var/data/spark-cd530621-c059-4268-8f1b-9092fdd3a53c/spark-d13c0d59-c6d7-4336-8345-ffbd0418bb88/pyspark-7a23bc71-bb0f-487a-a3ab-41e808ae0d46'
[2019-06-03 09:22:07,382] [oehpcs-ui] [INFO] []
[org.apache.spark.util.ShutdownHookManager] ({Thread-1}
Logging.scala[logInfo]:54) - 'Deleting directory
/var/data/spark-cd530621-c059-4268-8f1b-9092fdd3a53c/spark-d13c0d59-c6d7-4336-8345-ffbd0418bb88'
{noformat}
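Not part of the original report, but the failure mode this issue describes (script output is printed, yet the shutdown hook is never called) can be mimicked in plain Python without Spark: a lingering non-daemon thread keeps the interpreter alive, so registered exit hooks never fire, much as a non-daemon JVM thread can keep the driver process from exiting. A minimal stdlib-only sketch, purely illustrative:

```python
# Hypothetical illustration (not Spark code): an exit hook only runs when
# the process actually terminates. A lingering non-daemon thread keeps the
# interpreter alive, so registered hooks never fire -- analogous to a
# driver pod that prints its output but never calls its shutdown hook.
import atexit
import threading
import time

def shutdown_hook():
    print("Shutdown hook called")

atexit.register(shutdown_hook)

# daemon=True: the thread does not block interpreter exit, so the hook
# fires as soon as the main thread finishes. With daemon=False, the
# interpreter would wait on the sleep and the hook would never run.
worker = threading.Thread(target=time.sleep, args=(3600,), daemon=True)
worker.start()

print("Main thread finished")
```

This may explain why the manual client-mode spark-submit above exits cleanly: in that invocation nothing keeps the process alive after the script returns, so the JVM shutdown hooks run as logged.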
> driver pod hangs with pyspark 2.4.3 and master on kubernetes
> -----------------------------------------------------------
>
> Key: SPARK-27927
> URL: https://issues.apache.org/jira/browse/SPARK-27927
> Project: Spark
> Issue Type: Bug
> Components: Kubernetes
> Affects Versions: 3.0.0, 2.4.3
> Environment: k8s 1.11.9
> spark 2.4.3 and master branch.
> Reporter: Edwin Biemond
> Priority: Major
>
> When we run a simple pyspark script on Spark 2.4.3 or 3.0.0, the driver pod
> hangs and never calls the shutdown hook.
> {code:java}
> #!/usr/bin/env python
> from __future__ import print_function
> import os
> import os.path
> import sys
> # Are we really in Spark?
> from pyspark.sql import SparkSession
> spark = SparkSession.builder.appName('hello_world').getOrCreate()
> print('Our Spark version is {}'.format(spark.version))
> print('Spark context information: {} parallelism={} python version={}'.format(
> str(spark.sparkContext),
> spark.sparkContext.defaultParallelism,
> spark.sparkContext.pythonVer
> ))
> {code}
> When we run this on Kubernetes, the driver and executor just hang, even
> though we see the output of the Python script.
> {noformat}
> bash-4.2# cat stdout.log
> Our Spark version is 2.4.3
> Spark context information: <SparkContext
> master=k8s://https://kubernetes.default.svc:443 appName=hello_world>
> parallelism=2 python version=3.6
> {noformat}
> What works:
> * a simple Python script with a print works fine on 2.4.3 and 3.0.0
> * the same setup on 2.4.0
> * a 2.4.3 spark-submit with the above pyspark script
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)