Anyone?
This question is not about my application running on top of Spark;
it is about upgrading Spark itself from 2.2 to 2.4.
I expected that Spark would at least recover gracefully from an upgrade and
restore its own persisted objects.
Dear all,
I already have a Python function that queries data from HBase and HDFS with
given parameters. This function returns a PySpark DataFrame and the
SparkContext it used.
With the client's increasing demands, I need to merge data from multiple
queries. I tested using the "union" function to m
From what I understand, the session is a singleton, so even if you think you
are creating new instances you are just reusing it.
On Wed, 29 Jan 2020 02:24:05 -1100, icbm0...@gmail.com wrote:
Dear all,
I already have a Python function that queries data from HBase and
HDFS w
I am on Kubernetes 1.17 in a small 4-node cluster. I am running Spark 2.4.4,
but with updated kubernetes-client jars to work around the 403 CVE issue.
I am running in a pod in the 'default' namespace of my cluster, in a Jupyter
notebook. I am trying to configure client mode so I can use PySpark
interacti
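A client-mode setup from inside a pod can be sketched roughly as below. This is an assumption about the intended configuration, not the poster's actual code: the image name and the "jupyter-driver" headless service (which the executors need in order to reach the driver pod) are hypothetical.

```python
from pyspark.sql import SparkSession

# Client-mode sketch for Spark on Kubernetes, run from inside a pod.
spark = (
    SparkSession.builder
    # In-cluster API server address; the pod's service account is used.
    .master("k8s://https://kubernetes.default.svc")
    .appName("notebook")
    .config("spark.submit.deployMode", "client")
    .config("spark.kubernetes.namespace", "default")
    # Hypothetical executor image matching the Spark version in use.
    .config("spark.kubernetes.container.image", "my-registry/spark-py:2.4.4")
    .config("spark.executor.instances", "2")
    # Hypothetical headless service exposing this notebook pod to executors.
    .config("spark.driver.host", "jupyter-driver.default.svc.cluster.local")
    .config("spark.driver.port", "29413")
    .getOrCreate()
)
```

In client mode the executors connect back to the driver, so `spark.driver.host` must resolve to the notebook pod from inside the cluster; a headless service is one common way to arrange that.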
Dear Yeikel,
I checked my code and it uses getOrCreate to create a SparkSession.
Therefore, I should be retrieving the same SparkSession instance every time I
call that method.
Thanks for the reminder.
Best regards
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
---
On Wed, Jan 29, 2020 at 5:02 PM pisymbol . wrote:
>
> The problem is that when Spark initializes I see the following error:
>
> io.fabric8.kubernetes.client.KubernetesClientException: pods is forbidden:
> User "system:serviceaccount:default:default" cannot watch resource "pods"
> in
> API group "" in
On Wed, Jan 29, 2020 at 9:58 PM pisymbol . wrote:
>
>
> On Wed, Jan 29, 2020 at 5:02 PM pisymbol . wrote:
>
>>
>> The problem is that when Spark initializes I see the following error:
>>
>> io.fabric8.kubernetes.client.KubernetesClientException: pods is forbidden:
>> User "system:serviceaccount:defau
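The "cannot watch resource pods" message is a Kubernetes RBAC denial for the pod's service account. One possible fix, sketched below as an assumption about the intended setup rather than a confirmed resolution of this thread, is to grant the default service account the pod-related permissions Spark's fabric8 client needs:

```shell
# Role and binding names below are arbitrary; run in the 'default' namespace.
kubectl create role spark-driver-role \
  --verb=get,list,watch,create,delete \
  --resource=pods,services,configmaps

kubectl create rolebinding spark-driver-rb \
  --role=spark-driver-role \
  --serviceaccount=default:default
```

A dedicated service account with a narrowly scoped role is generally preferable to widening permissions on `default`.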
unsubscribe
On Fri, Jan 17, 2020 at 11:39 AM Bruno S. de Barros wrote:
> - To unsubscribe e-mail: user-unsubscr...@spark.apache.org