Hi David,
What kind of incompatibility problems are you having with Apache Livy?
BR,
Liana
From: David Espinosa
Sent: 15 March 2018 12:06:20
To: user@spark.apache.org
Subject: What's the best way to have Spark as a service?
Hi all,
I'm quite new to Spark, and I w
DISCLAIMER: This message may contain confidential information. If you are not
the intended recipient, please delete it and notify us immediately at the
following address: le...@eurecat.org If the recipient of this message does not
consent to the use
CoarseGrainedScheduler`. So, it's probably
something related to the resources.
From: Timur Shenkao
Sent: 10 January 2018 20:07:37
To: Liana Napalkova
Cc: user@spark.apache.org
Subject: Re: py4j.protocol.Py4JJavaError: An error occurred while calling
o794.parquet
Hello,
Has anybody faced the following problem in PySpark? (Python 2.7.12):
df.show() # works fine and shows the first 5 rows of DataFrame
df.write.parquet(outputPath + '/data.parquet', mode="overwrite") # throws
the error
The last line throws the following error:
py4j.protocol.Py4J
Thanks, Timur.
The problem is that if I run `foreachPartition`, then I cannot `start` the
streaming query. Or perhaps I am missing something.
From: Timur Shenkao
Sent: 18 December 2017 16:11:06
To: Liana Napalkova
Cc: Silvio Fiorito; user@spark.apache.org
Subject
If there is no other way, then I will follow this recommendation.
From: Silvio Fiorito
Sent: 18 December 2017 16:20:03
To: Liana Napalkova; user@spark.apache.org
Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
Couldn’t you readStream from
From: Silvio Fiorito
Sent: 18 December 2017 16:00:39
To: Liana Napalkova; user@spark.apache.org
Subject: Re: How to properly execute `foreachPartition` in Spark 2.2
Why don’t you just use the Kafka sink for Spark 2.2?
https://spark.apache.org/docs/2.2.0
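For reference, writing a streaming DataFrame to the built-in Kafka sink (available since Spark 2.2) looks roughly like this. A minimal sketch, assuming a streaming DataFrame `df` already exists; the column expressions, topic name, broker address, and checkpoint path are illustrative placeholders, not taken from the thread:

```python
# Sketch of the Structured Streaming Kafka sink (Spark 2.2+).
# The Kafka sink expects string/binary "key" and "value" columns.
query = (
    df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
      .writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
      .option("topic", "my-topic")                        # placeholder topic
      .option("checkpointLocation", "/tmp/kafka-sink-ckpt")
      .start()
)
query.awaitTermination()
```

With this sink there is no need for a hand-rolled `ForeachWriter` wrapping a `KafkaProducer`; Spark manages producer lifecycle and delivery per micro-batch.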
var producer: KafkaProducer[String, String] = _

def this(producer: KafkaProducer[String, String]) {
  this()
  this.producer = producer
}

override def process(row: String): Unit = {
  // send the row to Kafka here
}

override def close(errorOrNull: Throwable): Unit = {}

override def open(partitionId: Long, version: Long): Boolean = true