Hi,
I was able to successfully build the project (source code) from IntelliJ.
But when I try to run any of the examples in the $SPARK_HOME/examples
folder, I get a different error for each example job.
For example, for the StructuredKafkaWordCount example:
Exception in thread "main" ja
Did you follow the guide in `IDE Setup` -> `IntelliJ` section of
http://spark.apache.org/developer-tools.html ?
Best,
Dongjoon.
On Wed, Jun 28, 2017 at 5:13 PM, satyajit vegesna <
satyajit.apas...@gmail.com> wrote:
> Hi All,
>
> When i try to build source code of apache spark code from
> https:
Hi All,
When I try to build the source code of Apache Spark from
https://github.com/apache/spark.git, I get the errors below:
Error:(9, 14) EventBatch is already defined as object EventBatch
public class EventBatch extends org.apache.avro.specific.SpecificRecordBase
implements org.apache.avro
Hi All,
I am trying to build the kafka-0-10-sql module under the external folder of the
Apache Spark source code.
Once I generate the jar file using
build/mvn package -DskipTests -pl external/kafka-0-10-sql
the jar file is created under external/kafka-0-10-sql/target.
And I try to run spark-shell with the jars create
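For reference, the build-and-run sequence I would expect looks roughly like the following. The `-am` flag and the exact jar file name are assumptions (the artifact name depends on the Scala and Spark versions you built against), so check the `target` directory for the actual name:

```shell
# Build the Kafka SQL module plus the local modules it depends on (-am = also-make):
build/mvn -DskipTests -pl external/kafka-0-10-sql -am package

# Launch spark-shell with the freshly built jar on the classpath.
# The jar name below is illustrative; list external/kafka-0-10-sql/target to find yours.
bin/spark-shell --jars external/kafka-0-10-sql/target/spark-sql-kafka-0-10_2.11-2.3.0-SNAPSHOT.jar
```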
Hi,
I have a question: I am setting a job group via setJobGroup that I want to cancel from
within a Future.
The thing is, that future is composed through several map, flatMap, and
for-comprehension steps, and even andThen side effects. I am unable to cancel the job
properly within the group, and I am unsure whether this is
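One thing worth noting here: setJobGroup is thread-local, so the group only applies to Spark actions triggered from the same thread that set it. When the action runs inside a Future (which may hop between ExecutionContext threads across map/flatMap steps), the group should be set inside the body that actually triggers the action. A minimal sketch, assuming an existing SparkSession and a hypothetical group id "my-group":

```scala
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("cancel-demo").getOrCreate()
val sc = spark.sparkContext

val work = Future {
  // setJobGroup is thread-local: call it on the thread that runs the action,
  // i.e. inside the Future body, not before constructing the Future.
  sc.setJobGroup("my-group", "cancellable work", interruptOnCancel = true)
  sc.parallelize(1 to 1000000).map(_ * 2).count()
}

// From any other thread, cancel all jobs tagged with that group id:
sc.cancelJobGroup("my-group")
```

If the composition moves the action onto a different thread than the one that called setJobGroup, the job ends up outside the group, which would explain cancellation silently not taking effect.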
Oh, sorry, I was wrong. Concurrent collections in Scala have been available since
2.8. Any objections to replacing the mutable list and the synchronized blocks with a
concurrent collection, e.g. one like (or based on) TrieMap?
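For illustration, a sketch of what the replacement could look like: a TrieMap used as a concurrent set in place of a mutable list guarded by synchronized blocks. The element type and names here are placeholders, not the actual Spark internals under discussion:

```scala
import scala.collection.concurrent.TrieMap

// TrieMap used as a concurrent set: thread-safe without explicit locking.
val listeners = TrieMap.empty[String, Unit]

// Add and remove from any thread, no synchronized blocks needed.
listeners.put("listenerA", ())
listeners.put("listenerB", ())
listeners.remove("listenerA")

// Iteration works on a consistent snapshot even under concurrent modification.
listeners.keys.foreach(println)
```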
On Wed, Jun 28, 2017, 16:05 Oleksandr Vayda
wrote:
> Cool. I will be happy to create
Hey guys,
I need to capture runtime stats and metrics for every executed SQL query.
For that I'm using two listeners: the QueryExecutionListener, which gives me
information about the executed QueryExecution, and the SparkListener, which
provides info about the jobs, stages, and tasks.
The problem I
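For context, wiring up the two listeners looks roughly like this sketch (assuming an existing SparkSession; the println bodies stand in for whatever metrics sink is actually used):

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

val spark = SparkSession.builder().appName("metrics").getOrCreate()

// Fires once per query, with the full QueryExecution (logical/physical plans).
spark.listenerManager.register(new QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit =
    println(s"query $funcName took ${durationNs / 1e6} ms")
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit =
    println(s"query $funcName failed: $exception")
})

// Fires for every job/stage/task, but carries no direct link back to the query.
spark.sparkContext.addSparkListener(new SparkListener {
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"job ${jobEnd.jobId} finished")
})
```

The usual difficulty with this pairing is correlating the two streams, since the SparkListener events do not reference the QueryExecution that spawned them.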
Cool. I will be happy to create a PR. The simplest and most obvious
solution that came to my mind was to use Java concurrent collections instead
of the Scala mutable ones. Would you mind having this bit of Java inside Spark? :)
Or perhaps we could use Scala concurrent collections, but they are only
availabl