I think that a big advantage of not using Spark Streaming, when your solution is
already based on Kafka, is that you don't have to deal with another cluster. I
mean ...
Imagine that your solution is already based on Kafka as the ingestion system for
your events and then you need to do some real time
No, it's running in standalone mode as a Docker image on Kubernetes.
The only way I found was to access the "stderr" file created under the "work"
directory in SPARK_HOME but ... is it the right way?
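One common option in a containerized deployment is to route logs to the console, so they show up in `docker logs` / `kubectl logs` rather than only in the per-application "work" directory. A minimal sketch of a `log4j.properties`, using the appender settings from Spark's own `conf/log4j.properties.template`:

```properties
# Send everything at INFO and above to the console (stderr), which the
# container runtime captures.
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

The file can be passed to the driver and executors via `spark.driver.extraJavaOptions` / `spark.executor.extraJavaOptions` with `-Dlog4j.configuration=`.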
Paolo Patierno
Senior Software Engineer (IoT) @ Red Hat
Microsoft MVP on Windows Embedded & IoT
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
Thanks
Paolo.
Sorry I missed the "return" on the last line ... coming from Scala :-)
Paolo Patierno
Senior Software Engineer (IoT) @ Red Hat
Microsoft MVP on Windows Embedded & IoT
Microsoft Azure Advisor
Twitter : @ppatierno
Linkedin : paolopatierno
Blog : DevExperience
From: ppatie...@live.com
Hi all,
I wrote the following function in Python:

from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext

def createStreamingContext():
    conf = SparkConf().setMaster("local[2]").setAppName("my_app")
    conf.set("spark.streaming.receiver.writeAheadLog.enable", "true")
    sc = SparkContext(conf=conf)
    ssc = StreamingContext(sc, 1)
    return ssc
Hi all,
what is the difference between JavaReceiverInputDStream and JavaDStream?
I see that the latter is always used in all custom receivers when
createStream is going to be used from Python.
Thanks,
Paolo.
Hi all,
I'm working on an AMQP extension for Apache Spark Streaming, developing a
reliable receiver for that.
After MQTT support (I see it in the Apache Bahir repository), another
messaging/IoT protocol could be very useful for the Apache Spark Streaming
ecosystem. Out there a lot of
> https://github.com/apache/spark/blob/2.0.0-preview/pom.xml#L164
>
> but was updated to 2.6.5 soon after that, since it was 2.6.5 in 2.0.0-RC1:
>
> https://github.com/apache/spark/blob/v2.0.0-rc1/pom.xml#L163
>
> On Sat, Jul 2, 2016 at 3:04 PM, Paolo Patierno <ppatie...@live.com> wrote:
>
> The Jackson version is 2.6.5 in master
> and branch-2.0, and jackson-module-scala is managed to this version
> along with all the other jackson artifacts.
>
> On Sat, Jul 2, 2016 at 1:35 PM, Paolo Patierno <ppatie...@live.com> wrote:
> > What I see is the following ...
> >
What I see is the following ...
- Working configuration
Spark Version : "2.0.0-SNAPSHOT"
The Vert.x library brings ...
jackson-annotations:2.6.0
jackson-core:2.6.1
jackson-databind:2.6.1
Spark brings
jackson-annotations:2.6.5
jackson-core:2.6.5
jackson-databind:2.6.5
jackson-module-scala_2.11:
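When two dependency trees pull in different Jackson versions like this, one option in an sbt build is to pin every Jackson artifact to the version Spark manages. A sketch for an sbt 0.13 build.sbt (the exact artifact list is an assumption based on the versions listed above):

```scala
// build.sbt (sketch): force a single Jackson version so the Vert.x and
// Spark transitive dependencies agree at runtime.
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.5"
)
```

Jackson does not guarantee compatibility across minor versions on the classpath, so mixing 2.6.1 and 2.6.5 artifacts is a plausible source of the runtime error.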
Hi,
developing a custom receiver, up to today I used Spark version "2.0.0-SNAPSHOT"
and Scala version 2.11.7.
With these versions all tests worked fine.
I have just switched to "2.0.0-preview" as the Spark version, but now I have the
following error:
An exception or error caused a run to abort: class
Hi,
following the socketStream[T] function implementation from the official Spark
GitHub repo:
def socketStream[T](
    hostname: String,
    port: Int,
    converter: JFunction[InputStream, java.lang.Iterable[T]],
Logging:
private[spark] trait Logging {
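The converter argument above is what turns the raw socket InputStream into an iterable of records. A dependency-free sketch of such a conversion in plain Java (class and method names are illustrative; in the actual socketStream call this logic would sit inside the JFunction):

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class LineConverter {
    // Reads the stream line by line and collects the lines into a List,
    // mirroring what a socketStream converter has to produce: an
    // Iterable of records decoded from the raw bytes.
    public static List<String> convert(InputStream in) throws Exception {
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8));
        List<String> lines = new ArrayList<>();
        String line;
        while ((line = reader.readLine()) != null) {
            lines.add(line);
        }
        return lines;
    }
}
```

Each element the converter yields becomes one record in the resulting DStream.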
FYI
On Mon, Jun 27, 2016 at 8:20 AM, Paolo Patierno <ppatie...@live.com> wrote:
Hello,
I'm trying to use the Utils.createTempDir() method, importing
org.apache.spark.util.Utils, but the Scala compiler tells me that:
object Utils in package util cannot be accessed in package org.apache.spark.util
I'm facing the same problem with Logging.
My sbt file has the following dependency
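Since org.apache.spark.util.Utils is private to Spark, user code needs its own temp-directory helper. A minimal JDK-only sketch (class name is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirs {
    // Public JDK alternative to the private Utils.createTempDir():
    // creates a fresh temp directory and asks the JVM to delete it on exit.
    public static Path createTempDir(String prefix) throws IOException {
        Path dir = Files.createTempDirectory(prefix);
        dir.toFile().deleteOnExit();
        return dir;
    }
}
```

Note that deleteOnExit() only removes the directory itself if it is empty at shutdown; tests that populate it should clean up their files explicitly.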
<yuzhih...@gmail.com> wrote:
See this related thread:
http://search-hadoop.com/m/q3RTtEor1vYWbsW=RE+Configuring+Log4J+Spark+1+5+on+EMR+4+1+
On Fri, Jun 24, 2016 at 6:07 AM, Paolo Patierno <ppatie...@live.com> wrote:
Hi,
developing a Spark Streaming custom receiver I noticed that the Logging
Hi,
developing a Spark Streaming custom receiver I noticed that the Logging trait
isn't accessible anymore in Spark 2.0.
trait Logging in package internal cannot be accessed in package
org.apache.spark.internal
For developing a custom receiver, what is the preferred way of logging? Just
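Since Spark 2.0 made org.apache.spark.internal.Logging private to Spark, a receiver has to bring its own logger. A dependency-free sketch using java.util.logging (class and method names are illustrative; slf4j, which Spark itself ships on its classpath, is another common choice):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Stand-in for the logging that the now-private Logging trait used to
// provide to custom receivers.
public class MyReceiverLogging {
    private static final Logger LOG =
            Logger.getLogger(MyReceiverLogging.class.getName());

    // Would be called from the receiver's onStart() hook.
    public void onStart() {
        LOG.log(Level.INFO, "receiver started");
    }
}
```

Keeping the Logger static avoids serialization issues when the enclosing receiver is shipped to executors.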