Hi,
It's better to create a script that deletes the Kafka folder where the topic
lives and then recreates the topic if needed.
BR
Eduardo Costa Alfaia
Ph.D. Student in Telecommunications Engineering
Università degli Studi di Brescia
Tel: +39 3209333018
On 5/11/16, 09:48, "Sneh
Hi Aida,
The installation has detected Maven version 3.0.3. Update to 3.3.3 and
try again.
On 08/Mar/2016 14:06, "Aida" wrote:
> Hi all,
>
> Thanks everyone for your responses; really appreciate it.
>
> Eduardo - I tried your suggestions but ran into some issues,
Hi Aida
Run only "build/mvn -DskipTests clean package"
BR
Eduardo Costa Alfaia
Ph.D. Student in Telecommunications Engineering
Università degli Studi di Brescia
Tel: +39 3209333018
On 3/4/16, 16:18, "Aida" <aida1.tef...@gmail.com> wrote:
>Hi all,
Hi,
try http://OAhtvJ5MCA:8080
BR
On 2/19/16, 07:18, "vasbhat" wrote:
>OAhtvJ5MCA
--
Informativa sulla Privacy: http://www.unibs.it/node/8155
Hi Gourav,
I did a test as you said and it is working for me. I am using Spark in local mode,
master and worker on the same machine. I ran the example in spark-shell with
--packages com.databricks:spark-csv_2.10:1.3.0 without errors.
BR
From: Gourav Sengupta
Date: Monday,
Hi Guys,
How could I unsubscribe the email e.costaalf...@studenti.unibs.it? It is an
alias of my email e.costaalf...@unibs.it and it is registered on the mailing
list.
Thanks
Eduardo Costa Alfaia
PhD Student Telecommunication Engineering
Università degli Studi di Brescia-UNIBS
Hi Guys,
How could I solve this problem?
% Failed to produce message: Local: Queue full
% Failed to produce message: Local: Queue full
Thanks
Hi Magnus
I think the answer is this one:
c) producing messages at a higher rate than the network or broker can
handle
How could I manage this?
> On 26 Oct 2015, at 17:45, Magnus Edenhill wrote:
>
> c) producing messages at a higher rate than the network or broker can
> handle
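One way to manage it is to treat "Queue full" as backpressure: retry the produce call a bounded number of times, letting the sender drain the queue in between, instead of dropping the message. Below is a minimal Java sketch of that idea with a plain bounded queue standing in for the producer's local send queue; it is not the librdkafka API (there the equivalent is checking for the queue-full error, serving delivery reports with the poll call, and retrying):

```java
import java.util.concurrent.ArrayBlockingQueue;

public class ProduceRetry {
    // Stand-in for the producer's bounded local send queue
    // (queue.buffering.max.messages plays this role in librdkafka).
    static ArrayBlockingQueue<String> sendQueue = new ArrayBlockingQueue<>(2);

    // Stands in for the background sender delivering one message to the broker.
    static void drainOne() {
        sendQueue.poll();
    }

    // Instead of failing on "Queue full", retry a bounded number of times,
    // giving the sender a chance to drain the queue in between attempts.
    static boolean produceWithRetry(String msg, int maxRetries) {
        int attempt = 0;
        while (!sendQueue.offer(msg)) {
            if (attempt >= maxRetries) return false;
            attempt++;
            drainOne(); // real code would sleep/poll here, not drain itself
        }
        return true;
    }

    public static void main(String[] args) {
        boolean allSent = true;
        for (int i = 1; i <= 5; i++) {
            allSent &= produceWithRetry("msg-" + i, 3);
        }
        System.out.println("all sent: " + allSent);
    }
}
```

The other levers are making the queue bigger, batching more aggressively, or simply producing more slowly; which one is right depends on whether the bottleneck is the network or the broker.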
--
Thanks Ted.
On Feb 10, 2015, at 20:06, Ted Yu yuzhih...@gmail.com wrote:
Please take a look at:
examples/scala-2.10/src/main/java/org/apache/spark/examples/streaming/JavaDirectKafkaWordCount.java
which was checked in yesterday.
On Sat, Feb 7, 2015 at 10:53 AM, Eduardo Costa Alfaia
Hi Guys,
I have some doubts about Kafka. The first is: why do some applications
prefer to connect to ZooKeeper instead of the brokers? Connecting through
ZooKeeper could add overhead, because we are inserting another element between
producer and consumer. Another question is about the
Hi Guys,
How could I write in Java the Scala code below?
val KafkaDStreams = (1 to numStreams) map { _ =>
  KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](ssc,
    kafkaParams, topicMap, storageLevel = StorageLevel.MEMORY_ONLY).map(_._2)
}
val unifiedStream =
Hi Guys,
I'm getting this error in KafkaWordCount:
TaskSetManager: Lost task 0.0 in stage 4095.0 (TID 1281, 10.20.10.234):
java.lang.ClassCastException: [B cannot be cast to java.lang.String
I don't think so, Sean.
On Feb 5, 2015, at 16:57, Sean Owen so...@cloudera.com wrote:
Is SPARK-4905 / https://github.com/apache/spark/pull/4371/files the same
issue?
On Thu, Feb 5, 2015 at 7:03 AM, Eduardo Costa Alfaia
e.costaalf...@unibs.it wrote:
Hi Guys,
I’m getting this error
.
`DefaultDecoder` returns Array[Byte], not String, so the class cast here
will fail.
Thanks
Jerry
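In other words, the consumer has to decode the bytes itself; a cast alone cannot turn Array[Byte] into String. A minimal Java sketch of the explicit conversion, independent of the Kafka decoder API:

```java
import java.nio.charset.StandardCharsets;

public class DecodeBytes {
    // Convert what DefaultDecoder hands back (raw bytes) into a String.
    static String decode(byte[] payload) {
        // Wrong: (String) (Object) payload throws
        //   java.lang.ClassCastException: [B cannot be cast to java.lang.String
        // Right: build a String from the bytes with an explicit charset.
        return new String(payload, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] payload = "hello kafka".getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(payload));
    }
}
```

In the Spark call itself, passing StringDecoder as the decoder type parameters, as in KafkaUtils.createStream[String, String, StringDecoder, StringDecoder], performs this conversion for you.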
-Original Message-
From: Eduardo Costa Alfaia [mailto:e.costaalf...@unibs.it]
Sent: Friday, February 6, 2015 12:04 AM
To: Sean Owen
Cc: user@spark.apache.org
Subject: Re: Error
Hi Guys,
I would like to put the Kafka parameter into the KafkaWordCount Scala code: val
kafkaParams = Map("fetch.message.max.bytes" -> "400"). I've set this
variable like this:
val KafkaDStreams = (1 to numStreams) map { _ =>
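For reference, the parameter map is an ordinary string-to-string map that is handed to the stream factory. A small Java sketch of building such a map and overriding one entry per stream; the endpoint values are placeholders, not working addresses:

```java
import java.util.HashMap;
import java.util.Map;

public class KafkaParams {
    // Kafka consumer properties are a plain String -> String map;
    // the addresses below are placeholders.
    static Map<String, String> baseParams() {
        Map<String, String> p = new HashMap<>();
        p.put("zookeeper.connect", "localhost:2181");
        p.put("group.id", "test-consumer-group");
        p.put("fetch.message.max.bytes", "400");
        return p;
    }

    // Copy-and-override, so per-stream tuning does not mutate the base map.
    static Map<String, String> withOverride(Map<String, String> base,
                                            String key, String value) {
        Map<String, String> copy = new HashMap<>(base);
        copy.put(key, value);
        return copy;
    }

    public static void main(String[] args) {
        Map<String, String> tuned =
            withOverride(baseParams(), "fetch.message.max.bytes", "4000000");
        System.out.println(tuned.get("fetch.message.max.bytes"));
    }
}
```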
Hi Guys,
any idea how to solve this error?
[error]
/sata_disk/workspace/spark-1.1.1/examples/src/main/scala/org/apache/spark/examples/streaming/KafkaWordCount.scala:76:
missing parameter type for expanded function ((x$6, x$7) => x$6.$plus(x$7))
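This error means the Scala compiler cannot infer the parameter types of a shorthand function such as `_ + _` (expanded to `(x$6, x$7) => x$6.$plus(x$7)`) in that position; the usual fix is to annotate the parameters, e.g. `(a: Int, b: Int) => a + b`, or to give the function an expected type. Java lambdas follow the same rule, a lambda needs a target type to fix its parameter types, which this small sketch illustrates:

```java
import java.util.function.BinaryOperator;

public class InferDemo {
    // Without an expected type the compiler cannot infer the lambda's
    // parameter types -- the analogue of the Scala "missing parameter
    // type" error:
    //   Object bad = (x, y) -> x + y;   // does not compile

    // Giving the function an explicit functional-interface type fixes it:
    static BinaryOperator<Integer> add = (x, y) -> x + y;

    public static void main(String[] args) {
        System.out.println(add.apply(2, 3));
    }
}
```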
Hi All,
I am having an issue when using Kafka with librdkafka. I've changed
message.max.bytes to 2MB in my server.properties config file; that is the size
of my message. When I run the command line ./rdkafka_performance -C -t test -p
0 -b computer49:9092, after consuming some messages the
fetch.message.max.bytes=400 to your
command line.
Regards,
Magnus
2015-01-19 17:52 GMT+01:00 Eduardo Costa Alfaia e.costaalf...@unibs.it:
Hi All,
I am having an issue when using kafka with librdkafka. I've changed the
message.max.bytes to 2MB in my server.properties config file, that is the
size
Hi Guys,
I am doing some tests with JavaKafkaWordCount. My cluster is composed of 8
workers and 1 driver with spark-1.1.0. I am using Kafka too and I have some
questions about it.
1 - When I launch the command:
bin/spark-submit --class org.apache.spark.examples.streaming.JavaKafkaWordCount
Hi guys,
Were the Kafka examples in the master branch removed?
Thanks
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
Hi Guys
Could anyone explain this information to me?
208K), 0.0086120 secs] [Times: user=0.06 sys=0.00, real=0.01 secs]
2014-11-06T12:20:55.673+0100: 1256.382: [GC2014-11-06T12:20:55.674+0100:
1256.382: [ParNew: 551115K->2816K(613440K), 0.0204130 secs]
560218K->13933K(4126208K), 0.0205130 secs]
Hi Guys,
How could I use the Consumer and Producer configs in my Kafka environment?
Thanks
Hi Guys,
I am doing some tests with Spark Streaming and Kafka, but I have seen something
strange, I have modified the JavaKafkaWordCount to use ReducebyKeyandWindow and
to print in the screen the accumulated numbers of the words, in the beginning
spark works very well in each interaction the
.
On Thu, Nov 6, 2014 at 9:32 AM, Eduardo Costa Alfaia e.costaalf...@unibs.it
wrote:
Hi Guys,
I am doing some tests with Spark Streaming and Kafka, but I have seen
something strange, I have modified the JavaKafkaWordCount to use
ReducebyKeyandWindow and to print in the screen
in spark should grow up - The spark
word-count example doesn't accumulate.
It gets an RDD every n seconds and counts the words in that RDD. So we
don't expect the count to go up.
On Mon, Nov 3, 2014 at 6:57 AM, Eduardo Costa Alfaia
e.costaalf...@unibs.it
wrote:
Hi Guys,
Anyone could
Hi Dudes,
I would like to know if the producer and consumer properties files in the
config folder should be configured. I have configured only
server.properties; is that enough? I am doing some tests on performance,
for example network throughput. My scenario is:
As producer I
Hi Guys,
Could anyone explain to me how to make Kafka work with Spark? I am using
JavaKafkaWordCount.java as a test and the command line is:
./run-example org.apache.spark.streaming.examples.JavaKafkaWordCount
spark://192.168.0.13:7077 computer49:2181 test-consumer-group unibs.it 3
and as a
Hi Guys,
Is there a way to clean a Kafka queue after the consumer has consumed the
messages?
Thanks
the log
more info on that config here
https://kafka.apache.org/08/configuration.html
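For completeness, the retention knobs that make the broker delete old log segments on its own look like this in server.properties; the numbers below are illustrative only:

```properties
# Delete closed log segments after 1 hour, or once the partition log
# exceeds 1 GB, whichever comes first.
log.retention.hours=1
log.retention.bytes=1073741824
log.cleanup.policy=delete
```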
if you want to delete a message after the consumer has processed it,
there is no API for that.
-Harsha
On Tue, Oct 21, 2014, at 08:00 AM, Eduardo Costa Alfaia wrote:
Hi Guys,
Is there a manner of cleaning
Hi TD,
I have sent more information now, using 8 workers. The gap is 27 seconds now.
Have you seen it?
Thanks
BR
Ok Andrew,
Thanks
I sent information on the test with 8 workers, and the gap has grown.
On May 4, 2014, at 2:31, Andrew Ash and...@andrewash.com wrote:
From the logs, I see that the print() starts printing stuff 10 seconds
after the context is started. And that 10 seconds is taken by the
. And that does
not seem to be a persistent problem as after that 10 seconds, the data is
being received and processed.
TD
On Fri, May 2, 2014 at 2:14 PM, Eduardo Costa Alfaia e.costaalf...@unibs.it
wrote:
Hi TD,
I got more information today using Spark 1.0 RC3, and the situation
Hi TD,
In my tests with Spark Streaming, I'm using the JavaNetworkWordCount (modified)
code and a program that I wrote that sends words to the Spark worker; I use TCP
as transport. I verified that after starting Spark, it connects to my source,
which actually starts sending, but the first word count
no
room for processing the received data. It could be that after 30 seconds, the
server disconnects, the receiver terminates, releasing the single slot for
the processing to proceed.
TD
On Tue, Apr 29, 2014 at 2:28 PM, Eduardo Costa Alfaia
e.costaalf...@unibs.it wrote:
Hi TD
are facing?
TD
On Fri, Apr 4, 2014 at 8:03 AM, Eduardo Costa Alfaia
e.costaalf...@unibs.it wrote:
Hi guys,
I would like to know if this piece of code is right to use with window().
JavaPairDStream<String, Integer> wordCounts = words.map(
new
Hi Guys,
I would like to understand why the driver's RAM goes down. Does the
processing occur only in the workers?
Thanks
# Start Tests
computer1(Worker/Source Stream)
23:57:18 up 12:03, 1 user, load average: 0.03, 0.31, 0.44
total used free shared
Hi all,
Could anyone explain the lines below to me?
computer1 - worker
computer8 - driver(master)
14/04/04 14:24:56 INFO BlockManagerMasterActor$BlockManagerInfo: Added
input-0-1396614314800 in memory on computer1.ant-net:60820 (size: 1262.5
KB, free: 540.3 MB)
14/04/04 14:24:56 INFO
Hi all,
I am doing some tests using JavaNetworkWordCount and I have some
questions about the machine's performance; my tests take
approximately 2 min.
Why does the RAM decrease significantly? I have done tests with 2 and 3
machines and I got the same behavior.
What should I
Hi Guys,
Could anyone help me understand this driver behavior when I start the
JavaNetworkWordCount?
computer8
16:24:07 up 121 days, 22:21, 12 users, load average: 0.66, 1.27, 1.55
total used free shared buffers
cached
Mem: 5897
Hi all,
I have put this line in my spark-env.sh:
-Dspark.default.parallelism=20
Is this parallelism level correct?
The machine's processor is a dual core.
Thanks
Hi Guys,
Could anyone explain this behavior to me? After 2 min of tests:
computer1- worker
computer10 - worker
computer8 - driver(master)
computer1
18:24:31 up 73 days, 7:14, 1 user, load average: 3.93, 2.45, 1.14
total used free shared buffers
cached
, Eduardo Costa Alfaia
e.costaalf...@unibs.it wrote:
Hi all,
I have put this line in my spark-env.sh:
-Dspark.default.parallelism=20
Is this parallelism level correct?
The machine's processor is a dual core.
Thanks
problem you are facing?
TD
On Fri, Apr 4, 2014 at 8:03 AM, Eduardo Costa Alfaia
e.costaalf...@unibs.it wrote:
Hi guys,
I would like to know if this piece of code is right to use with window().
JavaPairDStream<String, Integer> wordCounts = words.map
Hi Guys
I would like to print the content of `line` in:
JavaDStream<String> lines = ssc.socketTextStream(args[1],
Integer.parseInt(args[2]));
JavaDStream<String> words = lines.flatMap(new
FlatMapFunction<String, String>() {
@Override
public Iterable<String> call(String x) {
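To see the content of `line`, the flatMap body can print the raw line before splitting it. A self-contained Java sketch of that body (no Spark types involved, so it can be tried on plain strings):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SplitTrace {
    // Plain-Java stand-in for the flatMap body: print the raw line, then
    // split it into words, so you can see exactly what the receiver got.
    static List<String> splitAndTrace(String line) {
        System.out.println("line = '" + line + "'");
        return Arrays.asList(line.split(" "));
    }

    public static void main(String[] args) {
        List<String> words = new ArrayList<>();
        for (String line : Arrays.asList("hello spark streaming", "hello kafka")) {
            words.addAll(splitAndTrace(line));
        }
        System.out.println(words);
    }
}
```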
Thank you very much, Sourav.
BR
On 3/26/14, 17:29, Sourav Chandra wrote:
def print() {
  def foreachFunc = (rdd: RDD[T], time: Time) => {
    val total = rdd.collect().toList
    println("---------------------------------------")
    println("Time: " + time)
    println
Hi Guys,
I think I have already asked this question, but I don't remember if anyone
answered me. I would like to change, in the print() function, the
number of words and frequencies that are sent to the driver's
screen. The default value is 10.
Could anyone help me with this?
Best
Hi Guys,
Could anyone help me understand the piece of log in red below? Why did this
happen?
Thanks
14/03/10 16:55:20 INFO SparkContext: Starting job: first at
NetworkWordCount.scala:87
14/03/10 16:55:20 INFO JobScheduler: Finished job streaming job
1394466892000 ms.0 from job set of time
Yes TD,
I can use tcpdump to see whether the data are being accepted by the receiver
and whether they are arriving in the IP packets.
Thanks
On 3/8/14, 4:19, Tathagata Das wrote:
I am not sure how to debug this without any more information about the
source. Can you monitor on the receiver side
Hi Guys,
Could anyone help me understand the logs below? Why is the result in the
second log 0?
Thanks Guys
14/02/20 19:06:00 INFO JobScheduler: Finished job streaming job
1392919557000 ms.0 from job set of time 1392919557000 ms
14/02/20 19:06:00 INFO JobScheduler: Total delay: 3.185 s
Hi Guys,
I am doing some tests with the NetworkWordCount Scala code, where I am
counting and summing a stream of words received from the network using the
foreach action; thanks TD. I began with this scenario: 1
master + 1 worker (also acting as the stream source), and I obtained
the
Hi Sai
Have you already tried running with JDK-7?
BR
On Feb 11, 2014, at 6:00, Sai Prasanna ansaiprasa...@gmail.com wrote:
When I ran sbt/sbt assembly after installing Scala 2.9.3 and downloading
Spark 0.8.1 binaries, with JDK 6 installed, for a standalone Spark, I got
the following
Hi Guys,
I am getting this error when I compile NetworkWordCount.scala:
[info] Compiling 1 Scala source to
/opt/unibs_test/incubator-spark-tdas/examples/target/scala-2.10/classes...
[error]
on replication if you want
fault-tolerance.
TD
On Mon, Feb 3, 2014 at 3:19 PM, Eduardo Costa Alfaia e.costaalf...@unibs.it
wrote:
Hi Tathagata,
You were right when you told me to use Scala instead of Java; Scala
is very easy. I have implemented the code you gave (in bold
Hi Guys,
I'm not a very good Java programmer, so could anybody help me with this
piece of code from JavaNetworkWordCount:
JavaPairDStream<String, Integer> wordCounts = words.map(
new PairFunction<String, String, Integer>() {
@Override
public Tuple2<String, Integer> call(String
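The snippet is building (word, 1) pairs that are then summed per key. The same map-then-reduce shape can be tried on plain Java collections; this sketch is a stand-in for the Spark version, not the DStream API itself:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class WordPairs {
    // Stand-in for words.map(w -> (w, 1)) followed by reduceByKey(_ + _):
    // count the occurrences of each word in one batch.
    static Map<String, Integer> countWords(List<String> words) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String w : words) {
            counts.merge(w, 1, Integer::sum); // sum the (w, 1) pairs per key
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(countWords(Arrays.asList("hello", "world", "hello")));
    }
}
```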
= rdd.take(n)
println("First N elements = " + firstN)
// Count the number of elements in each batch
println("RDD has " + rdd.count() + " elements")
})
Alternatively, just for printing the counts, you can also do
yourDStream.count.print()
Hope this helps!
TD
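The take-then-print pattern works on any collection, so it can be tested outside Spark. A small Java stand-in for the per-batch logic, with `firstN` replacing print()'s fixed limit of 10:

```java
import java.util.Arrays;
import java.util.List;

public class FirstN {
    // Stand-in for the foreachRDD pattern above: take the first n elements
    // of a batch instead of print()'s fixed 10.
    static <T> List<T> firstN(List<T> batch, int n) {
        return batch.subList(0, Math.min(n, batch.size()));
    }

    public static void main(String[] args) {
        List<String> batch = Arrays.asList("hello", "spark", "kafka");
        System.out.println("First 2 elements = " + firstN(batch, 2));
        System.out.println("Batch has " + batch.size() + " elements");
    }
}
```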
2014/1/20 Eduardo Costa
can also do
yourDStream.count.print()
Hope this helps!
TD
2014/1/20 Eduardo Costa Alfaia e.costaalf...@studenti.unibs.it
Hi guys,
Could somebody help me? Where do I change the print() function to print
more
than 10 lines on screen? Is there a way
2014/1/20 Eduardo Costa Alfaia e.costaalf...@studenti.unibs.it
Hi guys,
Could somebody help me? Where do I change the print() function to print more
than 10 lines on screen? Is there a way to print the total count of all
words in a batch?
Best Regards
Hi guys,
Could somebody help me? Where do I change the print() function to print more than
10 lines on screen? Is there a way to print the total count of all words in
a batch?
Best Regards
--
---
INFORMATIVA SUL TRATTAMENTO DEI DATI PERSONALI
I dati utilizzati per l'invio del presente
On Wed, Jan 15, 2014 at 6:01 AM, Eduardo Costa Alfaia
e.costaalf...@studenti.unibs.it wrote:
Hi Guys,
I made some changes in JavaNetworkWordCount for my research on streaming
processing, and I have added the following lines (in red) to the code:
ssc1.checkpoint(hdfs://computer22:54310
Hi Dears,
I recently (yesterday) cloned the master incubator-spark from GitHub. I
am configuring Spark with ZooKeeper. I have read the Spark standalone
documentation, and I did the ZooKeeper installation and configuration; it
works. I am using 2 master nodes, but when I start