Hello guys,
I have a job that reads compressed (Snappy) data, but when I run the job, it
throws the error "native snappy library not available: this version
of libhadoop was built without snappy support".
I followed these instructions, but they did not resolve the issue:
https://community.hortonwo
Hello all,
What I want to do is configure the Spark Streaming job to read the
database using JdbcRDD and cache the results. This should happen only once,
at the start of the job; it should not make any further connections to the DB
afterwards. Is it possible to do that?
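It is possible in principle: load the reference data once while the driver starts up, cache the result, and reuse it in every batch. Below is a minimal, Spark-independent sketch of that load-once pattern in plain Python — the `fake_db_read` stub and the `OnceLoadedCache` class are hypothetical stand-ins for the JdbcRDD read followed by `cache()`, not Spark API:

```python
# Sketch: fetch reference data exactly once at startup and reuse it
# in every subsequent micro-batch. In Spark this would be a JdbcRDD
# read followed by .cache(); here the DB read is a stub so the
# pattern itself is runnable anywhere.

class OnceLoadedCache:
    def __init__(self, loader):
        self._loader = loader          # e.g. the JdbcRDD/JDBC read
        self._data = None
        self.load_count = 0            # how many times we hit the DB

    def get(self):
        if self._data is None:         # only the first call hits the DB
            self._data = self._loader()
            self.load_count += 1
        return self._data

def fake_db_read():
    # Hypothetical stand-in for the real query against the tables.
    return {"table_a": [1, 2], "table_b": [3], "table_c": [4, 5]}

cache = OnceLoadedCache(fake_db_read)
for _batch in range(5):                # five streaming micro-batches
    reference = cache.get()            # same cached object every time
print(cache.load_count)                # prints 1 -- the DB was read once
```

The key design point is the same in Spark: keep the cached RDD reference in the driver and close over it in each batch, rather than re-creating the JdbcRDD inside the streaming operation.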
message. If not, continue summing the
> others.
>
> I can provide scala samples, my java is beyond rusty :)
>
> -adrian
>
> From: Uthayan Suthakar
> Date: Friday, October 23, 2015 at 2:10 PM
> To: Sander van Dijk
> Cc: user
> Subject: Re: [Spark Streaming] How do we rese
> rusty :) and perhaps others can suggest better spark streaming methods that
> can be used, but hopefully the idea is clear.
>
> Sander
>
> On Thu, Oct 22, 2015 at 4:06 PM Uthayan Suthakar <
> uthayan.sutha...@gmail.com> wrote:
>
>> Hello guys,
>>
>>
I need to take the value from an RDD and update the state of the other RDD.
Is this possible?
On 22 October 2015 at 16:06, Uthayan Suthakar
wrote:
> Hello guys,
>
> I have a stream job that will carryout computations and update the state
> (SUM the value). At some point, I would l
Hello guys,
I have a stream job that carries out computations and updates the state
(SUMs the value). At some point, I would like to reset the state. I could
drop the state by setting it to 'None', but I don't want to drop it. I would
like to keep the state but update it.
For example:
JavaPairD
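One common way to do this with an `updateStateByKey`-style update function is to inject a special "reset" marker into the stream and have the function start the sum over from zero when it sees it, instead of returning None and losing the key. The sketch below shows only the update function's logic in plain Python — the `RESET` sentinel and the function name are illustrative, not Spark API:

```python
# Sketch of an updateStateByKey-style update function that can reset
# the running SUM without dropping the key. RESET is a sentinel value
# injected into the stream when the state should start over.

RESET = object()

def update_state(new_values, current_state):
    """Return the new running sum for one key.

    new_values    -- values that arrived in this batch (may contain RESET)
    current_state -- previous sum, or None if the key is new
    """
    total = current_state or 0
    for v in new_values:
        if v is RESET:
            total = 0          # keep the key, restart the sum
        else:
            total += v
    return total

print(update_state([1, 2], None))   # 3
print(update_state([4], 3))         # 7
print(update_state([RESET, 5], 7))  # 5  (reset, then resume summing)
```

Because the function always returns a number (never None), the key's state is retained across the reset rather than being removed from the state RDD.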
ctober 2015 at 21:02, Tathagata Das wrote:
> Are you sure that there are not log4j errors in the driver logs? What if
> you try enabling debug level? And what does the streaming UI say?
>
>
> On Mon, Oct 12, 2015 at 12:50 PM, Uthayan Suthakar <
> uthayan.sutha...@gmail.com>
Any suggestions? Is there any way that I could debug this issue?
Cheers,
Uthay
On 11 October 2015 at 18:39, Uthayan Suthakar
wrote:
> Hello all,
>
> I have a Spark Streaming job that run and produce results successfully.
> However, after a few days the job stop producing any outpu
Hello all,
I have a Spark Streaming job that runs and produces results successfully.
However, after a few days the job stops producing any output. I can see the
job is still running (polling data from Flume, completing jobs and its
subtasks); however, it is failing to produce any output. I have to restart it.
> On Fri, Sep 25, 2015 at 10:22 AM, Tathagata Das
> wrote:
>
>> Are you by any chance setting DStream.remember() with null?
>>
>> On Thu, Sep 24, 2015 at 5:02 PM, Uthayan Suthakar <
>> uthayan.sutha...@gmail.com> wrote:
>>
>>> Hello all,
>>
Hello all,
My Stream job is throwing the below exception at every interval. It first
deletes the checkpoint file and then tries to checkpoint; is
this normal behaviour? I'm using Spark 1.3.0. Do you know what may cause
this issue?
15/09/24 16:35:55 INFO scheduler.TaskSetManager: Finishe
will be read from disk -- better than recomputing in most cases.
>
> On Tue, Sep 22, 2015 at 4:20 AM, Uthayan Suthakar <
> uthayan.sutha...@gmail.com> wrote:
>
>>
>> Hello All,
>>
>> We have a Spark Streaming job that reads data from DB (three tables) and
>>
Hello All,
We have a Spark Streaming job that reads data from the DB (three tables) and
caches them into memory ONLY at the start; then it happily carries out the
incremental calculation with the new data. What we've noticed occasionally
is that one of the RDDs caches only 90% of the data. Therefore,
Hello all,
I'm using Yarn-cluster mode to run the Spark Streaming job, but I can
only get the logs once the job is complete (manual intervention). I
would like to see the logs while it is running; is this possible?
ng deprecated.
>>
>> If you're genuinely stuck with something ancient, then you need to
>> include the JAR that contains the class, and 1.9.13 does not. Why do you
>> think you need that particular version?
>>
>>
>>
>>
>> —
>> p...@
Hello Guys,
I'm running into the below error:
Exception in thread "main" java.lang.NoClassDefFoundError:
org/codehaus/jackson/annotate/JsonClass
I have created an uber jar with jackson-core-asl 1.9.13 and passed it with
the --jars configuration, but I'm still getting errors. I searched the net and
found a
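Since a jar is just a zip archive, one quick sanity check is whether the uber jar actually bundles the class named in the stack trace before debugging the classpath further. Here is a small sketch using Python's `zipfile`; the `demo-uber.jar` file and its entries are fabricated for illustration (and note the reply above suggests 1.9.13 genuinely does not ship this class):

```python
# A jar is a zip archive, so we can list its entries to confirm the
# class from the NoClassDefFoundError is really bundled.
import zipfile

def jar_contains(jar_path, class_name):
    """True if the jar bundles the given class (dotted name)."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Build a tiny fake "uber jar" so the check is runnable anywhere.
with zipfile.ZipFile("demo-uber.jar", "w") as jar:
    jar.writestr("org/codehaus/jackson/annotate/JsonClass.class", b"")

print(jar_contains("demo-uber.jar",
                   "org.codehaus.jackson.annotate.JsonClass"))  # True
print(jar_contains("demo-uber.jar",
                   "com.example.Missing"))                      # False
```

If the class is missing from the jar, the fix is to bundle a Jackson version that still contains it (or migrate off the deprecated package), not to adjust --jars.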