unsubscribe

2022-03-18 Thread Basavaraj
unsubscribe

smime.p7s
Description: S/MIME cryptographic signature


unsubscribe

2022-03-11 Thread Basavaraj





unsubscribe

2022-03-02 Thread Basavaraj
unsubscribe



unsubscribe

2022-02-11 Thread Basavaraj
unsubscribe



unsubscribe

2020-05-15 Thread Basavaraj





Request for a working example of using Pregel API in GraphX using Spark Scala

2019-05-05 Thread Basavaraj
Hello All

I am a beginner in Spark, trying to use GraphX for iterative processing by 
connecting it to a Kafka stream.

I am looking for a Git reference to a real application example, in Scala.

Please reply with any reference to one, or, if someone is trying to build 
such an example, I could join them.

Regards
Basavaraj K N
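A minimal sketch of what such an example might start from: the three functions the GraphX Pregel operator expects, using single-source shortest paths as the classic illustration. The functions are written as pure Scala so they stand alone; the Spark wiring at the bottom is commented out, and the name `graph` there is assumed.

```scala
// Vertex program: on receiving a message, keep the shorter distance.
def vprog(id: Long, dist: Double, newDist: Double): Double =
  math.min(dist, newDist)

// Merge: when several messages target one vertex, keep the minimum.
def mergeMsg(a: Double, b: Double): Double = math.min(a, b)

// Send decision for one edge: propagate srcDist + weight only if it
// improves on the destination's current distance. This is the pure core
// of what sendMsg does with an EdgeTriplet.
def shouldSend(srcDist: Double, weight: Double, dstDist: Double): Option[Double] = {
  val candidate = srcDist + weight
  if (candidate < dstDist) Some(candidate) else None
}

// With spark-graphx on the classpath, the wiring would look like (sketch):
// val result = graph.pregel(Double.PositiveInfinity)(
//   vprog,
//   t => shouldSend(t.srcAttr, t.attr, t.dstAttr)
//          .map(d => Iterator((t.dstId, d))).getOrElse(Iterator.empty),
//   mergeMsg)
```

The iteration continues until no vertex emits a message, which is why the send decision must return nothing once a vertex's distance stops improving.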






Re: Checking if cascading graph computation is possible in Spark

2019-04-05 Thread Basavaraj
I have looked at broadcasting accumulated values, but not Arbitrary Stateful
Streaming.

However, I am not sure how that helps here.
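The suggestion in the quoted reply below (arbitrary stateful streaming) centres on a per-key state update function. A sketch of that core, with the event and state shapes purely illustrative; the `mapGroupsWithState` wiring is commented out since it needs Spark on the classpath, and the threshold-based anomaly rule is an assumed stand-in for real logic.

```scala
// Hypothetical per-node state and incoming event shapes.
case class NodeState(value: Double, anomalous: Boolean)
case class Event(nodeId: Long, value: Double)

// Pure state transition: fold the latest event (or keep prior state) and
// flag an anomaly when the value crosses a threshold.
def updateState(prev: Option[NodeState], events: Seq[Event], threshold: Double): NodeState = {
  val latest = events.lastOption.map(_.value)
    .orElse(prev.map(_.value))
    .getOrElse(0.0)
  NodeState(latest, latest > threshold)
}

// With spark-sql on the classpath, the streaming wiring would be (sketch):
// events.groupByKey(_.nodeId)
//   .mapGroupsWithState(GroupStateTimeout.NoTimeout) {
//     (id, evs, state: GroupState[NodeState]) =>
//       val s = updateState(state.getOption, evs.toSeq, threshold = 10.0)
//       state.update(s)
//       s
//   }
```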

On Fri, 5 Apr 2019, 10:13 pm Jason Nerothin, 
wrote:

> Have you looked at Arbitrary Stateful Streaming and Broadcast Accumulators?
>
> On Fri, Apr 5, 2019 at 10:55 AM Basavaraj  wrote:
>
>> Hi
>>
>> Have two questions
>>
>> #1
>> I am trying to process events in realtime, outcome of the processing has
>> to find a node in the GraphX and update that node as well (in case if any
>> anomaly or state change), If a node is updated, I have to update the
>> related nodes as well, want to know if GraphX can help in this by providing
>> some native support
>>
>> #2
>> I want to do the above as a event driven way, without using the batches
>> (i tried micro batches, but I realised that’s not what I want), i.e., for
>> each arriving event or as soon as a event message come my stream, not by
>> accumulating the event
>>
>> I humbly welcome any pointers, constructive criticism
>>
>> Regards
>> Basav
>> - To
>> unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
>
> --
> Thanks,
> Jason
>


Checking if cascading graph computation is possible in Spark

2019-04-05 Thread Basavaraj
Hi

I have two questions.

#1
I am trying to process events in real time. The outcome of the processing has
to find a node in the GraphX graph and update it (in case of any anomaly or
state change). If a node is updated, I also have to update the related nodes,
and I want to know whether GraphX provides any native support for this.

#2
I want to do the above in an event-driven way, without using batches (I tried
micro-batches, but realised that is not what I want), i.e., processing each
event as soon as it arrives on my stream, rather than accumulating events.

I humbly welcome any pointers and constructive criticism.

Regards
Basav
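The cascading part of #1 can be sketched without Spark at all: given an adjacency map, propagate an update from one node to every reachable related node (a plain breadth-first traversal). In GraphX the same effect is usually achieved iteratively with Pregel or aggregateMessages rather than per-event; the names below are illustrative only.

```scala
// Cascade an update: starting from `start`, collect every node reachable
// through the (hypothetical) adjacency map, i.e. all nodes that would
// need updating when `start` changes.
def cascade(adj: Map[Long, Seq[Long]], start: Long): Set[Long] = {
  @scala.annotation.tailrec
  def loop(frontier: List[Long], seen: Set[Long]): Set[Long] = frontier match {
    case Nil => seen
    case h :: t =>
      // Only enqueue neighbours we have not visited, to terminate on cycles.
      val next = adj.getOrElse(h, Nil).filterNot(seen)
      loop(t ++ next, seen ++ next)
  }
  loop(List(start), Set(start))
}
```

For true per-event (non-batch) processing, this kind of traversal would have to run against state kept outside a micro-batch, e.g. the per-key state of a stateful stream, rather than inside a GraphX job per event.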



Re: Job submission API

2015-04-07 Thread Veena Basavaraj
The following might be helpful.

http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/What-dependencies-to-submit-Spark-jobs-programmatically-not-via/td-p/24721

http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/
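The quoted reply below describes populating spark-submit options into a SparkConf and passing it to a SparkContext. A sketch of that idea: a small (hypothetical) helper mapping CLI flags to their SparkConf keys, with the Spark construction itself left as a comment since it needs Spark on the classpath.

```scala
// Illustrative flag-to-configuration-key table; these three mappings
// correspond to standard Spark configuration property names.
val flagToConfKey = Map(
  "--master"          -> "spark.master",
  "--name"            -> "spark.app.name",
  "--executor-memory" -> "spark.executor.memory"
)

// Convert parsed (flag, value) pairs into SparkConf key/value pairs,
// silently dropping flags we do not recognise.
def toConfPairs(args: Seq[(String, String)]): Map[String, String] =
  args.flatMap { case (flag, v) => flagToConfKey.get(flag).map(_ -> v) }.toMap

// With spark-core on the classpath, programmatic submission would be (sketch):
// val conf = new SparkConf().setAll(toConfPairs(parsedArgs))
// val sc = new SparkContext(conf)  // jobs run on the cluster named in spark.master
```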

On 7 April 2015 at 16:32, michal.klo...@gmail.com wrote:

 A SparkContext can submit jobs remotely.

 The spark-submit options in general can be populated into a SparkConf and
 passed in when you create a SparkContext.

 We personally have not had too much success with yarn-client remote
 submission, but standalone cluster mode was easy to get going.

 M



 On Apr 7, 2015, at 7:01 PM, Prashant Kommireddi prash1...@gmail.com
 wrote:

 Hello folks,

 Newbie here! Just had a quick question - is there a job submission API
 such as the one in Hadoop

 https://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/mapreduce/Job.html#submit()
 to submit Spark jobs to a Yarn cluster? I see in example that
 bin/spark-submit is what's out there, but couldn't find any APIs around it.

 Thanks,
 Prashant




-- 
Regards
vybs