I have only written Akka code in Scala. Here is the Akka documentation
that should help you get started:
http://doc.akka.io/docs/akka/2.4.0/intro/getting-started.html

>JavaSparkContext(conf)
The idea is to create a SparkContext and pass it in via Props (the Akka
analogue of constructor arguments, in the Java sense) to an Akka actor, so
that you can send interactive Spark jobs to the actor system. The only
caveat is that you'll have to run this Spark application in client mode.
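
A minimal sketch of that idea (the names SparkJobActor, RunJob, and the
actor-system bootstrap are illustrative, not from any real project):

```scala
import akka.actor.{Actor, ActorSystem, Props}
import org.apache.spark.SparkContext

// Message the actor understands
case class RunJob(data: Seq[String])

// The SparkContext is handed to the actor through Props,
// much like a constructor argument
class SparkJobActor(sc: SparkContext) extends Actor {
  def receive = {
    case RunJob(data) =>
      // Submit an interactive job against the long-lived context
      val count = sc.parallelize(data).count()
      sender() ! count
  }
}

object Bootstrap {
  def start(sc: SparkContext): Unit = {
    val system = ActorSystem("spark-jobs")
    // Pass the context as a Props (constructor) argument
    val actor = system.actorOf(Props(classOf[SparkJobActor], sc), "spark-job-actor")
    actor ! RunJob(Seq("a", "b", "c"))
  }
}
```

Because the SparkContext lives inside a long-running actor system, each
incoming message becomes an interactive job against the same context.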

>sc.parallelize(list).foreach
>// here we will have db transaction as well.
The way I have done DB transactions is to run a synchronous (awaitable, in
the Akka sense) call that performs the DB operation atomically with respect
to the data being processed, using Slick
(http://slick.typesafe.com/doc/3.1.0/gettingstarted.html).
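
Roughly like this with Slick 3.1 (the table and column names here are made
up for illustration; swap in your own schema and driver):

```scala
import scala.concurrent.Await
import scala.concurrent.duration._
import slick.driver.H2Driver.api._

// Hypothetical table definition
class Records(tag: Tag) extends Table[(Int, String)](tag, "RECORDS") {
  def id = column[Int]("ID", O.PrimaryKey)
  def value = column[String]("VALUE")
  def * = (id, value)
}

object Dao {
  val records = TableQuery[Records]

  // Blocking (awaitable) insert, so the write completes atomically
  // with respect to the record being processed
  def insertSync(db: Database, id: Int, value: String): Unit =
    Await.result(db.run(records += (id, value)), 30.seconds)
}
```

The Await.result makes the call synchronous from the caller's point of
view, which is what keeps the DB write tied to the element being processed.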
In your case the following two links could shed some light...
- http://stackoverflow.com/questions/24896233/how-to-save-apache-spark-schema-output-in-mysql-database
- https://databricks.gitbooks.io/databricks-spark-reference-applications/content/logs_analyzer/chapter3/save_an_rdd_to_a_database.html

On a side note, I noticed that you provide a custom serializer. In my case,
I have used case classes (a Scala construct) that work with the default
serializer provided by Spark.
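
For example (Event is a made-up class; any case class works the same way):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Case classes are Serializable out of the box, so Spark's default
// serializer can ship them to executors with no custom configuration
case class Event(id: Long, payload: String)

object CaseClassExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("events").setMaster("local[*]"))
    val counts = sc.parallelize(Seq(Event(1, "a"), Event(2, "b")))
      .map(e => (e.payload, 1))
      .reduceByKey(_ + _)
      .collect()
    sc.stop()
  }
}
```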

Hope this helps.

Thanks,
Muthu


On Sat, Nov 14, 2015 at 10:18 PM, Netai Biswas <mail2efo...@gmail.com>
wrote:

> Hi,
>
> Thanks for your response. I will give a try with akka also, if you have
> any sample code or useful link please do share with me. Anyway I am sharing
> one sample code of mine.
>
> Sample Code:
>
> @Autowired
> private SpringBean springBean;
>
> public void test() throws Exception {
>     SparkConf conf = new SparkConf().setAppName("APP").setMaster(masterURL);
>     conf.set("spark.serializer",
>             "de.paraplu.springspark.serialization.SpringAwareSerializer");
>     sc = new JavaSparkContext(conf);
>
>     sc.parallelize(list).foreach(new VoidFunction<String>() {
>         private static final long serialVersionUID = 1L;
>
>         @Override
>         public void call(String t) throws Exception {
>             springBean.someAPI(t); // here we will have db transaction as well.
>         }
>     });
> }
>
> Thanks,
> Netai
>
> On Sat, Nov 14, 2015 at 10:40 PM, Muthu Jayakumar <bablo...@gmail.com>
> wrote:
>
>> You could try to use an Akka actor system with Apache Spark, if you are
>> intending to use it in an online / interactive job execution scenario.
>>
>> On Sat, Nov 14, 2015, 08:19 Sabarish Sasidharan <
>> sabarish.sasidha...@manthan.com> wrote:
>>
>>> You are probably trying to access the Spring context from the executors
>>> after initializing it at the driver, and running into serialization
>>> issues as a result.
>>>
>>> You could instead use mapPartitions() and initialize the Spring context
>>> from within that.
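>>>
>>> A rough sketch of that approach (AppConfig, MyBean, and rdd are
>>> placeholders for your own config class, bean, and RDD):
>>>
>>> ```scala
>>> import org.springframework.context.annotation.AnnotationConfigApplicationContext
>>>
>>> rdd.mapPartitions { iter =>
>>>   // Build the Spring context on the executor, once per partition,
>>>   // so it is never serialized and shipped from the driver
>>>   val ctx = new AnnotationConfigApplicationContext(classOf[AppConfig])
>>>   val bean = ctx.getBean(classOf[MyBean])
>>>   iter.map(t => bean.someAPI(t))
>>> }
>>> ```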
>>>
>>> That said I don't think that will solve all of your issues because you
>>> won't be able to use the other rich transformations in Spark.
>>>
>>> I am afraid these two don't gel that well, unless all your context
>>> lookups for beans happen in the driver.
>>>
>>> Regards
>>> Sab
>>> On 13-Nov-2015 4:17 pm, "Netai Biswas" <mail2efo...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am facing an issue while integrating Spark with Spring.
>>>>
>>>> I am getting "java.lang.IllegalStateException: Cannot deserialize
>>>> BeanFactory with id" errors for all beans. I have tried a few solutions
>>>> available on the web. Please help me out with this issue.
>>>>
>>>> Few details:
>>>>
>>>> Java : 8
>>>> Spark : 1.5.1
>>>> Spring : 3.2.9.RELEASE
>>>>
>>>> Please let me know if you need more information or any sample code.
>>>>
>>>> Thanks,
>>>> Netai
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context:
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-and-Spring-Integrations-tp25375.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>
>>>>
>
