Re: Using a non-serializable third-party JSON serializer on a Spark worker node throws NotSerializableException

2016-03-01 Thread Shixiong(Ryan) Zhu
Could you show the full companion object? It looks weird to have an
`override` in a companion object of a case class.
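
For reference, `override` normally only compiles in a companion object when
the object extends a trait or class that already declares that member. A
minimal sketch of the shape I would expect, with all names hypothetical:

    trait MessageSender {
      def sendMessage(pageView: PageView): Unit
    }

    case class PageView(id: String)

    object PageView extends MessageSender {
      // `override` is legal here only because MessageSender declares sendMessage
      override def sendMessage(pageView: PageView): Unit =
        println(s"sending ${pageView.id}")
    }

If an instance of such a trait ends up captured by a Spark closure,
everything it references has to be serializable as well.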



Re: Using a non-serializable third-party JSON serializer on a Spark worker node throws NotSerializableException

2016-03-01 Thread Yuval Itzchakov
As I said, it is the method which eventually serializes the object. It is
declared inside a companion object of a case class.

The problem is that Spark will still try to serialize the method, as it
needs to execute on the worker. How will that change the fact that
`EncodeJson[T]` is not serializable?
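
To make the question concrete: it is not the method's bytecode that gets
shipped, but the task closure plus everything it captures, and referencing
the encoder (directly or through a field of the enclosing instance) is what
drags the non-serializable `EncodeJson` into it. A rough sketch of that
capture, with hypothetical names:

    import argonaut._, Argonaut._
    import org.apache.spark.rdd.RDD

    case class PageView(id: String)

    class SessionHandler(encoder: EncodeJson[PageView]) {
      def handle(rdd: RDD[PageView]): Unit =
        rdd.foreach { pv =>
          // Referencing `encoder` captures the enclosing SessionHandler, and
          // with it the non-serializable EncodeJson, into the task closure.
          println(encoder.encode(pv).nospaces)
        }
    }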



Re: Using a non-serializable third-party JSON serializer on a Spark worker node throws NotSerializableException

2016-03-01 Thread Shixiong(Ryan) Zhu
I don't know where "argonaut.EncodeJson$$anon$2" comes from. However, you can
always put your code into a method of an "object" and then call it like a
Java static method.
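
A minimal sketch of that approach, with all names hypothetical: an object's
members are initialized by the classloader on each executor, so the encoder
is created locally on the worker instead of being shipped inside the closure.

    import argonaut._, Argonaut._
    import org.apache.spark.rdd.RDD

    case class PageView(id: String)

    object JsonSender {
      // Created on each JVM the first time JsonSender is touched there,
      // so the non-serializable encoder never travels over the wire.
      implicit val pageViewEncode: EncodeJson[PageView] =
        jencode1L((p: PageView) => p.id)("id")

      def send(pv: PageView): Unit =
        println(pv.asJson.nospaces)   // stand-in for the real HTTP call
    }

    object Example {
      def sendAll(rdd: RDD[PageView]): Unit =
        rdd.foreach(JsonSender.send)  // the closure captures no encoder instance
    }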



Using a non-serializable third-party JSON serializer on a Spark worker node throws NotSerializableException

2016-03-01 Thread Yuval.Itzchakov
I have a small snippet of code which relies on the argonaut library for JSON
serialization and is run from `PairDStreamFunctions.mapWithState` once a
session is completed.

This is the code snippet (not that important):

  // Imports assumed by this snippet (scalaj-http as the HTTP client is
  // inferred from the Http(url).postData(...) call chain):
  //   import scala.concurrent.Future
  //   import scala.concurrent.ExecutionContext.Implicits.global
  //   import scala.util.control.NonFatal
  //   import argonaut._, Argonaut._          // provides .asJson
  //   import scalaj.http.{Http, HttpOptions}

  override def sendMessage(pageView: PageView): Unit = {
    Future {
      LogHolder.logger.info(s"Sending pageview: ${pageView.id} to automation")
      try {
        // POST the JSON-encoded page view; throwError raises on a non-2xx response
        Http(url)
          .postData(pageView.asJson.toString)
          .option(HttpOptions.connTimeout(timeOutMilliseconds))
          .asString
          .throwError
      } catch {
        case NonFatal(e) =>
          LogHolder.logger.error("Failed to send pageview", e)
      }
    }
  }

argonaut relies on a user implementation of a trait called `EncodeJson[T]`,
which tells argonaut how to serialize the object (its counterpart
`DecodeJson[T]` handles deserialization).
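
For illustration, a minimal instance of that trait, with the `id` and `url`
fields here being hypothetical:

    import argonaut._, Argonaut._

    case class PageView(id: String, url: String)

    object PageView {
      // Tells argonaut how to render a PageView as a two-field JSON object.
      implicit val pageViewEncodeJson: EncodeJson[PageView] =
        jencode2L((p: PageView) => (p.id, p.url))("id", "url")
    }

With this in scope, `PageView("1", "/home").asJson` produces the JSON for
the page view.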

The problem is that instances of `EncodeJson[T]` are not serializable, which
results in a NotSerializableException:

Caused by: java.io.NotSerializableException: argonaut.EncodeJson$$anon$2
Serialization stack:
- object not serializable (class: argonaut.EncodeJson$$anon$2,
value: argonaut.EncodeJson$$anon$2@6415f61e)

This is obvious and understandable.

The question I have is: what possible ways are there to work around this?
I currently depend on a third-party library which I can't control or change
to implement Serializable in any way. I've seen this StackOverflow answer
<http://stackoverflow.com/questions/22592811/task-not-serializable-java-io-notserializableexception-when-calling-function-ou>
but couldn't implement any reasonable workaround.

Anyone have any ideas?
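
For reference, one commonly suggested shape for non-serializable dependencies
in Spark — recreating the instance on each executor with a `@transient lazy
val` rather than serializing it — sketched here with hypothetical names:

    import argonaut._, Argonaut._

    case class PageView(id: String)

    class PageViewSender(url: String) extends Serializable {
      // Excluded from serialization and rebuilt lazily on each worker
      // the first time it is used there.
      @transient lazy val encoder: EncodeJson[PageView] =
        jencode1L((p: PageView) => p.id)("id")

      def send(pv: PageView): Unit =
        println(s"POST $url: ${encoder.encode(pv).nospaces}")  // stand-in for the HTTP POST
    }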





