[ 
https://issues.apache.org/jira/browse/PIG-5283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16118075#comment-16118075
 ] 

liyunzhang_intel commented on PIG-5283:
---------------------------------------

[~szita]:  
{quote}
My only question is whether we should only write those properties that are 
required for a PigSplit, instead of writing the full jobConf (600-700 entries), 
as an optimization.

{quote}

We don't need to initialize all the items; it is OK to initialize just the few 
items needed to make it work. Will PigInputFormatSpark#createRecordReader 
initialize all the items after bypassing the current issue?
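
To make this concrete, below is a minimal sketch (not actual Pig code; the helper name and the key whitelist are illustrative assumptions) of serializing only the jobConf entries a split actually needs, instead of the full 600-700 entries:

{code:java}
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.Text;

// Hypothetical helper: serialize only a whitelist of jobConf entries
// instead of the whole configuration.
public final class RequiredConfSerializer {

    // Illustrative keys only -- the real list would have to be derived from
    // what PigSplit and PigInputFormatSpark#createRecordReader actually read.
    private static final String[] REQUIRED_KEYS = {
        "pig.inputs",
        "mapreduce.input.fileinputformat.inputdir",
    };

    public static void write(DataOutput out, Configuration conf) throws IOException {
        // Count and write only the whitelisted entries that are present.
        int present = 0;
        for (String key : REQUIRED_KEYS) {
            if (conf.get(key) != null) {
                present++;
            }
        }
        out.writeInt(present);
        for (String key : REQUIRED_KEYS) {
            String value = conf.get(key);
            if (value != null) {
                Text.writeString(out, key);
                Text.writeString(out, value);
            }
        }
    }

    public static Configuration read(DataInput in) throws IOException {
        // Start from an empty conf; only the whitelisted entries are restored.
        Configuration conf = new Configuration(false);
        int present = in.readInt();
        for (int i = 0; i < present; i++) {
            conf.set(Text.readString(in), Text.readString(in));
        }
        return conf;
    }
}
{code}

A split's write/readFields could delegate to such a helper, but only if the whitelist really covers every property the split and its record reader need on the backend.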

> Configuration is not passed to SparkPigSplits on the backend
> ------------------------------------------------------------
>
>                 Key: PIG-5283
>                 URL: https://issues.apache.org/jira/browse/PIG-5283
>             Project: Pig
>          Issue Type: Bug
>          Components: spark
>            Reporter: Adam Szita
>            Assignee: Adam Szita
>         Attachments: PIG-5283.0.patch
>
>
> When a Hadoop ObjectWritable is created during a Spark job, the instantiated 
> PigSplit (wrapped into a SparkPigSplit) is given an empty Configuration 
> instance.
> This happens 
> [here|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SerializableWritable.scala#L44]
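
For illustration, the following self-contained sketch (FakeSplit is a hypothetical stand-in for PigSplit, not Pig or Spark code) shows how a Configurable Writable deserialized through ObjectWritable ends up holding whatever Configuration the ObjectWritable was given -- which is an empty one when SerializableWritable calls setConf(new Configuration(false)):

{code:java}
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

import org.apache.hadoop.conf.Configurable;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.ObjectWritable;
import org.apache.hadoop.io.Writable;

public class EmptyConfDemo {

    // Hypothetical stand-in for PigSplit: a Configurable Writable.
    public static class FakeSplit implements Writable, Configurable {
        private Configuration conf;
        @Override public void setConf(Configuration conf) { this.conf = conf; }
        @Override public Configuration getConf() { return conf; }
        @Override public void write(DataOutput out) throws IOException { }
        @Override public void readFields(DataInput in) throws IOException { }
    }

    public static void main(String[] args) throws IOException {
        // Serialize the split wrapped in an ObjectWritable.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        new ObjectWritable(new FakeSplit()).write(new DataOutputStream(bytes));

        // Deserialize with an empty Configuration, mirroring what
        // SerializableWritable does on the Spark side.
        ObjectWritable ow = new ObjectWritable();
        ow.setConf(new Configuration(false));
        ow.readFields(new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray())));

        FakeSplit restored = (FakeSplit) ow.get();
        // Prints 0: the restored split only ever sees the empty Configuration.
        System.out.println(restored.getConf().size());
    }
}
{code}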



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
