[ 
https://issues.apache.org/jira/browse/TINKERPOP-1025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15047443#comment-15047443
 ] 

Marko A. Rodriguez commented on TINKERPOP-1025:
-----------------------------------------------

If you call {{Spark.close()}} right before you use {{BulkLoaderVertexProgram}}, 
all is good and you can do as many bulk loads as you want. I can't seem to 
figure out what the pattern is... what in the persisted {{SparkContext}} is making 
{{BulkLoaderVertexProgram}} unhappy. Even if I do {{Spark.create("local[4]")}} 
right before running the {{BulkLoaderVertexProgramTest}} (and nothing else), it 
fails. I don't understand. And again, it only fails for {{InputRDD}} reads.
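
The workaround described above -- closing the persisted context right before the bulk load -- would look roughly like this in the Gremlin console (a hedged sketch against the 3.1.x APIs; the two {{.properties}} file names are placeholders, not files from this issue):

{code}
gremlin> Spark.close()  // release the persisted SparkContext before bulk loading
gremlin> readGraph = GraphFactory.open('hadoop-gremlin.properties')
gremlin> blvp = BulkLoaderVertexProgram.build().
                    writeGraph('target-graph.properties').create(readGraph)
gremlin> readGraph.compute(SparkGraphComputer).program(blvp).submit().get()
{code}

With {{Spark.close()}} issued first, this sequence can be repeated for as many bulk loads as desired.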

> Solve SparkContext Persistence Issues with BulkLoaderVertexProgram
> ------------------------------------------------------------------
>
>                 Key: TINKERPOP-1025
>                 URL: https://issues.apache.org/jira/browse/TINKERPOP-1025
>             Project: TinkerPop
>          Issue Type: Bug
>          Components: hadoop
>    Affects Versions: 3.1.0-incubating
>            Reporter: Marko A. Rodriguez
>             Fix For: 3.1.1-incubating
>
>
> {{BulkLoaderVertexProgramTest}} fails when a persisted {{SparkContext}} is 
> used WITH an {{InputRDD}}. 
> Weird.
> If you use a persisted context and {{InputFormat}}. Good.
> If you use a non-persisted context and {{InputFormat}}. Good.
> If you use a non-persisted context and {{InputRDD}}. Good.
> If you use a persisted context and {{InputRDD}}. Bad.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
