[
https://issues.apache.org/jira/browse/PIG-4489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14499065#comment-14499065
]
liyunzhang_intel commented on PIG-4489:
---------------------------------------
[~mohitsabharwal]:
The patch looks good to me, but I don't understand the issues you
mentioned. Before your patch, we used the following command to run one unit test:
ant -Dhadoopversion=23 -Dtestcase=TestLoadStoreFunctionLifeCycle -Dexectype=spark test
After applying PIG-4489_1.patch, we can still use the above command to run one
unit test. To run a unit test in local or non-local mode,
please export SPARK_MASTER="local" or export SPARK_MASTER="spark://xxx:7077".
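For example, a minimal sketch of both modes for a single test case (spark://xxx:7077 is a placeholder for your own cluster URL):

# run a single test case in spark local mode
export SPARK_MASTER="local"
ant -Dhadoopversion=23 -Dtestcase=TestLoadStoreFunctionLifeCycle -Dexectype=spark test

# run the same test case against a Spark cluster
export SPARK_MASTER="spark://xxx:7077"
ant -Dhadoopversion=23 -Dtestcase=TestLoadStoreFunctionLifeCycle -Dexectype=spark test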
The patch seems to make the following changes:
Before: we ran all unit tests in spark mode with ant test-spark
-Dhadoopversion=23 (to run in local or non-local mode, please
export SPARK_MASTER="local" or export SPARK_MASTER="spark://xxx:7077"; if nothing is
exported, the default value is "local").
Now: we can run all unit tests in spark mode in either local or non-local mode (see the sketch after this list):
local: ant test-spark-local
non-local: ant test-spark (please export SPARK_MASTER="spark://xxx:7077",
otherwise it will still run in local mode)
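To summarize, a rough sketch of the full test run before and after the patch (assuming -Dhadoopversion=23 is still needed for the new targets, as it was before; spark://xxx:7077 is a placeholder):

# before this patch: one target for all unit tests, mode selected via SPARK_MASTER
export SPARK_MASTER="local"     # or "spark://xxx:7077"; defaults to "local" when unset
ant test-spark -Dhadoopversion=23

# after this patch: separate targets per mode
ant test-spark-local -Dhadoopversion=23     # local mode
export SPARK_MASTER="spark://xxx:7077"
ant test-spark -Dhadoopversion=23           # non-local mode; still runs locally if SPARK_MASTER is not exported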
Is my understanding of your patch correct?
> Enable local mode tests for Spark engine
> ----------------------------------------
>
> Key: PIG-4489
> URL: https://issues.apache.org/jira/browse/PIG-4489
> Project: Pig
> Issue Type: Sub-task
> Components: spark
> Reporter: Mohit Sabharwal
> Assignee: Mohit Sabharwal
> Fix For: spark-branch
>
> Attachments: PIG-4489.1.patch, PIG-4489.patch
>
>
> Util.getLocalTestMode() currently only returns "tez_local" or "local".
> I see that ~212 testcases do this check, and we are not running these tests
> against Spark at this point.
> Currently all Spark tests run in local mode ("local" as the Spark cluster
> URL passed to JavaSparkContext), so we should enable these tests as well.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)