[ 
https://issues.apache.org/jira/browse/PIG-4174?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14148565#comment-14148565
 ] 

liyunzhang_intel commented on PIG-4174:
---------------------------------------

Found some errors when I ran the e2e tests:
1. Export the OLD_PIG_HOME, HADOOP_CONF_DIR, and HADOOP_BIN variables.
2. Generate data:
ant -Dharness.old.pig=$OLD_PIG_HOME -Dharness.cluster.conf=$HADOOP_CONF_DIR 
-Dharness.cluster.bin=$HADOOP_BIN test-e2e-deploy-local     (succeeds)
3. Copy data to HDFS to use with Spark:
ant -Dharness.old.pig=$OLD_PIG_HOME -Dharness.cluster.conf=$HADOOP_CONF_DIR 
-Dharness.cluster.bin=$HADOOP_BIN -Dtests.to.run="-t Checkin_1" test-e2e-spark
 

Although the above command reports success, I saw the following error 
message in the log (attachment copy.data.to.hdfs.log):
*[exec] ERROR: driver->run() returned the following error message 
[./test_harness.pl at 156: Cannot create HDFS directory 
/user/pig/out/root-1411634072-nightly.conf/: 256 - Illegal seek*
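
A minimal manual check (just a sketch, not part of the harness, assuming the same 
$HADOOP_BIN and output path that appear in the log above): the 256 is most likely 
Perl's raw $? from system(), i.e. an exit code of 1 (256 >> 8), and "Illegal seek" 
is probably a stale $! string, so running the mkdir by hand as the same user should 
show whether hadoop itself fails outside the harness:

# Hypothetical manual reproduction of the failing mkdir; the path is copied
# from the error message above, using standard hadoop fs commands.
$HADOOP_BIN fs -mkdir /user/pig/out/root-1411634072-nightly.conf/
echo $?                            # a non-zero exit here points at HDFS itself
$HADOOP_BIN fs -ls /user/pig/out/  # check ownership/permissions of the parent dir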



> e2e tests for Spark
> -------------------
>
>                 Key: PIG-4174
>                 URL: https://issues.apache.org/jira/browse/PIG-4174
>             Project: Pig
>          Issue Type: Sub-task
>          Components: spark
>            Reporter: Praveen Rachabattuni
>            Assignee: Praveen Rachabattuni
>         Attachments: PIG-4174-1.patch, copy.data.to.hdfs.log
>
>
> Set up e2e tests for Pig on Spark, like those for Pig on MapReduce and Pig on Tez.
> Steps to set up the e2e tests:
> 1. Initialize Variables
> export OLD_PIG_HOME=/usr/local/Cellar/pig/0.12.0 # Should rather be 14
> export HADOOP_CONF_DIR=/usr/local/Cellar/hadoop/1.0.4/conf
> export HADOOP_BIN=/usr/local/Cellar/hadoop/1.0.4/bin/hadoop
> 2. Generate Data
> ant -Dharness.old.pig=$OLD_PIG_HOME -Dharness.cluster.conf=$HADOOP_CONF_DIR 
> -Dharness.cluster.bin=$HADOOP_BIN test-e2e-deploy-local
> (You might want to install the necessary CPAN modules in case of any dependency 
> errors: 
> https://cwiki.apache.org/confluence/display/PIG/HowToTest#HowToTest-End-to-endTesting)
> Copy data to HDFS to use with Spark
> hadoop fs -put test/e2e/pig/testdist/data ./
> 3. Run a particular test
> ant -Dharness.old.pig=$OLD_PIG_HOME -Dharness.cluster.conf=$HADOOP_CONF_DIR 
> -Dharness.cluster.bin=$HADOOP_BIN -Dtests.to.run="-t Checkin_1" test-e2e-spark
> 4. Run all tests
> ant -Dharness.old.pig=$OLD_PIG_HOME -Dharness.cluster.conf=$HADOOP_CONF_DIR 
> -Dharness.cluster.bin=$HADOOP_BIN test-e2e-spark
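
As a possible variation (an assumption based on the standard Pig e2e harness, where 
the -t value matches test names by prefix, so a group name selects every test in that 
group), a whole test group such as Checkin could be run with:

# Assumes -t prefix matching in the e2e harness; adjust the group name as needed.
ant -Dharness.old.pig=$OLD_PIG_HOME -Dharness.cluster.conf=$HADOOP_CONF_DIR \
    -Dharness.cluster.bin=$HADOOP_BIN -Dtests.to.run="-t Checkin" test-e2e-spark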



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
