1u0 commented on a change in pull request #8495: [FLINK-12556][e2e] Extend some
end-to-end tests to run with custom (input) File System implementation
URL: https://github.com/apache/flink/pull/8495#discussion_r286513937
##########
File path: flink-end-to-end-tests/test-scripts/test_yarn_kerberos_docker.sh
##########
@@ -92,34 +88,28 @@ do
sleep 2
done
-CLUSTER_STARTED=1
-for (( i = 0; i < $CLUSTER_SETUP_RETRIES; i++ ))
-do
- if start_hadoop_cluster; then
- echo "Cluster started successfully."
- CLUSTER_STARTED=0
- break #continue test, cluster set up succeeded
- fi
-
- echo "ERROR: Could not start hadoop cluster. Retrying..."
- docker-compose -f $END_TO_END_DIR/test-scripts/docker-hadoop-secure-cluster/docker-compose.yml down
-done
-
-if [[ ${CLUSTER_STARTED} -ne 0 ]]; then
+if ! retry_times $CLUSTER_SETUP_RETRIES 0 start_hadoop_cluster; then
echo "ERROR: Could not start hadoop cluster. Aborting..."
exit 1
fi
+mkdir -p $FLINK_TARBALL_DIR
+tar czf $FLINK_TARBALL_DIR/$FLINK_TARBALL -C $(dirname $FLINK_DIR) .
Review comment:
They are moved here to run later, so if the test fails early (for example during cluster spin-up), it won't create an unused tarball.
This change was part of the test refactoring (you can verify it by reading the commits one by one).
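
For context on the helper used in the new line above: retry_times is called here as retry_times <retries> <backoff> <command>. A minimal, hypothetical sketch of such a helper is shown below (the actual implementation lives in the shared test-scripts utilities and may differ in detail):

# Hypothetical sketch, not the actual Flink helper: run a command up to
# <retries> times, sleeping <backoff> seconds between failed attempts.
function retry_times() {
    local retries=$1
    local backoff=$2
    shift 2

    for (( i = 0; i < retries; i++ )); do
        if "$@"; then
            return 0  # command succeeded, stop retrying
        fi
        echo "Attempt $((i + 1))/${retries} of '$*' failed. Retrying..."
        sleep "${backoff}"
    done

    return 1  # all attempts failed
}

With a helper of this shape, the 0 passed in the diff would mean no pause between retries, and the non-zero exit status after the last attempt is what triggers the "Aborting" branch.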