[jira] [Commented] (SPARK-20210) Scala tests aborted in Spark SQL on ppc64le

2017-04-05 Thread Kazuaki Ishizaki (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-20210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15956436#comment-15956436 ]

Kazuaki Ishizaki commented on SPARK-20210:
--

I ran the following two tests (DatasetCacheSuite and CachedTableSuite) on 
Ubuntu 16.04 with Java 1.8.0_111, but I did not see this exception.

{noformat}
% build/sbt "sql/test-only *sql.DatasetCacheSuite"
...
[info] ScalaTest
[info] Run completed in 10 seconds, 806 milliseconds.
[info] Total number of tests run: 4
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 4, Failed 0, Errors 0, Passed 4
[success] Total time: 637 s, completed Apr 5, 2017 4:00:31 PM
% build/sbt "sql/test-only *sql.CachedTableSuite"
...
[info] ScalaTest
[info] Run completed in 24 seconds, 214 milliseconds.
[info] Total number of tests run: 30
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 30, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 30, Failed 0, Errors 0, Passed 30
[success] Total time: 44 s, completed Apr 5, 2017 4:10:27 PM
{noformat}

{noformat}
$ cat /etc/os-release 
NAME="Ubuntu"
VERSION="16.04.1 LTS (Xenial Xerus)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 16.04.1 LTS"
VERSION_ID="16.04"
HOME_URL="http://www.ubuntu.com/"
SUPPORT_URL="http://help.ubuntu.com/"
BUG_REPORT_URL="http://bugs.launchpad.net/ubuntu/"
VERSION_CODENAME=xenial
UBUNTU_CODENAME=xenial
$ java -version
openjdk version "1.8.0_111"
OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-2ubuntu0.16.04.2-b14)
OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
{noformat}

> Scala tests aborted in Spark SQL on ppc64le
> ---
>
> Key: SPARK-20210
> URL: https://issues.apache.org/jira/browse/SPARK-20210
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 2.2.0
> Environment: Ubuntu 14.04 ppc64le 
> $ java -version
> openjdk version "1.8.0_111"
> OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-3~14.04.1-b14)
> OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
>Reporter: Sonia Garudi
>Priority: Minor
>  Labels: ppc64le
>
> The tests get aborted with the following error :
> {code}
> *** RUN ABORTED ***
>   org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 
> seconds]. This timeout is controlled by spark.rpc.askTimeout
>   at 
> org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
>   at 
> org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
>   at 
> org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
>   at 
> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
>   at 
> org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:125)
>   at 
> org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:1792)
>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:216)
>   at 
> org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at 
> org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   ...
>   Cause: java.util.concurrent.TimeoutException: Futures timed out after 
> [120 seconds]
>   at 
> scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>   at 
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>   at 
> org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)
>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
>   at 
> org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:125)
>   at 
> org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:1792)
>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:216)
>   at 
> org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at 
> org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>   ...
> {code}
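The aborted runs fail on the RPC ask timeout (`spark.rpc.askTimeout`, 120 seconds by default, as the exception message states). As a diagnostic sketch only, not a confirmed fix, one could rerun the aborting suite with a larger timeout to see whether the ppc64le run is merely slow rather than hung. Note that SparkConf reads `spark.*` system properties by default, but sbt forks test JVMs, so the property may need to go through the forked JVM's options depending on the build setup:

```shell
# Unverified sketch: rerun the aborting suite with a 600s RPC ask timeout.
# If sbt's forked test JVM does not inherit -D flags in this setup, the same
# property would need to be passed via the test JVM's javaOptions instead.
build/sbt -Dspark.rpc.askTimeout=600s "sql/test-only *sql.CachedTableSuite"
```

If the suite passes with the longer timeout, the issue is likely a performance or scheduling difference on ppc64le rather than a functional bug in the cache manager.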



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org

[jira] [Commented] (SPARK-20210) Scala tests aborted in Spark SQL on ppc64le

2017-04-04 Thread Sonia Garudi (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-20210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15955032#comment-15955032 ]

Sonia Garudi commented on SPARK-20210:
--

[~srowen], it's not a flaky test, although I am not sure whether it's due to 
the PPC architecture. I have tested it on both ppc64le and x86; the tests ran 
and passed on the x86 platform.




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org