[jira] [Updated] (SPARK-17674) Warnings from SparkR tests being ignored without redirecting to errors

2016-10-21 Thread Felix Cheung (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17674?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Felix Cheung updated SPARK-17674:
-
Fix Version/s: (was: 2.0.2)

> Warnings from SparkR tests being ignored without redirecting to errors
> --
>
> Key: SPARK-17674
> URL: https://issues.apache.org/jira/browse/SPARK-17674
> Project: Spark
>  Issue Type: Test
>  Components: SparkR
>Reporter: Hyukjin Kwon
>Assignee: Felix Cheung
> Fix For: 2.1.0
>
>
> For example, we are _currently_ seeing warnings such as the following:
> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65905/consoleFull
> {code}
> Warnings 
> ---
> 1. spark.mlp (@test_mllib.R#400) - is.na() applied to non-(list or vector) of 
> type 'NULL'
> 2. spark.mlp (@test_mllib.R#401) - is.na() applied to non-(list or vector) of 
> type 'NULL'
> {code}
> These should be errors, as specified in 
> https://github.com/apache/spark/blob/master/R/pkg/tests/run-all.R#L22 
> However, the tests still pass despite these warnings.
> This seems related to the behaviour of the `testthat` library. We should 
> investigate and fix it. This was also discussed in 
> https://github.com/apache/spark/pull/15232
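For context, the intended behaviour can be sketched in a few lines of R. This is a minimal illustration only, not Spark's actual run-all.R (whose line 22 is not quoted here): R can escalate every warning to an error, so a warning raised inside a test should fail the run rather than be silently reported.

```r
# Sketch (assumption: run-all.R uses a similar mechanism):
# warn = 2 makes any warning() signal an error instead.
options(warn = 2)

result <- tryCatch(
  as.integer("abc"),  # normally emits "NAs introduced by coercion" warning
  error = function(e) paste("caught as error:", conditionMessage(e))
)
print(result)
```

If testthat restores or overrides the `warn` option while running expectations, warnings could slip through despite this setting, which may be what is happening here.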



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-17674) Warnings from SparkR tests being ignored without redirecting to errors

2016-09-26 Thread Hyukjin Kwon (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17674?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-17674:
-
Description: 
For example, we are _currently_ seeing warnings such as the following:

https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65905/consoleFull

{code}
Warnings ---
1. spark.mlp (@test_mllib.R#400) - is.na() applied to non-(list or vector) of 
type 'NULL'

2. spark.mlp (@test_mllib.R#401) - is.na() applied to non-(list or vector) of 
type 'NULL'
{code}

These should be errors, as specified in 
https://github.com/apache/spark/blob/master/R/pkg/tests/run-all.R#L22 

However, the tests still pass despite these warnings.

This seems related to the behaviour of the `testthat` library. We should 
investigate and fix it. This was also discussed in 
https://github.com/apache/spark/pull/15232

  was:
For example, we are _currently_ seeing warnings such as the following:

{code}

```
Warnings ---
1. spark.mlp (@test_mllib.R#400) - is.na() applied to non-(list or vector) of 
type 'NULL'

2. spark.mlp (@test_mllib.R#401) - is.na() applied to non-(list or vector) of 
type 'NULL'
```

{code}

These should be errors, as specified in 
https://github.com/apache/spark/blob/master/R/pkg/tests/run-all.R#L22 

However, the tests still pass despite these warnings.

This seems related to the behaviour of the `testthat` library. We should 
investigate and fix it. This was also discussed in 
https://github.com/apache/spark/pull/15232


> Warnings from SparkR tests being ignored without redirecting to errors
> --
>
> Key: SPARK-17674
> URL: https://issues.apache.org/jira/browse/SPARK-17674
> Project: Spark
>  Issue Type: Test
>  Components: SparkR
>Reporter: Hyukjin Kwon
>
> For example, we are _currently_ seeing warnings such as the following:
> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65905/consoleFull
> {code}
> Warnings 
> ---
> 1. spark.mlp (@test_mllib.R#400) - is.na() applied to non-(list or vector) of 
> type 'NULL'
> 2. spark.mlp (@test_mllib.R#401) - is.na() applied to non-(list or vector) of 
> type 'NULL'
> {code}
> These should be errors, as specified in 
> https://github.com/apache/spark/blob/master/R/pkg/tests/run-all.R#L22 
> However, the tests still pass despite these warnings.
> This seems related to the behaviour of the `testthat` library. We should 
> investigate and fix it. This was also discussed in 
> https://github.com/apache/spark/pull/15232


