Does scoverage work with the Spark build in 2.11? That sounds like a big
win.

On Sun, Jul 26, 2015 at 1:29 PM, Josh Rosen <rosenvi...@gmail.com> wrote:

> Given that 2.11 may be more stringent with respect to warnings, we might
> consider building with 2.11 instead of 2.10 in the pull request builder.
> This would also have the secondary benefit of letting us use tools like
> Scapegoat or Scoverage highlighting.
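>
> As a rough sketch, wiring those tools into the sbt build would just be a
> couple of plugin declarations in project/plugins.sbt (versions here are
> illustrative, and I haven't tried this against our build):
>
>   // Scapegoat: static analysis / linting for Scala (its compiler plugin
>   // currently targets 2.11, hence the tie-in with this switch)
>   addSbtPlugin("com.sksamuel.scapegoat" % "sbt-scapegoat" % "1.0.4")
>   // Scoverage: statement-level code coverage reports
>   addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.3.5")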
>
> On Sat, Jul 25, 2015 at 8:52 AM, Iulian Dragoș <iulian.dra...@typesafe.com>
> wrote:
>
>> On Fri, Jul 24, 2015 at 8:19 PM, Reynold Xin <r...@databricks.com> wrote:
>>
>>> Jenkins only runs Scala 2.10. I'm actually not sure what the behavior is
>>> with 2.11 for that patch.
>>>
>>> iulian - can you take a look at it and see if it is working as
>>> expected?
>>>
>> It is, in the sense that warnings fail the build. Unfortunately, 2.11
>> produces warnings that 2.10 did not, and those now fail the build. For
>> instance:
>>
>> [error] /Users/dragos/workspace/git/spark/core/src/main/scala/org/apache/spark/rdd/BinaryFileRDD.scala:31:
>> [error] no valid targets for annotation on value conf - it is discarded unused.
>> [error] You may specify targets with meta-annotations, e.g. @(transient @param)
>> [error]     @transient conf: Configuration,
>>
>> Currently the 2.11 build is broken. I don’t think fixing these is too
>> hard, but it requires these parameters to become vals. I haven’t looked
>> at all warnings, but I think this is the most common one (if not the only
>> one).
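>>
>> A minimal sketch of that kind of change (illustrative only, with a made-up
>> class name rather than the actual Spark code):
>>
>> import org.apache.hadoop.conf.Configuration
>>
>> // before: `@transient conf: Configuration` - annotation discarded, warns on 2.11
>> // after: a private val gives @transient an actual field to attach to
>> class HadoopBackedRDD(@transient private val conf: Configuration)
>>   extends Serializable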
>>
>> iulian
>>
>>
>>>
>>> On Fri, Jul 24, 2015 at 10:24 AM, Iulian Dragoș <
>>> iulian.dra...@typesafe.com> wrote:
>>>
>>>> On Thu, Jul 23, 2015 at 6:08 AM, Reynold Xin <r...@databricks.com>
>>>> wrote:
>>>>
>>>>> Hi all,
>>>>>
>>>>> FYI, we just merged a patch that fails the build if there is a Scala
>>>>> compiler warning (unless it is a deprecation warning).
>>>>>
>>>> I’m a bit confused, since I see quite a lot of warnings in
>>>> semi-legitimate code.
>>>>
>>>> For instance, @transient (plenty of instances like this in
>>>> spark-streaming) might generate warnings like:
>>>>
>>>> abstract class ReceiverInputDStream[T: ClassTag](@transient ssc_ : 
>>>> StreamingContext)
>>>>   extends InputDStream[T](ssc_) {
>>>>
>>>> // and the warning is:
>>>> no valid targets for annotation on value ssc_ - it is discarded unused. 
>>>> You may specify targets with meta-annotations, e.g. @(transient @param)
>>>>
>>>> At least that’s what happens if I build with Scala 2.11; I’m not sure
>>>> whether this setting only applies to 2.10, or whether something really
>>>> weird is happening on my machine that doesn’t happen on others.
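>>>>
>>>> For reference, the meta-annotation form the warning suggests would look
>>>> something like this (a sketch, which I haven't verified against this
>>>> codebase):
>>>>
>>>> import scala.annotation.meta.param
>>>>
>>>> // explicitly targeting the parameter silences the "no valid targets" warning
>>>> abstract class ReceiverInputDStream[T: ClassTag](
>>>>     @(transient @param) ssc_ : StreamingContext)
>>>>   extends InputDStream[T](ssc_) {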
>>>>
>>>> iulian
>>>>
>>>>
>>>>> In the past, many compiler warnings were actually caused by legitimate
>>>>> bugs that we needed to address. However, if we don't fail the build on
>>>>> warnings, people don't pay attention to them at all (it is also tough to
>>>>> pay attention since there are a lot of deprecation warnings, from unit
>>>>> tests exercising deprecated APIs and from our reliance on deprecated
>>>>> Hadoop APIs).
>>>>>
>>>>> Note that ideally we should be able to mark deprecation warnings as
>>>>> errors as well. However, since the Scala compiler provides no way to
>>>>> suppress individual warning messages, we cannot do that (we do need to
>>>>> access deprecated APIs in Hadoop).
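>>>>>
>>>>> To make the trade-off concrete, the obvious knob in sbt terms would be
>>>>> the one below, which is precisely what we cannot use (a sketch of the
>>>>> rejected alternative, not what the merged patch actually does):
>>>>>
>>>>> // -Xfatal-warnings promotes *every* warning to an error, including the
>>>>> // deprecation warnings we cannot avoid (deprecated Hadoop APIs, and
>>>>> // tests that deliberately exercise deprecated Spark APIs)
>>>>> scalacOptions ++= Seq("-deprecation", "-Xfatal-warnings")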
>>>>>
>>>>>
>>>>>  ​
>>>> --
>>>>
>>>> --
>>>> Iulian Dragos
>>>>
>>>> ------
>>>> Reactive Apps on the JVM
>>>> www.typesafe.com
>>>>
>>>>
>> --
>> Iulian Dragos
>>
>> ------
>> Reactive Apps on the JVM
>> www.typesafe.com
>>
>>
>
