I think this is Java 8 vs Java 7: if you look at the previous build you see
a lot of the same missing classes, but tagged as "warning" rather than
"error". All in all, I think it makes sense to stick with JDK7 for the
legacy builds, which have been built with it previously.

If there is consensus on that I'm happy to update the env variables for the
RC3 build to set a JDK7 JAVA_HOME (but I'd want to double check with
someone about which jobs need to be updated to make sure I don't miss any).
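For concreteness, a rough sketch of what that change would look like
(assuming the docs job just picks up JAVA_HOME from its environment -- the
actual Jenkins job configuration would need checking, and the jdk1.7.0_79
path is simply the one that shows up in the RC2 log):

    export JAVA_HOME=/usr/java/jdk1.7.0_79   # JDK7 install from the RC2 log
    export PATH="$JAVA_HOME/bin:$PATH"       # make sure javac/javadoc come from JDK7
    build/sbt -Pkinesis-asl clean compile unidoc

Alternatively, since the build log notes that -java-home overrides the
default, the JDK7 path could be passed explicitly to build/sbt via
-java-home instead of touching the environment.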

On Sat, Apr 15, 2017 at 2:33 AM, Sean Owen <so...@cloudera.com> wrote:

> I don't think this is an example of Java 8 javadoc being more strict; it
> is not finding classes, not complaining about syntax.
> (Hyukjin cleaned up all of the javadoc 8 errors in master, and they're
> different and much more extensive!)
>
> It wouldn't necessarily break anything to build with Java 8 because it'll
> still emit Java 7 bytecode, etc.
>
> That said, it may very well be that this is somehow due to Java 7 vs 8,
> and it is probably best to stick to 1.7 in the release build.
>
> On Sat, Apr 15, 2017 at 1:38 AM Ryan Blue <rb...@netflix.com.invalid>
> wrote:
>
>> I've hit this before, where Javadoc for 1.8 is much more strict than 1.7.
>>
>> I think we should definitely use Java 1.7 for the release if we used it
>> for the previous releases in the 2.1 line. We don't want to break Java 1.7
>> users in a patch release.
>>
>> rb
>>
>> On Fri, Apr 14, 2017 at 5:21 PM, Holden Karau <hol...@pigscanfly.ca>
>> wrote:
>>
>>> Ok, and with a bit more digging: between RC2 and RC3 we apparently
>>> switched which JVM we are building the docs with.
>>> The relevant portions of the two build logs
>>> (https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/job/spark-release-docs/60/consoleFull
>>> and https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Release/job/spark-release-docs/59/consoleFull):
>>>
>>> RC3:
>>>   HEAD is now at 2ed19cf... Preparing Spark release v2.1.1-rc3
>>>   Checked out Spark git hash 2ed19cf
>>>   Building Spark docs
>>>   Configuration file: /home/jenkins/workspace/spark-release-doc
>>>   Moving to project root and building API docs.
>>>   Running 'build/sbt -Pkinesis-asl clean compile unidoc' from /
>>>   Using /usr/java/jdk1.8.0_60 as default JAVA_HOME.
>>>   Note, this will be overridden by -java-home if it is set.
>>>
>>> RC2:
>>>   HEAD is now at 02b165d... Preparing Spark release v2.1.1-rc2
>>>   Checked out Spark git hash 02b165d
>>>   Building Spark docs
>>>   Configuration file: /home/jenkins/workspace/spark-release-doc
>>>   Moving to project root and building API docs.
>>>   Running 'build/sbt -Pkinesis-asl clean compile unidoc' from /
>>>   Using /usr/java/jdk1.7.0_79 as default JAVA_HOME.
>>>   Note, this will be overridden by -java-home if it is set.
>>>
>>> (Aside from the expected git hashes, the only difference is the
>>> JAVA_HOME line: jdk1.8.0_60 for RC3 versus jdk1.7.0_79 for RC2.)
>>>
>>> There have been some known issues with building the docs with JDK8. I
>>> believe the fixes are in mainline and we could cherry-pick them into the
>>> 2.1 branch, but I think it might be more reasonable to just build the 2.1
>>> docs with JDK7.
>>>
>>> What do people think?
>>>
>>>


-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau
