[ 
https://issues.apache.org/jira/browse/SPARK-1439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13971662#comment-13971662
 ] 

Sean Owen commented on SPARK-1439:
----------------------------------

I had a run at this today. First I tried a couple of Maven-based approaches, but 
they didn't quite do the trick. I made some progress with unidoc, although not all 
the way there. Maybe an SBT expert can help me figure out how to finish it.

*Maven*

http://stackoverflow.com/questions/12301620/how-to-generate-an-aggregated-scaladoc-for-a-maven-site

This works, but generates *javadoc* for everything, including Scala source. 
The resulting javadoc is not very helpful. It also complains a lot about 
unresolved references, since javadoc doesn't interpret doc links the same way 
scaladoc does.

*Maven #2*

You can also invoke the scala-maven-plugin 'doc' goal as part of the site 
generation:

{code:xml}
  <reporting>
    <plugins>
      ...
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <reportSets>
          <reportSet>
            <reports>
              <report>doc</report>
            </reports>
          </reportSet>
        </reportSets>
      </plugin>
    </plugins>
  </reporting>
{code}

However, the scala-maven-plugin lacks a goal like the javadoc plugin's 
"aggregate", which takes care of combining everything into one set of docs. 
This approach only generates scaladoc per module, in exploded format.
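For reference, with that reporting configuration in place, the per-module 
scaladoc can be produced through the site lifecycle, or by invoking the 
plugin's goal directly (sketch; goal names as documented by the 
scala-maven-plugin):

{code}
# generates the full site per module, including the scaladoc report
mvn site

# or invoke just the doc goal
mvn scala:doc
{code}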

*Unidoc / SBT*

It is almost as easy as:

- adding the plugin to plugins.sbt: {{addSbtPlugin("com.eed3si9n" % 
"sbt-unidoc" % "0.3.0")}}
- {{import sbtunidoc.Plugin.\_}} and {{UnidocKeys.\_}} in SparkBuild.scala
- adding "++ unidocSettings" to rootSettings in SparkBuild.scala
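Spelled out, the first two steps amount to just this (file paths as in the 
Spark tree; treat as a sketch):

{code}
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-unidoc" % "0.3.0")

// project/SparkBuild.scala
import sbtunidoc.Plugin._
import sbtunidoc.Plugin.UnidocKeys._
{code}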

but it was also necessary to:

- set {{SPARK_YARN=true}} and {{SPARK_HADOOP_VERSION=2.2.0}}, for example, to 
make the YARN scaladoc build work
- Exclude {{yarn-alpha}} since scaladoc doesn't like the collision of class 
names:

{code}
  def rootSettings = sharedSettings ++ unidocSettings ++ Seq(
    unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject -- inProjects(yarnAlpha),
    publish := {}
  )
{code}

I still get SBT errors, since I think this isn't quite finessing the build 
correctly. But it seems almost there.
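To reproduce, the environment variables and the unidoc task would be combined 
roughly like this (exact invocation is my assumption):

{code}
SPARK_YARN=true SPARK_HADOOP_VERSION=2.2.0 sbt/sbt unidoc
{code}

If I read the plugin's defaults right, the aggregated docs should land under 
{{target/scala-2.10/unidoc}}.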


> Aggregate Scaladocs across projects
> -----------------------------------
>
>                 Key: SPARK-1439
>                 URL: https://issues.apache.org/jira/browse/SPARK-1439
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation
>            Reporter: Matei Zaharia
>             Fix For: 1.0.0
>
>
> Apparently there's a "Unidoc" plugin to put together ScalaDocs across 
> modules: https://github.com/akka/akka/blob/master/project/Unidoc.scala



--
This message was sent by Atlassian JIRA
(v6.2#6252)
