Merge branch 'asf-site' of https://github.com/apache/spark-website into add-sbt-package


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/3c96a509
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/3c96a509
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/3c96a509

Branch: refs/heads/asf-site
Commit: 3c96a509cb1e00df716d8f38eb5b214d8778e45d
Parents: 9902531 aa1c66e
Author: Stan Zhai <zhaishi...@haizhi.com>
Authored: Tue Apr 25 15:17:57 2017 +0800
Committer: Stan Zhai <zhaishi...@haizhi.com>
Committed: Tue Apr 25 15:17:57 2017 +0800

----------------------------------------------------------------------
 _layouts/global.html                            |   1 +
 community.md                                    |   6 +
 contributing.md                                 |   4 +
 developer-tools.md                              |  57 +++-
 improvement-proposals.md                        |  91 ++++++
 ...3-31-spark-summit-june-2017-agenda-posted.md |  15 +
 site/committers.html                            |   7 +-
 site/community.html                             |  13 +-
 site/contributing.html                          |  11 +-
 site/developer-tools.html                       | 106 ++++++-
 site/documentation.html                         |   7 +-
 site/downloads.html                             |   7 +-
 site/examples.html                              |   7 +-
 site/faq.html                                   |   7 +-
 site/graphx/index.html                          |   7 +-
 site/improvement-proposals.html                 | 295 +++++++++++++++++++
 site/index.html                                 |   7 +-
 site/mailing-lists.html                         |   7 +-
 site/mllib/index.html                           |   7 +-
 site/news/amp-camp-2013-registration-ope.html   |   7 +-
 .../news/announcing-the-first-spark-summit.html |   7 +-
 .../news/fourth-spark-screencast-published.html |   7 +-
 site/news/index.html                            |  16 +-
 site/news/nsdi-paper.html                       |   7 +-
 site/news/one-month-to-spark-summit-2015.html   |   7 +-
 .../proposals-open-for-spark-summit-east.html   |   7 +-
 ...registration-open-for-spark-summit-east.html |   7 +-
 .../news/run-spark-and-shark-on-amazon-emr.html |   7 +-
 site/news/spark-0-6-1-and-0-5-2-released.html   |   7 +-
 site/news/spark-0-6-2-released.html             |   7 +-
 site/news/spark-0-7-0-released.html             |   7 +-
 site/news/spark-0-7-2-released.html             |   7 +-
 site/news/spark-0-7-3-released.html             |   7 +-
 site/news/spark-0-8-0-released.html             |   7 +-
 site/news/spark-0-8-1-released.html             |   7 +-
 site/news/spark-0-9-0-released.html             |   7 +-
 site/news/spark-0-9-1-released.html             |   7 +-
 site/news/spark-0-9-2-released.html             |   7 +-
 site/news/spark-1-0-0-released.html             |   7 +-
 site/news/spark-1-0-1-released.html             |   7 +-
 site/news/spark-1-0-2-released.html             |   7 +-
 site/news/spark-1-1-0-released.html             |   7 +-
 site/news/spark-1-1-1-released.html             |   7 +-
 site/news/spark-1-2-0-released.html             |   7 +-
 site/news/spark-1-2-1-released.html             |   7 +-
 site/news/spark-1-2-2-released.html             |   7 +-
 site/news/spark-1-3-0-released.html             |   7 +-
 site/news/spark-1-4-0-released.html             |   7 +-
 site/news/spark-1-4-1-released.html             |   7 +-
 site/news/spark-1-5-0-released.html             |   7 +-
 site/news/spark-1-5-1-released.html             |   7 +-
 site/news/spark-1-5-2-released.html             |   7 +-
 site/news/spark-1-6-0-released.html             |   7 +-
 site/news/spark-1-6-1-released.html             |   7 +-
 site/news/spark-1-6-2-released.html             |   7 +-
 site/news/spark-1-6-3-released.html             |   7 +-
 site/news/spark-2-0-0-released.html             |   7 +-
 site/news/spark-2-0-1-released.html             |   7 +-
 site/news/spark-2-0-2-released.html             |   7 +-
 site/news/spark-2-1-0-released.html             |   7 +-
 site/news/spark-2.0.0-preview.html              |   7 +-
 .../spark-accepted-into-apache-incubator.html   |   7 +-
 site/news/spark-and-shark-in-the-news.html      |   7 +-
 site/news/spark-becomes-tlp.html                |   7 +-
 site/news/spark-featured-in-wired.html          |   7 +-
 .../spark-mailing-lists-moving-to-apache.html   |   7 +-
 site/news/spark-meetups.html                    |   7 +-
 site/news/spark-screencasts-published.html      |   7 +-
 site/news/spark-summit-2013-is-a-wrap.html      |   7 +-
 site/news/spark-summit-2014-videos-posted.html  |   7 +-
 site/news/spark-summit-2015-videos-posted.html  |   7 +-
 site/news/spark-summit-agenda-posted.html       |   7 +-
 .../spark-summit-east-2015-videos-posted.html   |   7 +-
 .../spark-summit-east-2016-cfp-closing.html     |   7 +-
 .../spark-summit-east-2017-agenda-posted.html   |   7 +-
 site/news/spark-summit-east-agenda-posted.html  |   7 +-
 .../news/spark-summit-europe-agenda-posted.html |   7 +-
 site/news/spark-summit-europe.html              |   7 +-
 .../spark-summit-june-2016-agenda-posted.html   |   7 +-
 .../spark-summit-june-2017-agenda-posted.html   | 221 ++++++++++++++
 site/news/spark-tips-from-quantifind.html       |   7 +-
 .../spark-user-survey-and-powered-by-page.html  |   7 +-
 site/news/spark-version-0-6-0-released.html     |   7 +-
 .../spark-wins-cloudsort-100tb-benchmark.html   |   7 +-
 ...-wins-daytona-gray-sort-100tb-benchmark.html |   7 +-
 .../strata-exercises-now-available-online.html  |   7 +-
 .../news/submit-talks-to-spark-summit-2014.html |   7 +-
 .../news/submit-talks-to-spark-summit-2016.html |   7 +-
 .../submit-talks-to-spark-summit-east-2016.html |   7 +-
 .../submit-talks-to-spark-summit-eu-2016.html   |   7 +-
 site/news/two-weeks-to-spark-summit-2014.html   |   7 +-
 ...deo-from-first-spark-development-meetup.html |   7 +-
 site/powered-by.html                            |   7 +-
 site/release-process.html                       |   7 +-
 site/releases/spark-release-0-3.html            |   7 +-
 site/releases/spark-release-0-5-0.html          |   7 +-
 site/releases/spark-release-0-5-1.html          |   7 +-
 site/releases/spark-release-0-5-2.html          |   7 +-
 site/releases/spark-release-0-6-0.html          |   7 +-
 site/releases/spark-release-0-6-1.html          |   7 +-
 site/releases/spark-release-0-6-2.html          |   7 +-
 site/releases/spark-release-0-7-0.html          |   7 +-
 site/releases/spark-release-0-7-2.html          |   7 +-
 site/releases/spark-release-0-7-3.html          |   7 +-
 site/releases/spark-release-0-8-0.html          |   7 +-
 site/releases/spark-release-0-8-1.html          |   7 +-
 site/releases/spark-release-0-9-0.html          |   7 +-
 site/releases/spark-release-0-9-1.html          |   7 +-
 site/releases/spark-release-0-9-2.html          |   7 +-
 site/releases/spark-release-1-0-0.html          |   7 +-
 site/releases/spark-release-1-0-1.html          |   7 +-
 site/releases/spark-release-1-0-2.html          |   7 +-
 site/releases/spark-release-1-1-0.html          |   7 +-
 site/releases/spark-release-1-1-1.html          |   7 +-
 site/releases/spark-release-1-2-0.html          |   7 +-
 site/releases/spark-release-1-2-1.html          |   7 +-
 site/releases/spark-release-1-2-2.html          |   7 +-
 site/releases/spark-release-1-3-0.html          |   7 +-
 site/releases/spark-release-1-3-1.html          |   7 +-
 site/releases/spark-release-1-4-0.html          |   7 +-
 site/releases/spark-release-1-4-1.html          |   7 +-
 site/releases/spark-release-1-5-0.html          |   7 +-
 site/releases/spark-release-1-5-1.html          |   7 +-
 site/releases/spark-release-1-5-2.html          |   7 +-
 site/releases/spark-release-1-6-0.html          |   7 +-
 site/releases/spark-release-1-6-1.html          |   7 +-
 site/releases/spark-release-1-6-2.html          |   7 +-
 site/releases/spark-release-1-6-3.html          |   7 +-
 site/releases/spark-release-2-0-0.html          |   7 +-
 site/releases/spark-release-2-0-1.html          |   7 +-
 site/releases/spark-release-2-0-2.html          |   7 +-
 site/releases/spark-release-2-1-0.html          |   7 +-
 site/research.html                              |   7 +-
 site/screencasts/1-first-steps-with-spark.html  |   7 +-
 .../2-spark-documentation-overview.html         |   7 +-
 .../3-transformations-and-caching.html          |   7 +-
 .../4-a-standalone-job-in-spark.html            |   7 +-
 site/screencasts/index.html                     |   7 +-
 site/sitemap.xml                                |  20 +-
 site/sql/index.html                             |   7 +-
 site/streaming/index.html                       |   7 +-
 site/third-party-projects.html                  |   7 +-
 site/trademarks.html                            |   7 +-
 site/versioning-policy.html                     |   7 +-
 144 files changed, 1360 insertions(+), 413 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark-website/blob/3c96a509/developer-tools.md
----------------------------------------------------------------------
diff --cc developer-tools.md
index 0723115,7c14de8..17f7b26
--- a/developer-tools.md
+++ b/developer-tools.md
@@@ -128,37 -111,60 +128,91 @@@ To run individual Java tests, you can u
 build/mvn test -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite test
  ```
  
 +<h3>ScalaTest Issues</h3>
 +
 +If the following error occurs when running ScalaTest
 +
 +```
 +An internal error occurred during: "Launching XYZSuite.scala".
 +java.lang.NullPointerException
 +```
 +It is due to an incorrect Scala library in the classpath. To fix it:
 +
 +- Right click on project
 +- Select `Build Path | Configure Build Path`
 +- `Add Library | Scala Library`
 +- Remove `scala-library-2.10.4.jar - lib_managed\jars`
 +
 +In the event of "Could not find resource path for Web UI: org/apache/spark/ui/static", 
 +it's due to a classpath issue (some classes were probably not compiled). To fix this, it is
 +sufficient to run a test from the command line:
 +
 +```
 +build/sbt "test-only org.apache.spark.rdd.SortingSuite"
 +```
 +
 +<h3>Running Different Test Permutations on Jenkins</h3>
 +
 +When running tests for a pull request on Jenkins, you can add special phrases to the title of 
 +your pull request to change testing behavior. This includes:
 +
 +- `[test-maven]` - signals to test the pull request using maven
 +- `[test-hadoop2.7]` - signals to test using Spark's Hadoop 2.7 profile
 +
+ <h3>Binary compatibility</h3>
+ 
 + To ensure binary compatibility, Spark uses [MiMa](https://github.com/typesafehub/migration-manager).
+ 
+ <h4>Ensuring binary compatibility</h4>
+ 
 + When working on an issue, it's always a good idea to check that your changes do
+ not introduce binary incompatibilities before opening a pull request.
+ 
+ You can do so by running the following command:
+ 
+ ```
+ $ dev/mima
+ ```
+ 
+ A binary incompatibility reported by MiMa might look like the following:
+ 
+ ```
 + [error] method this(org.apache.spark.sql.Dataset)Unit in class org.apache.spark.SomeClass does not have a correspondent in current version
 + [error] filter with: ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SomeClass.this")
+ ```
+ 
+ If you open a pull request containing binary incompatibilities anyway, Jenkins
+ will remind you by failing the test build with the following message:
+ 
+ ```
+ Test build #xx has finished for PR yy at commit ffffff.
+ 
+   This patch fails MiMa tests.
+   This patch merges cleanly.
+   This patch adds no public classes.
+ ```
+ 
+ <h4>Solving a binary incompatibility</h4>
+ 
 + If you believe that your binary incompatibilities are justified or that MiMa
 + reported false positives (e.g. the reported binary incompatibilities are about a
 + non-user facing API), you can filter them out by adding an exclusion in
 + [project/MimaExcludes.scala](https://github.com/apache/spark/blob/master/project/MimaExcludes.scala)
 + containing what was suggested by the MiMa report and a comment containing the
 + JIRA number of the issue you're working on as well as its title.
+ 
+ For the problem described above, we might add the following:
+ 
+ {% highlight scala %}
+ // [SPARK-zz][CORE] Fix an issue
 + ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SomeClass.this")
+ {% endhighlight %}
+ 
 + Otherwise, you will have to resolve those incompatibilities before opening or
+ updating your pull request. Usually, the problems reported by MiMa are
+ self-explanatory and revolve around missing members (methods or fields) that
+ you will have to add back in order to maintain binary compatibility.
+ 
  <h3>Checking Out Pull Requests</h3>
  
 Git provides a mechanism for fetching remote pull requests into your own local repository. 
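For GitHub-hosted repositories, the mechanism referenced in the "Checking Out Pull Requests" section above is the `refs/pull/<ID>/head` ref that GitHub publishes for every pull request. A minimal sketch follows; the PR number 1234 and the temp-directory layout are illustrative, and the "remote" is simulated locally so the commands run without network access:

```shell
set -e
demo=$(mktemp -d)

# "Upstream" repository that publishes a pull-request ref, the way GitHub does.
git init -q "$demo/upstream"
git -C "$demo/upstream" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "base commit"
git -C "$demo/upstream" update-ref refs/pull/1234/head HEAD

# Contributor side: fetch the pull-request ref into a local branch, check it out.
git clone -q "$demo/upstream" "$demo/clone"
git -C "$demo/clone" fetch -q origin pull/1234/head:pr-1234
git -C "$demo/clone" checkout -q pr-1234
git -C "$demo/clone" rev-parse --abbrev-ref HEAD   # prints: pr-1234
```

Against a real checkout the equivalent would be `git fetch origin pull/1234/head:pr-1234` followed by `git checkout pr-1234`, assuming `origin` points at the GitHub remote.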

http://git-wip-us.apache.org/repos/asf/spark-website/blob/3c96a509/site/developer-tools.html
----------------------------------------------------------------------
diff --cc site/developer-tools.html
index 62793ef,06efb7c..680e089
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@@ -303,38 -287,54 +304,89 @@@ $ build/mvn package -DskipTests -pl cor
 <pre><code>build/mvn test -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite test
  </code></pre>
  
 +<h3>ScalaTest Issues</h3>
 +
 +<p>If the following error occurs when running ScalaTest</p>
 +
 +<pre><code>An internal error occurred during: "Launching XYZSuite.scala".
 +java.lang.NullPointerException
 +</code></pre>
 +<p>It is due to an incorrect Scala library in the classpath. To fix it:</p>
 +
 +<ul>
 +  <li>Right click on project</li>
 +  <li>Select <code>Build Path | Configure Build Path</code></li>
 +  <li><code>Add Library | Scala Library</code></li>
 +  <li>Remove <code>scala-library-2.10.4.jar - lib_managed\jars</code></li>
 +</ul>
 +
 +<p>In the event of &#8220;Could not find resource path for Web UI: org/apache/spark/ui/static&#8221;, 
 +it&#8217;s due to a classpath issue (some classes were probably not compiled). To fix this, it is
 +sufficient to run a test from the command line:</p>
 +
 +<pre><code>build/sbt "test-only org.apache.spark.rdd.SortingSuite"
 +</code></pre>
 +
 +<h3>Running Different Test Permutations on Jenkins</h3>
 +
 +<p>When running tests for a pull request on Jenkins, you can add special phrases to the title of 
 +your pull request to change testing behavior. This includes:</p>
 +
 +<ul>
 +  <li><code>[test-maven]</code> - signals to test the pull request using maven</li>
 +  <li><code>[test-hadoop2.7]</code> - signals to test using Spark&#8217;s Hadoop 2.7 profile</li>
 +</ul>
+ <h3>Binary compatibility</h3>
+ 
 + <p>To ensure binary compatibility, Spark uses <a href="https://github.com/typesafehub/migration-manager">MiMa</a>.</p>
+ 
+ <h4>Ensuring binary compatibility</h4>
+ 
 + <p>When working on an issue, it&#8217;s always a good idea to check that your changes do
+ not introduce binary incompatibilities before opening a pull request.</p>
+ 
+ <p>You can do so by running the following command:</p>
+ 
+ <pre><code>$ dev/mima
+ </code></pre>
+ 
 + <p>A binary incompatibility reported by MiMa might look like the following:</p>
+ 
 + <pre><code>[error] method this(org.apache.spark.sql.Dataset)Unit in class org.apache.spark.SomeClass does not have a correspondent in current version
 + [error] filter with: ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SomeClass.this")
+ </code></pre>
+ 
 + <p>If you open a pull request containing binary incompatibilities anyway, Jenkins
+ will remind you by failing the test build with the following message:</p>
+ 
+ <pre><code>Test build #xx has finished for PR yy at commit ffffff.
+ 
+   This patch fails MiMa tests.
+   This patch merges cleanly.
+   This patch adds no public classes.
+ </code></pre>
+ 
+ <h4>Solving a binary incompatibility</h4>
+ 
 + <p>If you believe that your binary incompatibilities are justified or that MiMa
 + reported false positives (e.g. the reported binary incompatibilities are about a
 + non-user facing API), you can filter them out by adding an exclusion in
 + <a href="https://github.com/apache/spark/blob/master/project/MimaExcludes.scala">project/MimaExcludes.scala</a>
 + containing what was suggested by the MiMa report and a comment containing the
 + JIRA number of the issue you&#8217;re working on as well as its title.</p>
+ 
+ <p>For the problem described above, we might add the following:</p>
+ 
 + <figure class="highlight"><pre><code class="language-scala" data-lang="scala"><span></span><span class="c1">// [SPARK-zz][CORE] Fix an issue</span>
 + <span class="nc">ProblemFilters</span><span class="o">.</span><span class="n">exclude</span><span class="o">[</span><span class="kt">DirectMissingMethodProblem</span><span class="o">](</span><span class="s">&quot;org.apache.spark.SomeClass.this&quot;</span><span class="o">)</span></code></pre></figure>
+ 
 + <p>Otherwise, you will have to resolve those incompatibilities before opening or
+ updating your pull request. Usually, the problems reported by MiMa are
+ self-explanatory and revolve around missing members (methods or fields) that
+ you will have to add back in order to maintain binary compatibility.</p>
  
  <h3>Checking Out Pull Requests</h3>
  
@@@ -376,6 -376,48 +428,51 @@@ $ build/mvn -DskipTests instal
  $ build/mvn dependency:tree
  </code></pre>
  
 + <p><a name="individual-tests"></a></p>
 + <h3>Running Build Targets For Individual Projects</h3>
 + 
 + <pre><code>$ # sbt
 + $ build/sbt package
 + $ # Maven
 + $ build/mvn package -DskipTests -pl assembly
 + </code></pre>
 + 
  <h3>Organizing Imports</h3>
  
 <p>You can use an <a href="https://plugins.jetbrains.com/plugin/7350">IntelliJ Imports Organizer</a> 


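One practical footnote on merges like the one above: generated site HTML can silently pick up leftover merge-conflict markers, so a pre-publish grep is cheap insurance. A minimal sketch, where the directory layout and file names are illustrative:

```shell
set -e
work=$(mktemp -d)
mkdir -p "$work/site"

# One page with a leftover conflict marker, one clean page.
printf '%s\n' '<p>ok</p>' '<<<<<<< HEAD' > "$work/site/broken.html"
printf '%s\n' '<p>ok</p>'                > "$work/site/clean.html"

# List files containing lines that begin with any of the three marker styles.
grep -rlE '^(<{7}|={7}|>{7})' "$work/site"   # prints: .../site/broken.html
```

In a real checkout the same check would run over the generated `site/` directory before committing; any output means the merge left conflict markers behind.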