[ https://issues.apache.org/jira/browse/SPARK-8806?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Rekha Joshi updated SPARK-8806:
-------------------------------
    Description:
./dev/run-tests Scala Style must fail if the code does not adhere to the Spark Code Style Guide.
Spark Scala Style: https://cwiki.apache.org/confluence/display/SPARK/Spark+Code+Style+Guide
The Scala style test passes even when the code does not adhere to the style guide, so a passing style check gives only a false illusion of correctness.
Alternatively, could we have a spark-format.xml for IDEs (IntelliJ/Eclipse), similar to hadoop-format.xml, to avoid style issues?

> run-tests Scala style must fail if it does not adhere to the Spark Code Style Guide
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-8806
>                 URL: https://issues.apache.org/jira/browse/SPARK-8806
>             Project: Spark
>          Issue Type: Wish
>          Components: Build
>    Affects Versions: 1.5.0
>            Reporter: Rekha Joshi
>
> ./dev/run-tests Scala Style must fail if the code does not adhere to the Spark Code Style Guide.
> Spark Scala Style: https://cwiki.apache.org/confluence/display/SPARK/Spark+Code+Style+Guide
> The Scala style test passes even when the code does not adhere to the style guide, so a passing style check gives only a false illusion of correctness.
> Alternatively, could we have a spark-format.xml for IDEs (IntelliJ/Eclipse), similar to hadoop-format.xml, to avoid style issues?
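The requested behavior could be sketched as follows. This is a hedged illustration, not Spark's actual dev/run-tests logic: the `check_style_output` helper and the "error ..." output format are assumptions made for this sketch; the real script would run the Scalastyle checker (e.g. via sbt) and inspect its report.

```shell
# Hypothetical sketch: make the build fail when the style checker
# reports errors, instead of silently passing.
check_style_output() {
  # Return nonzero if any line of the checker output starts with "error".
  if printf '%s\n' "$1" | grep -q '^error'; then
    return 1
  fi
  return 0
}

# Simulated checker output containing a violation (format is an assumption):
sample="error file=Foo.scala message=File line length exceeds 100 characters"
if check_style_output "$sample"; then
  echo "Scala style checks passed"
else
  echo "Scala style checks failed"
  # In dev/run-tests this is where the script should exit nonzero,
  # so CI treats style violations as a failed build:
  # exit 1
fi
```

With this shape, any reported style error propagates to a nonzero exit status, so a green run of ./dev/run-tests would actually mean the code conforms to the style guide.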
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org