Github user markhamstra commented on a diff in the pull request:
https://github.com/apache/spark/pull/14332#discussion_r71999920
--- Diff:
examples/src/main/scala/org/apache/spark/examples/ml/DataFrameExample.scala ---
@@ -54,14 +54,13 @@ object DataFrameExample {
}
}
- parser.parse(args, defaultParams).map { params =>
- run(params)
- }.getOrElse {
- sys.exit(1)
+ parser.parse(args, defaultParams) match {
+ case Some(params) => run(params)
+ case _ => sys.exit(1)
}
--- End diff ---
The argument from consistency says to treat `Option` as a collection or
monad, not as something special, and to treat `Unit` as just another type, not
as something special. Treated consistently, returning the zero value from a
fold over a `None` is no more surprising than folding over an empty `List`
producing the zero value. To a functional programmer aware of the semantics of
fold, producing zeroes from folding over empty collections is every bit as
explicit as an if-else or a pattern match.
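A minimal sketch of that consistency point (the values here are illustrative, not from the PR): `Option.fold(ifEmpty)(f)` behaves exactly like a fold over a collection of size at most one, so `None` yielding the zero value mirrors an empty `List` yielding the accumulator unchanged.

```scala
object OptionFoldConsistency extends App {
  val some: Option[Int] = Some(41)
  val none: Option[Int] = None

  // Option.fold(ifEmpty)(f): returns ifEmpty for None, f(x) for Some(x)
  assert(some.fold(0)(_ + 1) == 42)
  assert(none.fold(0)(_ + 1) == 0)

  // The same semantics as folding a List: the empty case produces the zero value
  assert(List(41).foldLeft(0)(_ + _ + 1) == 42)
  assert(List.empty[Int].foldLeft(0)(_ + _ + 1) == 0)

  println("ok")
}
```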
The argument about using `fold` with `Option` in Spark isn't going anywhere
at this point, but you should look at the Scala API docs, which include the
comment that "[a] less-idiomatic way to use scala.Option values is via pattern
matching." Also see other commentary, such as
http://blog.originate.com/blog/2014/06/15/idiomatic-scala-your-options-do-not-match/
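To illustrate the point the API docs and that post are making (with a hypothetical value, not code from this PR): the combinators on `Option` usually make an explicit `match` unnecessary.

```scala
object IdiomaticOption extends App {
  val parsed: Option[Int] = Some(7)

  // Less idiomatic, per the Scala API docs:
  //   parsed match { case Some(n) => n * 2; case None => 0 }
  // More idiomatic equivalents:
  assert(parsed.map(_ * 2).getOrElse(0) == 14)
  assert(parsed.fold(0)(_ * 2) == 14)
  assert((None: Option[Int]).fold(0)(_ * 2) == 0)

  println("ok")
}
```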