Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/6928#discussion_r33059857
--- Diff: docs/sparkr.md ---
@@ -62,7 +62,9 @@ head(df)
 SparkR supports operating on a variety of data sources through the
 `DataFrame` interface. This section describes the general methods for loading
 and saving data using Data Sources. You can check the Spark SQL programming
 guide for more [specific
 options](sql-programming-guide.html#manually-specifying-options) that are
 available for the built-in data sources.
-The general method for creating DataFrames from data sources is `read.df`. This method takes in the `SQLContext`, the path for the file to load and the type of data source. SparkR supports reading JSON and Parquet files natively and through [Spark Packages](http://spark-packages.org/) you can find data source connectors for popular file formats like [CSV](http://spark-packages.org/package/databricks/spark-csv) and [Avro](http://spark-packages.org/package/databricks/spark-avro).
+The general method for creating DataFrames from data sources is `read.df`. This method takes in the `SQLContext`, the path for the file to load and the type of data source. SparkR supports reading JSON and Parquet files natively and through [Spark Packages](http://spark-packages.org/) you can find data source connectors for popular file formats like [CSV](http://spark-packages.org/package/databricks/spark-csv) and [Avro](http://spark-packages.org/package/databricks/spark-avro). These packages can either be added by
+specifying `--packages` with `sparm-submit` or `sparkR` commands, or if creating context through `init`
--- End diff ---
typo: `spark-submit`
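
For anyone reading this in the archive, here is a minimal sketch of the two workflows the edited paragraph describes: passing `--packages` on the command line, or supplying packages when creating the context through `init` (presumably via the `sparkPackages` argument this PR adds). The spark-csv coordinates, version, and file path below are illustrative assumptions, not taken from the PR:

    # From the shell, the package can be passed to the sparkR command:
    #   $ sparkR --packages com.databricks:spark-csv_2.10:1.0.3
    # Or, from an R script, when initializing the context (package
    # coordinates here are illustrative):
    library(SparkR)
    sc <- sparkR.init(sparkPackages = "com.databricks:spark-csv_2.10:1.0.3")
    sqlContext <- sparkRSQL.init(sc)

    # read.df takes the SQLContext, the file path, and the data source type
    df <- read.df(sqlContext, "examples/src/main/resources/people.csv",
                  source = "com.databricks.spark.csv")
    head(df)

Either route ends in the same `read.df` call, which takes the `SQLContext`, the path of the file to load, and the type of data source.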