[
https://issues.apache.org/jira/browse/SPARK-9486?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Michael Armbrust updated SPARK-9486:
------------------------------------
Target Version/s: 1.5.0
> Add aliasing to data sources to allow external packages to register
> themselves with Spark
> -----------------------------------------------------------------------------------------
>
> Key: SPARK-9486
> URL: https://issues.apache.org/jira/browse/SPARK-9486
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Reporter: Joseph Batchik
> Priority: Minor
>
> Currently Spark allows users to use external data sources such as spark-avro,
> spark-csv, etc. by having them specify the full class name:
> {code:java}
> sqlContext.read.format("com.databricks.spark.avro").load(path)
> {code}
> Typing in a full class name is cumbersome, so it would be nice to allow the
> external packages to register themselves with Spark so that users can do
> something like:
> {code:java}
> sqlContext.read.format("avro").load(path)
> {code}
> This would make external data source packages follow the same convention as
> the built-in data sources (parquet, json, jdbc, etc.).
> This could be accomplished by using a ServiceLoader.
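> A rough sketch of how the {{ServiceLoader}} approach could look is below. The
> {{DataSourceRegister}} trait name, the {{shortName()}} method, and the
> {{lookupDataSource}} helper are assumptions made only for illustration, not an
> existing API:
> {code:java}
> import java.util.ServiceLoader
> import scala.collection.JavaConverters._
>
> // Hypothetical trait that external packages would implement to expose an alias.
> trait DataSourceRegister {
>   def shortName(): String
> }
>
> // Example: spark-avro's provider could mix the trait in and return "avro".
> // class DefaultSource extends RelationProvider with DataSourceRegister {
> //   override def shortName(): String = "avro"
> //   ...
> // }
>
> // Spark could then resolve the name passed to .format(...) like this:
> def lookupDataSource(provider: String): Class[_] = {
>   val loader = ServiceLoader.load(
>     classOf[DataSourceRegister],
>     Thread.currentThread().getContextClassLoader)
>
>   loader.asScala
>     .filter(_.shortName().equalsIgnoreCase(provider))
>     .toList match {
>       case single :: Nil => single.getClass          // alias registered by a package
>       case Nil           => Class.forName(provider)  // fall back to the full class name
>       case multiple      =>
>         sys.error(s"Multiple data sources registered for alias '$provider'")
>     }
> }
> {code}
> The external package would also ship a
> {{META-INF/services/...DataSourceRegister}} file listing its implementation
> class so that the {{ServiceLoader}} can discover it on the classpath.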