Sebastián Ramírez created SPARK-13040:
-----------------------------------------
Summary: JDBC using SPARK_CLASSPATH is deprecated but is the only way documented
Key: SPARK-13040
URL: https://issues.apache.org/jira/browse/SPARK-13040
Project: Spark
Issue Type: Documentation
Components: Documentation, Examples
Affects Versions: 1.6.0
Reporter: Sebastián Ramírez
Priority: Minor
The documentation says that to use a JDBC driver, its jar must be passed via the
SPARK_CLASSPATH environment variable, as in:
{code}
SPARK_CLASSPATH=postgresql-9.3-1102-jdbc41.jar bin/spark-shell
{code}
http://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases
But when run like that, the output warns that using this environment variable is
deprecated:
{code}
SPARK_CLASSPATH was detected (set to
'/home/senseta/postgresql-9.4.1207.jre7.jar').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath
16/01/27 13:36:57 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath'
to '/home/senseta/postgresql-9.4.1207.jre7.jar' as a work-around.
16/01/27 13:36:57 WARN spark.SparkConf: Setting 'spark.driver.extraClassPath'
to '/home/senseta/postgresql-9.4.1207.jre7.jar' as a work-around.
{code}
It would be good to have an example with the current official syntax (I'm
actually not sure which parameters would be correct), for both Scala and
Python (in case they differ).
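For reference, based purely on the flags named in the deprecation warning above, a likely replacement might look like the following (untested sketch; the jar filename is just the one from my output, and {{--jars}} is my assumption for shipping the driver to executors):
{code}
# Scala shell: put the JDBC driver jar on the driver classpath
# and ship it to the executors
bin/spark-shell \
  --driver-class-path postgresql-9.4.1207.jre7.jar \
  --jars postgresql-9.4.1207.jre7.jar

# Python shell: presumably the same flags apply
bin/pyspark \
  --driver-class-path postgresql-9.4.1207.jre7.jar \
  --jars postgresql-9.4.1207.jre7.jar
{code}
If this (or something close to it) is confirmed as the official syntax, it would be worth adding to the JDBC section of the SQL programming guide in place of the SPARK_CLASSPATH example.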
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)