Re: Pls assist: which conf file do i need to modify if i want spark-shell to include external packages?

2016-04-21 Thread Mich Talebzadeh
Try this using the shell parameter SPARK_CLASSPATH. In $SPARK_HOME/conf:

cp spark-env.sh.template spark-env.sh

Then edit that file and set:

export SPARK_CLASSPATH=

Connect to spark-shell and see if it finds it. HTH

Dr Mich Talebzadeh
LinkedIn
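The steps above can be sketched as follows. This is a minimal sketch, assuming $SPARK_HOME points at the Spark installation and that the jar path appended to SPARK_CLASSPATH is purely illustrative:

```shell
cd "$SPARK_HOME/conf"
cp spark-env.sh.template spark-env.sh
# Append the external jar(s) to the driver/executor classpath.
# The path below is an illustrative placeholder, not a real location.
echo 'export SPARK_CLASSPATH=/path/to/extra-jars/*' >> spark-env.sh
```

Note that SPARK_CLASSPATH was deprecated in later Spark releases in favour of spark.driver.extraClassPath and spark.executor.extraClassPath, and it only adds local jars; it does not resolve Maven coordinates the way --packages does.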

Re: Pls assist: which conf file do i need to modify if i want spark-shell to include external packages?

2016-04-21 Thread Marco Mistroni
Thanks Mich, but I seem to remember modifying a config file so that I don't need to specify the --packages option every time I start the shell. Kr

On 21 Apr 2016 3:20 pm, "Mich Talebzadeh" wrote:
> on spark-shell this will work
>
> $SPARK_HOME/bin/spark-shell --packages

Re: Pls assist: which conf file do i need to modify if i want spark-shell to include external packages?

2016-04-21 Thread Mich Talebzadeh
On spark-shell this will work:

$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0

HTH

Dr Mich Talebzadeh
LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Pls assist: which conf file do i need to modify if i want spark-shell to include external packages?

2016-04-21 Thread Marco Mistroni
Hi all, I need to use spark-csv in my Spark instance, and I want to avoid launching spark-shell by passing the package name every time. I seem to remember that I need to amend a file in the /conf directory to include e.g.

spark.packages com.databricks:spark-csv_2.11:1.4.0

but I cannot find
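For reference, the property being half-remembered here is most likely spark.jars.packages (not spark.packages), set in $SPARK_HOME/conf/spark-defaults.conf. A minimal sketch of that file, assuming the stock template is used as a starting point:

```
# $SPARK_HOME/conf/spark-defaults.conf
# (create it from the shipped template: cp spark-defaults.conf.template spark-defaults.conf)

# Comma-separated Maven coordinates, resolved at launch exactly as
# spark-shell --packages would resolve them:
spark.jars.packages   com.databricks:spark-csv_2.11:1.4.0
```

With this in place, a plain `$SPARK_HOME/bin/spark-shell` picks the package up automatically, with no --packages flag needed.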