Great, thank you.

    On Friday, 19 February 2016, 15:33, Holden Karau <[email protected]> 
wrote:
 

So with --packages, spark-shell and spark-submit will automatically 
fetch the package and its dependencies from Maven. If you want to use an 
explicit local jar instead, you can do that with --jars. You might find 
http://spark.apache.org/docs/latest/submitting-applications.html useful.
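A minimal sketch of both forms (the local jar path in the second command is a hypothetical placeholder):

    # Let Spark fetch the package and its dependencies from Maven Central:
    $SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0

    # Or supply a jar you already have locally (path is hypothetical):
    $SPARK_HOME/bin/spark-shell --jars /path/to/spark-csv_2.11-1.3.0.jar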
On Fri, Feb 19, 2016 at 7:26 AM, Ashok Kumar <[email protected]> 
wrote:

 Hi,
I downloaded the zipped source from databricks/spark-csv (spark-csv - CSV data source for Spark SQL and DataFrames) on github.com.


Now I have a directory called spark-csv-master. I would like to use 
this in spark-shell with --packages like below:
$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0
Do I need to use mvn to build a jar first, or should it be added to the Spark 
CLASSPATH? What is needed here, please, to make it work?
thanks
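For reference, a rough sketch of the build-it-yourself route, assuming the repo's standard sbt build (spark-csv builds with sbt rather than mvn); note that --packages above needs no local build or download at all:

    # Rough sketch, assuming the repo's sbt build; version/path untested:
    cd spark-csv-master
    sbt package    # produces target/scala-2.11/spark-csv_2.11-<version>.jar
    $SPARK_HOME/bin/spark-shell --jars target/scala-2.11/spark-csv_2.11-1.3.0.jar

A plain sbt package jar will not bundle spark-csv's own transitive dependencies, which is one reason --packages is the simpler route.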




-- 
Cell: 425-233-8271
Twitter: https://twitter.com/holdenkarau

  
