Hi,
I'm trying to run a Java application that connects to a local standalone
Spark cluster. I start the cluster with the default configuration, using
start-all.sh, and the cluster's web UI shows it is up and running. I can
connect to this cluster with SparkR, but when I use the same
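On the Java side, the application would normally point at the master URL
shown on the cluster's web UI (spark://<host>:7077 by default when the
cluster is started with start-all.sh). A minimal sketch, assuming localhost
and the default port:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class StandaloneConnectTest {
    public static void main(String[] args) {
        // Assumes the default standalone master from start-all.sh;
        // replace localhost with the host shown on the master's web UI.
        SparkConf conf = new SparkConf()
                .setAppName("StandaloneConnectTest")
                .setMaster("spark://localhost:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Tiny round trip to confirm the executors are reachable.
        long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
        System.out.println("count = " + count);

        sc.stop();
    }
}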
Hi,
I'm trying to run a simple test program to access Spark through Java. I'm
using JDK 1.8 and Spark 1.5. I'm getting an exception from the
JavaSparkContext constructor. My initialization code matches all the sample
code I've found online, so I'm not sure what I'm doing wrong.
Here is my code:
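The initialization follows the shape of the standard samples: build a
SparkConf with an app name and master, then pass it to the JavaSparkContext
constructor. A minimal sketch (the app name and local[*] master below are
placeholders, not the exact values in use):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkInitTest {
    public static void main(String[] args) {
        // Placeholder app name and master; the real values may differ.
        SparkConf conf = new SparkConf()
                .setAppName("SparkInitTest")
                .setMaster("local[*]");

        // The exception is thrown from this constructor call.
        JavaSparkContext sc = new JavaSparkContext(conf);

        System.out.println("Spark version: " + sc.version());
        sc.stop();
    }
}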
Hi,
I have a library of clustering algorithms that I'm trying to run in the
SparkR interactive shell. (I am working on a proof of concept for a document
classification tool.) Each algorithm takes a term-document matrix in the
form of a dataframe. When I pass the method a local dataframe, the
Also, just for completeness, matrix.csv contains:
1,2,3
4,5,6
7,8,9