Hi all,
I have calculated a covariance; it's a Matrix type. Now I want to save
the result to HDFS. How can I do it?
thx
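One hedged idea: a local MLlib Matrix exposes its values as a column-major double[] via toArray(), so you can format that array as CSV text and then write the text out with the Hadoop FileSystem API. Below is a minimal pure-Java sketch of the formatting step only; the raw array and the numRows/numCols arguments stand in for the MLlib Matrix accessors, so this is an illustration, not the MLlib API itself.

```java
public class MatrixCsv {
    // Format a column-major value array (as Matrix.toArray() would return)
    // into CSV rows. numRows/numCols mirror Matrix.numRows()/numCols().
    public static String toCsv(double[] colMajor, int numRows, int numCols) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < numRows; i++) {
            for (int j = 0; j < numCols; j++) {
                if (j > 0) sb.append(',');
                // column-major layout: element (i, j) lives at i + j * numRows
                sb.append(colMajor[i + j * numRows]);
            }
            sb.append('\n');
        }
        return sb.toString();
    }
}
```

The resulting string could then be written to HDFS, e.g. through a stream obtained from Hadoop's FileSystem (details depend on your cluster configuration).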
"Matrix cov = mat.computeCovariance();"
The cov is a local Matrix; if the Matrix is much more than 5 GB, is there
any better way to do it?
THX.
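No answer to this appears in the thread, but one hedged idea for a matrix too large to stringify in one go: emit it row by row through a Writer, so only one row of text is held in memory at a time. The sketch below uses plain arrays and a generic Writer standing in for an HDFS output stream; the names are illustrative, not Spark API.

```java
import java.io.IOException;
import java.io.Writer;

public class RowStreamer {
    // Write a column-major array as CSV, one row at a time, to any Writer
    // (e.g. an OutputStreamWriter wrapped around an HDFS output stream).
    public static void writeRows(double[] colMajor, int numRows, int numCols,
                                 Writer out) throws IOException {
        StringBuilder row = new StringBuilder();
        for (int i = 0; i < numRows; i++) {
            row.setLength(0); // reuse the buffer; only one row held in memory
            for (int j = 0; j < numCols; j++) {
                if (j > 0) row.append(',');
                row.append(colMajor[i + j * numRows]);
            }
            row.append('\n');
            out.write(row.toString());
        }
    }
}
```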
------------------ Original Message ------------------
From: "Yanbo Liang" <yblia...@gmail.com>
Date: Jan 25, 2016, 1:31
To:
…Matrix type.
------------------ Original Message ------------------
From: "Srivathsan Srinivas" <srivathsan.srini...@gmail.com>
Date: Jan 22, 2016, 1:12
To: "zhangjp" <592426...@qq.com>
Cc: "user" <user@spark.apache.org>
Subject: Re: retrieve cell v
You can use the apply(i, j) function.
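For context, a dense local Matrix stores its values column-major, so apply(i, j) amounts to indexing values[i + j * numRows]. A tiny stand-alone illustration with plain arrays (not the MLlib class itself):

```java
public class DenseIndex {
    // Mimics dense-matrix apply(i, j) over column-major storage.
    public static double apply(double[] values, int numRows, int i, int j) {
        return values[i + j * numRows];
    }
}
```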
Do you know how to save a Matrix to a file using Java?
------------------ Original Message ------------------
From: "Srivathsan Srinivas"
Date: Jan 21, 2016, 9:04
To: "user"
Subject:
Hi all,
I have got a Matrix result with Java, but I don't know how to save
the result to a file.
"Matrix cov = mat.computeCovariance();"
THX.
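Since computeCovariance() returns a local Matrix, its values fit in driver memory and can be written with plain Java I/O. A minimal sketch, with a hand-built 2-D array standing in for the Matrix's values (illustration only, not the MLlib API):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.StringJoiner;

public class SaveMatrix {
    // Write a small in-memory matrix (a plain 2-D array standing in for the
    // local Matrix returned by computeCovariance()) as CSV lines to a file.
    public static void save(double[][] rows, Path file) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (double[] row : rows) {
            StringJoiner line = new StringJoiner(",");
            for (double v : row) line.add(Double.toString(v));
            sb.append(line).append('\n');
        }
        Files.write(file, sb.toString().getBytes(StandardCharsets.UTF_8));
    }
}
```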
Hi all,
I want to use SparkR or Spark MLlib to load a CSV file on HDFS and then
calculate the covariance. How do I do it?
thks.
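For reference, the per-column-pair quantity computed here is the unbiased sample covariance, cov(x, y) = Σ (x_i − x̄)(y_i − ȳ) / (n − 1). A tiny pure-Java sketch of that formula (not the distributed MLlib implementation):

```java
public class Cov {
    // Unbiased sample covariance of two equal-length columns.
    public static double cov(double[] x, double[] y) {
        int n = x.length;
        double mx = 0.0, my = 0.0;
        for (int k = 0; k < n; k++) { mx += x[k]; my += y[k]; }
        mx /= n;
        my /= n;
        double s = 0.0;
        for (int k = 0; k < n; k++) s += (x[k] - mx) * (y[k] - my);
        return s / (n - 1); // divide by n-1 for the unbiased estimator
    }
}
```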
From: "Andy Davidson" <a...@santacruzintegration.com>
To: "zhangjp" <592426...@qq.com>, "Yanbo Liang" <yblia...@gmail.com>
Cc: "user" <user@spark.apache.org>
Subject: Re: how to use sparkR or spark MLlib load csv file on hdfs then cal
Hi all,
I'm using the Spark prebuilt version 1.5.2 + Hadoop 2.6, and my Hadoop version
is 2.6.2. When I use a Java JDBC client to execute SQL, there are some issues:
java.lang.RuntimeException: Could not load shims in class
org.apache.hadoop.hive.schshim.FairSchedulerShim
at
I have encountered the same issues. Before I changed the Spark version, I set
up the environment as follows:
spark 1.5.2
hadoop 2.6.2
hive 1.2.1
But no luck; it did not work well. Even when I ran the Hive assembly in Spark
in JDBC mode, there were also some problems.
Then I changed the Spark
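No resolution appears in the thread, but shim load failures like this usually point at a Spark/Hive version mismatch. One commonly suggested thing to check, given the version list above, is pinning Spark's metastore client to the Hive version actually deployed. A hedged sketch of spark-defaults.conf (illustrative values; verify these keys against the docs for your Spark release):

```properties
# spark-defaults.conf (illustrative; match your deployed Hive version)
spark.sql.hive.metastore.version  1.2.1
# where to find the Hive client jars: builtin, maven, or a classpath
spark.sql.hive.metastore.jars     builtin
```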
Hi all,
I downloaded the prebuilt version 1.5.2 with Hadoop 2.6. When I use spark-sql
there is no problem, but when I start the Thrift server and then want to query
a Hive table using JDBC, there are errors as follows:
Caused by: java.lang.ClassNotFoundException:
Hi,
Is there any Spark documentation on writing ORC files, like the Parquet
documentation?
http://spark.apache.org/docs/latest/sql-programming-guide.html#parquet-files
Thanks