Hi, Spark community. I have set up a 3-node cluster using Spark 0.9 and
Shark 0.9. My questions are:
1. Is it necessary to install Shark on every node, since it is only a
client used to access the Spark service?
2. When I run shark-withinfo, I get these warnings:
 WARN shark.SharkEnv: Hive Hadoop shims detected local mode, but Shark is
not running locally.
WARN shark.SharkEnv: Setting mapred.job.tracker to 'Spark_1394093746930'
(was 'local')
What are these warnings trying to tell us? Are they a problem for running
Shark?
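If it helps to diagnose, I can check what the property ends up as from
inside the Shark CLI with a plain SET statement (standard Hive syntax,
which Shark accepts); this is just how I would inspect it, not something
mentioned in the warning itself:

  -- prints the current value of mapred.job.tracker in this session
  SET mapred.job.tracker;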
3. I want to load data from HDFS, so I run "LOAD DATA INPATH
'/user/root/input/test.txt' INTO TABLE b;", but I get this error: "No files
matching path file:/user/root/input/test.txt", even though the file does
exist on HDFS. It looks like the path is being resolved against the local
filesystem (file:) instead of HDFS.
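In case it matters, the next thing I plan to try is a fully-qualified HDFS
URI instead of a bare path. This is only a sketch; the namenode host and
port below are placeholders standing in for my fs.defaultFS, not values I
have verified:

  -- 'namenode-host:8020' is a placeholder for my actual namenode address
  LOAD DATA INPATH 'hdfs://namenode-host:8020/user/root/input/test.txt' INTO TABLE b;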

Thanks.
