Hi,
I have Spark up and running; spark-shell works and submits jobs to the
Spark cluster.
I am using the following versions:
scala 2.9.3
spark 0.8.0
shark 0.7.0
hive 0.9.0
I have tested sample code on Spark and it runs fine, but if I run a
SELECT query from the Shark shell, I get the following error:
WARN cluster.ClusterScheduler: Initial job has not accepted any
resources; check your cluster UI to ensure that workers are registered
However, on the Spark UI I can see that the workers are up and running.
Basically, Shark doesn't seem to be communicating with the Spark cluster,
and I can't see why, especially since raw Spark works.
Is there any setting I am missing?
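For reference, my shark-env.sh is set up roughly like this (the paths below are placeholders for my actual install locations):

```shell
# shark-env.sh (relevant entries only; paths are placeholders)
export SCALA_HOME=/path/to/scala-2.9.3
export HIVE_HOME=/path/to/hive-0.9.0
export HADOOP_HOME=/path/to/hadoop
export SPARK_HOME=/path/to/spark-0.8.0
# Points Shark at the same standalone master the working spark-shell uses
export MASTER=spark://<master-host>:7077
```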
Any help appreciated...
Thanks,
Vinayak