Yup, sorry about that. This error message should not produce incorrect
behavior, but it is annoying. Posted a patch to fix it:
https://github.com/apache/spark/pull/361
Thanks for reporting it!
On Tue, Apr 8, 2014 at 9:54 AM, Koert Kuipers wrote:
> when i start spark-shell i now see
>
> ls: cannot access /usr/local/lib/spark/lib_managed/jars/: No such file or
> directory
>
> we do not package a lib_managed with our spark build (never did). maybe the
> logic in compute-classpath.sh that searches for datanucleus should check
> for the existence of lib_managed first
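[Editor's note: the guard suggested above could look roughly like the following. This is a sketch under assumptions, not the actual patch in the linked PR; the `SPARK_HOME` default and the `num_datanucleus_jars` variable name are illustrative.]

```shell
#!/bin/sh
# Sketch: only scan lib_managed/jars when the directory exists, so ls
# never prints "No such file or directory" on builds without lib_managed.
SPARK_HOME="${SPARK_HOME:-/usr/local/lib/spark}"
JARS_DIR="$SPARK_HOME/lib_managed/jars"

if [ -d "$JARS_DIR" ]; then
  # Count the datanucleus jars only if the directory is present.
  num_datanucleus_jars=$(ls "$JARS_DIR" 2>/dev/null | grep -c 'datanucleus-.*\.jar')
  echo "found $num_datanucleus_jars datanucleus jar(s)"
else
  num_datanucleus_jars=0
  echo "no lib_managed/jars directory; skipping datanucleus check"
fi
```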