Great!!!

When I built it on another disk formatted as ext4, it works now.

hadoop@ubuntu-1:~$ df -Th
Filesystem            Type      Size  Used Avail Use% Mounted on
/dev/sdb6             ext4      135G  8.6G  119G   7% /
udev                  devtmpfs  7.7G  4.0K  7.7G   1% /dev
tmpfs                 tmpfs     3.1G  316K  3.1G   1% /run
none                  tmpfs     5.0M     0  5.0M   0% /run/lock
none                  tmpfs     7.8G  4.0K  7.8G   1% /run/shm
/dev/sda1             ext4      112G  3.7G  103G   4% /faststore
/home/hadoop/.Private ecryptfs  135G  8.6G  119G   7% /home/hadoop

Thanks again, Marcelo Vanzin.


Francis.Hu

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: Saturday, April 05, 2014 1:13
To: user@spark.apache.org
Subject: Re: java.lang.NoClassDefFoundError:
scala/tools/nsc/transform/UnCurry$UnCurryTransformer...

Hi Francis,

This might be a long shot, but do you happen to have built Spark in an
encrypted home directory?

(I was running into the same error when I was doing that. Rebuilding
on an unencrypted disk fixed the issue. This is a known issue /
limitation with ecryptfs. Oddly, the build doesn't fail, but you do
get warnings about the long file names.)
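For reference, the underlying limitation is that ecryptfs with filename encryption enabled only supports file names up to roughly 143 characters, while the `.class` files scalac emits for deeply nested anonymous classes can be much longer. A minimal sketch, using the class name from the stack trace below (the exact 143-character figure is the commonly documented ecryptfs limit, not something verified on this system):

```shell
# ecryptfs with filename encryption caps file names at ~143 characters.
# The .class file name for the nested anonymous class in the error is
# far longer than that, so it cannot be written to the encrypted dir.
name='UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5$$anonfun$scala$tools$nsc$transform$UnCurry$UnCurryTransformer$$anonfun$$anonfun$$transformInConstructor$1$1.class'
echo "${#name}"   # prints the length, well over the ~143-char limit
```

That is why the compiled class is silently missing from the assembly jar and only surfaces later as a NoClassDefFoundError at runtime.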


On Wed, Apr 2, 2014 at 3:26 AM, Francis.Hu <francis...@reachjunction.com> wrote:
> I'm stuck on a NoClassDefFoundError. Any help would be appreciated.
>
> I downloaded the Spark 0.9.0 source, and then ran this command to build it:
> SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

>
> java.lang.NoClassDefFoundError:
> scala/tools/nsc/transform/UnCurry$UnCurryTransformer$$anonfun$14$$anonfun$apply$5$$anonfun$scala$tools$nsc$transform$UnCurry$UnCurryTransformer$$anonfun$$anonfun$$transformInConstructor$1$1

-- 
Marcelo
