The trunk should work just fine. I think in your case the download is failing for Hadoop or for Mahout.
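One quick way to test that hypothesis is to check whether the tarballs are actually reachable from the machine you launch from (and ideally from a node in the same region). The sketch below is just an example -- the URLs are placeholders, so substitute whatever tarball URLs your cluster properties actually point at:

    # Rough check of the download hypothesis; URLs below are examples only,
    # replace them with the Hadoop/Mahout tarball URLs from your properties file.
    HADOOP_URL=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.2/hadoop-0.20.2.tar.gz
    MAHOUT_URL=http://archive.apache.org/dist/mahout/0.5/mahout-distribution-0.5.tar.gz

    for url in "$HADOOP_URL" "$MAHOUT_URL"; do
      # -I sends a HEAD request; -f makes curl fail on HTTP errors
      if curl -sfI "$url" > /dev/null; then
        echo "OK:   $url"
      else
        echo "FAIL: $url"
      fi
    done

If either download fails on the node, that would explain the symptoms you pasted: the install script never unpacks Hadoop or creates the hadoop user, so the configure phase then fails with "invalid user: hadoop:hadoop" and the missing /usr/local/hadoop/conf directory.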
On Fri, Feb 17, 2012 at 6:33 PM, Frank Scholten <[email protected]> wrote:
> Hi all,
>
> I am having trouble starting a Hadoop / Mahout cluster with Whirr
> trunk, commit 44fb39fc8.
>
> Several errors are reported. The first one is:
>
> Bootstrapping cluster
> Configuring template
> Starting 1 node(s) with roles [hadoop-jobtracker, hadoop-namenode,
> mahout-client]
> Configuring template
> Starting 4 node(s) with roles [hadoop-datanode, hadoop-tasktracker]
> Dying because - net.schmizz.sshj.transport.TransportException: Broken
> transport; encountered EOF
> Dying because - net.schmizz.sshj.transport.TransportException: Broken
> transport; encountered EOF
> <<
> (ubuntu:rsa[fingerprint(af:e3:53:27:e0:12:18:54:1c:fc:3b:24:b9:18:39:10),sha1(83:6a:70:2f:c2:d5:3d:e0:05:7a:4a:e5:1a:51:67:dc:2b:56:62:18)]@
> 50.17.130.132:22)
> error acquiring SSHClient(timeout=60000) (attempt 1 of 7): Socket
> closed
>
> This repeats several times until I get a stacktrace
>
> call get() on this exception to get access to the task in progress
>     at
> org.jclouds.compute.callables.BlockUntilInitScriptStatusIsZeroThenReturnOutput.get(BlockUntilInitScriptStatusIsZeroThenReturnOutput.java:195)
>     at
> org.jclouds.compute.callables.RunScriptOnNodeAsInitScriptUsingSshAndBlockUntilComplete.doCall(RunScriptOnNodeAsInitScriptUsingSshAndBlockUntilComplete.java:60)
>     ... 8 more
>
> which is also repeated for several roles
>
> and at the end I get
>
> Successfully executed configure script: [output=, error=chown: invalid
> user: `hadoop:hadoop'
> cp: target `/usr/local/hadoop/conf' is not a directory
> cp: cannot create regular file `/usr/local/hadoop/conf': No such file
> or directory
> chown: invalid user: `hadoop:hadoop'
> chown: invalid user: `hadoop:hadoop'
> chown: invalid user: `hadoop:hadoop'
> Unknown id: hadoop
> Unknown id: hadoop
> , exitCode=0]
>
> for several roles.
>
> Has something changed recently that caused this problem?
>
> Cheers,
>
> Frank
>
