I found updated materials (ampcamp4 from Feb 2014) and tried the
instructions at
http://ampcamp.berkeley.edu/4/exercises/launching-a-bdas-cluster-on-ec2.html .
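
If it helps anyone reproduce: the launch step on that page follows the
usual spark-ec2 pattern, roughly along these lines (the key pair name,
identity file, slave count, and cluster name below are placeholders, not
the exact values I used):

  # placeholders: substitute your own key pair, identity file, and cluster name
  ./spark-ec2 -k my-keypair -i ~/my-keypair.pem -s 3 launch amplab-training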

We see some similar errors even with this latest version:

cp: cannot create regular file `/root/mesos-ec2/': Is a directory


RSYNC'ing /root/mesos-ec2 to slaves...
ec2-54-204-51-194.compute-1.amazonaws.com
rsync: link_stat "/root/mesos-ec2" failed: No such file or directory (2)
rsync error: some files/attrs were not transferred (see previous errors)
(code 23) at main.c(1039) [sender=3.0.6]
ec2-54-82-3-108.compute-1.amazonaws.com
rsync: link_stat "/root/mesos-ec2" failed: No such file or directory (2)


But in this latest version the mesos errors appear not to be fatal: the
cluster is in the process of coming up (it is copying the wikipedia data
now).
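
On the earlier failure at the end of the quoted traceback below: judging
from the stack trace, check_spark_cluster in spark_ec2.py is just polling
the master's web UI with urllib2, so the "Connection refused" means the
standalone master never came up. A quick manual check of the same thing
(the hostname is a placeholder; 8080 is the default standalone master web
UI port):

  # placeholder hostname; substitute the master's public DNS name
  curl -s http://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:8080 > /dev/null \
    && echo "master web UI is up" \
    || echo "master web UI not reachable yet"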



2014-03-08 6:26 GMT-08:00 Stephen Boesch <java...@gmail.com>:

> The spark-training scripts are not presently working 100%: the errors
> displayed when starting the slaves are shown below.
>
> Possibly a newer location for the files exists (I pulled from
> https://github.com/amplab/training-scripts and it is nearly 6 months old).
>
>
> cp: cannot create regular file `/root/mesos-ec2/': Is a directory
>
>
> cp: cannot create regular file `/root/mesos-ec2/cluster-url': No such file
> or directory
> rsync: link_stat "/root/mesos-ec2" failed: No such file or directory (2)
> rsync error: some files/attrs were not transferred (see previous errors)
> (code 23) at main.c(1039) [sender=3.0.6]
> ec2-54-224-242-28.compute-1.amazonaws.com
> rsync: link_stat "/root/mesos-ec2" failed: No such file or directory (2)
> rsync error: some files/attrs were not transferred (see previous errors)
> (code 23) at main.c(1039) [sender=3.0.6]
> ec2-54-221-177-212.compute-1.amazonaws.com
> ..
> main.c(1039) [sender=3.0.6]
> ./spark-standalone/setup.sh: line 18: /root/spark/bin/stop-all.sh: No such
> file or directory
> ./spark-standalone/setup.sh: line 23: /root/spark/bin/start-master.sh: No
> such file or directory
> ./spark-standalone/setup.sh: line 29: /root/spark/bin/start-slaves.sh: No
> such file or directory
>
> ~ ~/spark-ec2
> ./training/setup.sh: line 36: pushd: /root/MLI: No such file or directory
> fatal: Not a git repository (or any of the parent directories): .git
> fatal: Not a git repository (or any of the parent directories): .git
> fatal: Not a git repository (or any of the parent directories): .git
> ./training/setup.sh: line 41: ./sbt/sbt: No such file or directory
> RSYNC'ing /root/MLI to slaves...
> ec2-50-19-61-228.compute-1.amazonaws.com
> rsync: link_stat "/root/MLI" failed: No such file or directory (2)
> rsync error: some files/attrs were not transferred (see previous errors)
> (code 23) at main.c(1039) [sender=3.0.6]
> ec2-54-224-242-28.compute-1.amazonaws.com
> rsync: link_stat "/root/MLI" failed: No such file or directory (2)
> rsync error: some files/attrs were not transferred (see previous errors)
> (code 23) at main.c(1039) [sender=3.0.6]
> ..
> main.c(1039) [sender=3.0.6]
> ~/spark-ec2
> ./training/setup.sh: line 46: pushd: /root/hive_blinkdb: No such file or
> directory
> Already up-to-date.
> Buildfile: build.xml does not exist!
> Build failed
> RSYNC'ing /root/hive_blinkdb to slaves...
> ec2-50-19-61-228.compute-1.amazonaws.com
> rsync: link_stat "/root/hive_blinkdb" failed: No such file or directory (2)
> rsync error: some files/attrs were not transferred (see previous errors)
> (code 23) at main.c(1039) [sender=3.0.6]
> ec2-54-224-242-28.compute-1.amazonaws.com
> ..
> main.c(1039) [sender=3.0.6]
> ./training/setup.sh: line 50: popd: directory stack empty
> ~/blinkdb ~/spark-ec2
>
> Waiting for cluster to start...
> Traceback (most recent call last):
>   File "./spark_ec2.py", line 915, in <module>
>     main()
>   File "./spark_ec2.py", line 758, in main
>     err = wait_for_spark_cluster(master_nodes, opts)
>   File "./spark_ec2.py", line 724, in wait_for_spark_cluster
>     err = check_spark_cluster(master_nodes, opts)
>   File "./spark_ec2.py", line 453, in check_spark_cluster
>     response = urllib2.urlopen(url)
>   File
> "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py",
> line 127, in urlopen
>     return _opener.open(url, data, timeout)
>   File
> "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py",
> line 404, in open
>     response = self._open(req, data)
>   File
> "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py",
> line 422, in _open
>     '_open', req)
>   File
> "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py",
> line 382, in _call_chain
>     result = func(*args)
>   File
> "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py",
> line 1214, in http_open
>     return self.do_open(httplib.HTTPConnection, req)
>   File
> "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py",
> line 1184, in do_open
>     raise URLError(err)
> urllib2.URLError: <urlopen error [Errno 61] Connection refused>
>
