I'm a rank beginner with clusters, but am determined to move into them, starting with Hadoop. For starters I have an Ubuntu machine under VMware on my MacBook Pro (got it on a DVD when visiting the Googleplex).
Now I've just received an Apple Xserve and four Mac Minis to set up a cluster. Though I know little, I did see a suggestion somewhere that I set up a small Mac OS X partition on each, under VMware Fusion, for general admin via the Apple server, Mac firmware updates, monitoring, etc. Then I'd set up a larger partition under VMware on each of the five machines with Linux images to do the heavy lifting.

I'm looking for any and all suggestions from anyone. You may well be able to skip all sorts of details and just point me to answers, examples, people, search terms, etc., and then I can start doing my homework. I'm happy to experiment -- setting something up, stripping it clean, trying another approach -- as part of learning what works. Looking forward to the standard mix of confusion, frustration, success, failure, elation, and lots of messy work.

Oh, and what do I do? I want to map image analysis and text analysis work onto the nodes; I analyze, and extract information from, biomedical research papers.

 - Bob

Robert P. Futrelle
Associate Professor
Biological Knowledge Laboratory
College of Computer and Information Science
Northeastern University MS WVH202
360 Huntington Ave.
Boston, MA 02115
Office: 617-373-4239
Fax: 617-373-5121
http://www.ccs.neu.edu/home/futrelle
http://www.bionlp.org
http://www.diagrams.org
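
P.S. In case it helps anyone aim me in the right direction, here's the sort of thing I imagine running on the Linux nodes once Hadoop is up: a toy Hadoop Streaming job that counts terms across a pile of paper text. It's only a sketch based on the Streaming docs; the file names (mapper.py, reducer.py) and paths are placeholders I made up, and I haven't actually run any of it yet.

    #!/usr/bin/env python
    # mapper.py -- emits one "term<TAB>1" line per whitespace-separated token
    # read from standard input (Hadoop Streaming feeds each input split to the
    # mapper via stdin).
    import sys

    for line in sys.stdin:
        for term in line.strip().split():
            print("%s\t1" % term)

    #!/usr/bin/env python
    # reducer.py -- sums the counts for each term. Streaming sorts the mapper
    # output by key before the reduce phase, so equal terms arrive on
    # consecutive lines.
    import sys

    current_term, count = None, 0
    for line in sys.stdin:
        term, value = line.rstrip("\n").split("\t", 1)
        if term == current_term:
            count += int(value)
        else:
            if current_term is not None:
                print("%s\t%d" % (current_term, count))
            current_term, count = term, int(value)
    if current_term is not None:
        print("%s\t%d" % (current_term, count))

I gather it would be launched with something along these lines (the streaming jar path varies by Hadoop version):

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-*-streaming.jar \
        -input papers -output term_counts \
        -mapper mapper.py -reducer reducer.py \
        -file mapper.py -file reducer.py

with the toy mapper eventually replaced by whatever image- or text-analysis code I end up writing.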
