That is a very small amount of memory for running Hadoop plus user programs. You might consider running your tests on a cloud provider like Amazon, which will give you access to decent-sized machines for a relatively small cost.
On Sun, Sep 15, 2013 at 11:27 AM, Mahmoud Al-Ewiwi <[email protected]> wrote:
> Thanks to all. I've tried to use some of these sandboxes, but unfortunately
> most of them require a high amount of memory (3 GB) for the guest machine, and I
> have only 3 GB on my machine (an old machine), so I'm going to go along with
> the normal installation (I have no choice).
>
> Thanks
>
>
> On Sun, Sep 15, 2013 at 9:13 AM, Roman Shaposhnik <[email protected]> wrote:
>
> > On Sat, Sep 14, 2013 at 10:54 AM, Mahmoud Al-Ewiwi <[email protected]>
> > wrote:
> > > Hello,
> > >
> > > I'm new to Hadoop and I want to learn it in order to do a project.
> > > I've started reading the documentation at this site:
> > >
> > > http://hadoop.apache.org/docs/r2.1.0-beta/hadoop-project-dist/hadoop-common/SingleCluster.html
> > >
> > > for setting up a single node, but I could not figure out a lot of things
> > > in that documentation.
> >
> > For a first-timer like yourself, perhaps using a Hadoop distribution
> > would be the best way to get started. Bigtop offers a 100% community-driven
> > distro, but there are, of course, vendor choices as well.
> >
> > Here's the info on Bigtop:
> >
> > https://cwiki.apache.org/confluence/display/BIGTOP/How+to+install+Hadoop+distribution+from+Bigtop+0.6.0
> >
> > Thanks,
> > Roman.
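If you do go with a normal installation on the 3 GB machine, one workaround (my own suggestion, not something covered above, and the numbers are guesses rather than tested figures) is to shrink the heap each Hadoop daemon gets via hadoop-env.sh, so the daemons plus a small MapReduce job can coexist in limited RAM:

```shell
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh (Hadoop 2.x layout; 1.x uses conf/).
# HADOOP_HEAPSIZE is the maximum JVM heap, in MB, for each Hadoop daemon
# (NameNode, DataNode, etc.); the default is 1000. 256 MB is a guess at what
# might fit a 3 GB machine for learning purposes, not a recommended value.
export HADOOP_HEAPSIZE=256

# MapReduce task JVMs are sized separately, via mapred-site.xml, e.g.:
#   <property>
#     <name>mapred.child.java.opts</name>
#     <value>-Xmx200m</value>
#   </property>
```

With settings like these a single-node pseudo-distributed setup may fit in a few hundred megabytes per daemon, at the cost of jobs failing on anything but small inputs.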
