This issue is now fixed for me. I'm leaving the solution below for others.
=====
Step A:
Comment out the "Install shellcheck" section (the part that installs Cabal) in
the file below:
/dev-support/docker/Dockerfile
With that section disabled, the Hadoop env setup completed successfully, though
it complained about low memory (2 GB minimum required).
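For reference, the section I commented out looks roughly like this (quoting
from memory; the exact lines may differ between Hadoop versions, so check your
own checkout of /dev-support/docker/Dockerfile):

```dockerfile
####
# Install shellcheck (via Haskell's cabal -- this is the memory-hungry part)
####
# RUN apt-get -q install -y cabal-install
# RUN cabal update && cabal install --global shellcheck
```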
Step B:
I then uncommented the lines above and instead increased the memory allocated
to the VM via the Vagrant config:
1. Add the following to the Vagrantfile:
config.vm.provider "virtualbox" do |vb|
# Customize the amount of memory on the VM: 3 GB
vb.memory = "3072"
end
2. ~$ vagrant reload
3. ~$ vagrant ssh
4. In the hadoop folder inside the Ubuntu VM, run:
~$ ./start-build-env.sh
This should now create the Hadoop build environment successfully without any
errors; the overhead is ~1 GB of memory, leaving ~2 GB free.
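If you want to confirm the new allocation took effect before re-running the
build, a quick check inside the VM after `vagrant reload` (assuming a standard
Linux guest with procps installed) is:

```shell
#!/bin/sh
# Print the VM's total memory in MB. After bumping vb.memory to "3072"
# and reloading, this should report roughly 3072 MB (slightly less is
# normal, since the kernel reserves some).
total_mb=$(free -m | awk '/^Mem:/ {print $2}')
echo "Total memory: ${total_mb} MB"
```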
Thanks,
Arun
On Wed, Mar 1, 2017 at 10:04 PM, Arun S. Kumar <[email protected]> wrote:
> Hi,
>
> I am trying to set up Hadoop locally by following
> https://wiki.apache.org/hadoop/HowToContribute and
> https://git-wip-us.apache.org/repos/asf?p=hadoop.git;a=blob;f=BUILDING.txt.
>
> I am on macOS Sierra (10.12.3), 8 GB RAM, 2.2 GHz Intel Core i7.
> I installed ubuntu-trusty-64 via Vagrant with the VirtualBox provider.
>
> When I SSH'ed into the Ubuntu VM and tried running start-build-env.sh, it
> failed at:
>
> Step 21/33 : RUN cabal update
>
> Killed
>
> The command '/bin/sh -c cabal update' returned a non-zero code: 137
> I googled around and found that one possible reason could be that the
> container doesn't have enough memory during the build. I tried a suggestion
> to restart the VM, but then got the different error below.
>
> Step 21/33 : RUN cabal update
>
> cabal: out of memory (requested 1048576 bytes)
>
> The command '/bin/sh -c cabal update' returned a non-zero code: 1
> Is Cabal really needed, or is it optional? Any help is much appreciated.
>
> TIA.
>
> ---------------------------------
> Regards,
> Arun S. Kumar
>
>
--
---------------------------------
Regards,
Arun S. Kumar