I hear what you say: it's certainly not normal for a piece of software
to require that much memory to compile. In the end, though, it comes
down to the fact that this is a complex scientific application, orders
of magnitude removed in scale from "normal" software, WebKit included.
There is really no comparison with ordinary desktop software; it would
typically be handling datasets of 100 GB to 10 TB. It's a different
beast altogether. Because of that, nobody would want to run this
application in a 32-bit environment or on ARM devices, so it hardly
matters if it can't be compiled for those targets; they simply aren't
relevant here.

As I understand it from the maintainer, the 4 GB memory limit is
generic to buildd, whether it runs on Debian or Ubuntu, so this issue
is going to keep biting until the limit is raised. This bug seems as
good a place as any to address it.
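
For what it's worth, the failure is easy enough to reproduce outside
buildd by running the compile step under an address-space cap. A
minimal Python sketch follows; the compiler command line is only a
placeholder, and whether buildd enforces its cap via RLIMIT_AS is an
assumption on my part:

    import resource
    import subprocess

    FOUR_GIB = 4 * 1024 ** 3  # bytes

    def cap_address_space():
        # Limit the child's virtual address space to 4 GiB, roughly
        # mimicking the ceiling the build machines impose.
        resource.setrlimit(resource.RLIMIT_AS, (FOUR_GIB, FOUR_GIB))

    # Placeholder: substitute the actual failing shogun-octave
    # compile invocation here.
    cmd = ["g++", "-O2", "-c", "heavy_translation_unit.cpp"]
    result = subprocess.run(cmd, preexec_fn=cap_address_space)
    print("compiler exit status:", result.returncode)

Under that cap the compiler should fail with an out-of-memory error,
much as it does on the builders.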

What's the real issue with a moderate increase from 4 GB to 8 GB? I'm
still assuming the limit is actually hardcoded somewhere in buildd,
though of course it might not be. The builder hardware has limited
memory and hosts multiple virtual builders, but in practice very few
packages, perhaps only this one, are going to use that much memory
during a compilation, so the change should have little impact on the
builder servers.
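
If it helps to put a number on "that much memory", the peak usage of
the build can be measured directly. A rough sketch, again with a
placeholder command standing in for the real build step (on Linux,
ru_maxrss is reported in kilobytes):

    import resource
    import subprocess

    # Placeholder: substitute the actual shogun-octave build command.
    cmd = ["g++", "-O2", "-c", "heavy_translation_unit.cpp"]
    subprocess.run(cmd, check=False)

    # ru_maxrss covers all waited-for children; kilobytes on Linux.
    usage = resource.getrusage(resource.RUSAGE_CHILDREN)
    print("peak RSS: %.1f GiB" % (usage.ru_maxrss / 1024 ** 2))

That would show how close the package actually comes to the 4 GB
ceiling, and how much headroom 8 GB would leave.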

-- 
You received this bug notification because you are a member of Ubuntu
Bugs, which is subscribed to Ubuntu.
https://bugs.launchpad.net/bugs/1090819

Title:
  libshogun-dev upgrade impossible - shogun-octave missing due to 4GB
  out-of-memory compilation error

To manage notifications about this bug go to:
https://bugs.launchpad.net/launchpad-buildd/+bug/1090819/+subscriptions

