On Thursday, 30 July 2015 at 22:39:38 UTC, Márcio Martins wrote:
On Thursday, 30 July 2015 at 21:27:09 UTC, deadalnix wrote:
On Thursday, 30 July 2015 at 15:10:59 UTC, Brandon Ragland wrote:
It's a dog because Java is a dog. But that's not because of
the GC.
It's not really that bad either; I can open up Minecraft at
any time and have it sit quietly in the background using
~800 MB of RAM and virtually no CPU time.
Either your kid has tons of mods in their Minecraft or your
computer is a bit dated.
Now compare that kind of resource consumption with any game
based on the Quake III engine. While you're at it, compare the
graphics. Still not convinced? Measure the latencies, which are
critical for most games.
Heh, what about the original Quake or Unreal? Both ran quite
well on a Pentium 100 with 16 MB of RAM. Not sure if my memory
serves me right, but I suspect the number of visible triangles
per scene in Quake 1 is comparable to Minecraft's, except it
was released almost 20 years ago... I think I was even able to
run it on my 486 with 8 MB of RAM, though it was more of a
slideshow than a game, even at the minimum viewport size, but
still!
Are you kidding me? The visible triangle count in Quake 1 was
under 2,000. At Minecraft's "far" render distance it tops
200,000.
Please don't make such a statement without checking. Each
Minecraft block is a cube of 12 triangles (6 faces x 2).
Multiply that by however many blocks are in view and you'll
see the number skyrocket.
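
For a rough sense of scale, here's a quick back-of-the-envelope
sketch in D. The render distance and the surface-block density
are my own guesses, not measured figures:

import std.stdio;

void main()
{
    // All numbers below are rough assumptions, not measurements.
    enum renderDistanceChunks = 16;                        // assumed "far" render distance
    enum chunksVisible = (2 * renderDistanceChunks) ^^ 2;  // 32x32 chunk grid around the player
    enum surfaceBlocksPerChunk = 16 * 16;                  // assume ~1 exposed block per column
    enum trianglesPerBlock = 12;                           // a cube: 6 faces x 2 triangles

    auto triangles = chunksVisible * surfaceBlocksPerChunk * trianglesPerBlock;
    writefln("rough visible-triangle estimate: %s", triangles);  // ~3.1 million
}

Even if face culling and occlusion throw away most of that,
you still clear the 200,000 figure above with room to spare.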