There is no simple answer to that question.  Some important factors
that spring to mind are:
1) What is the architecture? -- x86, SPARC, etc...
2) What is the operating system?
3) What Java runtime is being used?
4) What C compiler is it being compared against?
5) How long does the job run?
6) What are the memory constraints?
7) How much (if any) floating point work is being done?
8) What sort of IO is being done?
9) How good are you with a profiler?
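
To see why factor 5 in particular matters, here's a minimal sketch (names and workload are my own invention, not from any real benchmark suite) of how JIT warm-up can skew a short-running comparison: a cold first call pays interpretation and compilation costs that a long-running Hadoop-style job would amortize away.

```java
// Hypothetical micro-benchmark sketch. Illustrates why job length matters
// when comparing Java to C: the JVM's JIT needs warm-up before the hot
// loop reaches steady-state speed. Absolute timings vary by JVM and host.
public class WarmupDemo {
    // A small numeric workload the JIT can optimize once it runs hot.
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        // Cold run: includes interpretation and JIT compilation overhead.
        long t0 = System.nanoTime();
        long cold = sumOfSquares(10_000_000);
        long coldNs = System.nanoTime() - t0;

        // Warm the JIT by running the same workload repeatedly.
        for (int i = 0; i < 20; i++) {
            sumOfSquares(10_000_000);
        }

        // Warm run: typically much faster per call than the cold one.
        long t1 = System.nanoTime();
        long warm = sumOfSquares(10_000_000);
        long warmNs = System.nanoTime() - t1;

        System.out.println("cold ns: " + coldNs + ", warm ns: " + warmNs);
        // Sanity check: both runs compute the same result.
        if (cold != warm) throw new AssertionError("results differ");
    }
}
```

A C program pays its compilation cost once, ahead of time, so a benchmark that only measures the cold run stacks the deck against Java; one that measures only the steady state hides Java's startup cost.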

I've probably missed some important ones.  Also, what do you mean by
systems programming?  That phrase says kernels and device drivers to
me, but I'm sure that's not what you mean.

With a sufficiently stacked deck, either Java or C would handily
trounce the other.  I suspect that the average Hadoop job gives a
slight edge to Java, but I have nothing to back that up.
