On Thursday, 22 June 2017 at 07:32:51 UTC, Wulfklaue wrote:
On Thursday, 22 June 2017 at 07:15:26 UTC, Bienlein wrote:
In Java development there is almost no C or C++ and no Rust or
D at all. Memory is no problem. Some server needs 256 GB RAM
or maybe 512 GB?
That is just sloppy... There is this bad trend in the industry;
it has been going on for years. Performance issue? Throw more
hardware at the problem. Optimizations? Throw more hardware at
the issue.
The problem is that it has become a constantly lowering
expectations bar. Hire some basic programmers, use
PHP/Ruby/Python, and performance issues are simply overlooked as
part of the job.
In my daily job, seeing PHP import scripts that suck up a GB
just for some basic work is considered almost normal. Let the
client pay for better-performing servers. Our developers need to
write more code, faster, and be ready before the deadline so we
can bill the client for the fast work.
That's not an issue anywhere. As long as you get the
performance through parallelisation there is no need for C or
C++.
And while this works on a local server, the moment you start
with clusters, master-slave configurations, etc., things get
complicated fast, especially with latency issues.
You won't meet any Java EE architecture that will do anything
other than fight against calling C or C++ routines from Java.
That is only done in some very exceptional cases.
The same applies to just about every other language. Native
code will always be prioritized over external calls.
The days of languages for systems programming are over. There
are only very few people that need such a language. That is
why D really needs a decent GC, otherwise it won't find any
users that would do production systems with it.
Technically, with Go, Rust, Crystal, etc., there is more focus
on those high-performing languages than before. Before, it was
all about scripting languages; slowly you see a trend backing
away from them.
Massive relative price shocks can be important in shaping trends
in the use of technology. Storage prices drop 40% annually,
Moore's Law isn't what it was, and DRAM latency isn't improving
nearly as fast as data sets are increasing in size. Intel's
non-volatile storage technology upsets all our assumptions about
where the bottlenecks are, and there seems a decent chance that,
as you say, the tilt away from slow languages is only just
beginning.
https://www.quora.com/Why-is-Python-so-popular-despite-being-so-slow/answer/Laeeth-Isharc?share=5ebdf91a&srid=35gE
ACM paper linked at the end of this.