On Thursday, 1 May 2014 at 06:04:57 UTC, Russel Winder via Digitalmars-d wrote:
On Wed, 2014-04-30 at 15:21 -0400, Nick Sabalausky via Digitalmars-d wrote:
[…]
I've heard this a lot, but I've yet to hear anyone explain concretely how this "dynamic mindset" keeps the lack of things like static code checks and low-overhead primitives from being actual drawbacks. Maybe it really is a case of me not "getting it", but it always sounds to me like this "dynamic mindset" gets around these issues simply by ignoring them. Since I don't personally do heavy development in dynamic languages, I'd be interested in a strong rebuttal to this.
The best way of approaching this is not to attempt a long theoretical essay, but to try a (possibly lengthy) exchange of question and answer, accompanied by actually writing code in both Python and D. Also worth looking at are things like Rosettacode and all of Bearophile's and others' work there.
There are two questions above to kick-start things; let's take the second first:
Low-overhead primitives: this immediately implies you are looking for raw CPU-bound performance, or you are undertaking premature optimization. Python has no direct "play" in this game. If Python code requires out-and-out CPU-bound performance, you profile to find the small bit of the total code that is performance critical. If it is not already a function, you extract it as a function, put it in its own module and Cythonize it (*). It is still Python code, but with a few judicious extra annotations that enable Cython to generate C, which is then compiled to native code.
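For concreteness, the Cythonized hot spot roughly ends up looking like this (a minimal sketch; the file name, the function and the dot-product task are invented purely for illustration):

# hotspot.pyx -- ordinary Python code plus a few static type declarations
# (typed memoryview parameters, cdef locals) so that Cython can generate C.
def dot(double[:] xs, double[:] ys):
    cdef double total = 0.0
    cdef Py_ssize_t i
    for i in range(xs.shape[0]):
        total += xs[i] * ys[i]
    return total

Built through a small setup.py that calls Cython.Build.cythonize, the resulting extension module is then imported from Python like any other module.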
In my experience the problems with this approach are as follows:

1. The workflow: you write Python code, then profile it, extract the bottlenecks and then compile them to native code (see the profiling sketch below). That is a lot of overhead.

2. It can soon become messy when the code is changed, and a mess of different technologies accumulates over the years (SWIG, Cython and whatnot).
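A minimal sketch of the profiling step in point 1, using the standard library's cProfile and pstats (main() and the output file name are hypothetical):

import cProfile
import pstats

# Run the program under the profiler and dump the statistics to a file.
cProfile.run("main()", "profile.out")
# The hot function shows up near the top of the cumulative-time listing
# and becomes the candidate for extraction and Cythonizing.
pstats.Stats("profile.out").sort_stats("cumulative").print_stats(10)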
In my opinion it's cleaner to write a project in D (or C/C++): code > native, in one go. I always feel uneasy when I have to bring in third-party software (i.e. a black box) to boost the performance of my code. I accept that if you already have a substantial code base in Python, technologies like Cython are the way to go. But when I start a new project, why on earth should I go down that rocky path again?
Static code checks: the object (aka data) model and the variable model of Python (and indeed other dynamic languages) mean there cannot be any static code checks. At this point, I have to yield (in this co-routine exchange :-) with a question: what is it about static code checks that you feel you have to have in order to be able to program?
(*) There are other techniques, but this is the lowest overhead in terms of programmer time, and it gives excellent runtime speed-up.
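For what it's worth, a tiny Python sketch of the kind of error a static check rejects at compile time but a dynamic language only reports at runtime (the class and function are hypothetical):

class Rect:
    def __init__(self, width, height):
        self.width = width
        self.height = height

def area(rect):
    # Misspelled attribute: Python raises AttributeError only when this
    # line actually executes; a statically checked language such as D
    # would refuse to compile the equivalent code.
    return rect.width * rect.hieght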