On Mon, 21 Jan 2013 11:22:19 -0800 "H. S. Teoh" <[email protected]> wrote:
> On Mon, Jan 21, 2013 at 01:47:44PM -0500, Nick Sabalausky wrote:
> > On Mon, 21 Jan 2013 13:27:48 -0500
> > Nick Sabalausky <[email protected]> wrote:
> [...]
> > > You should read Ted Dziuba's "Node.js is Cancer", he explains it
> > > better than I can: (He's destroyed all his good older posts, but
> > > I managed to save this one from a cache:)
> > > https://semitwist.com/mirror/node-js-is-cancer.html
> >
> > Also, FWIW, while most of that sounds like it would also apply to
> > Vibe.d, it's drastically mitigated in Vibe.d's case because:
> >
> > A. Unlike Node.js, Vibe.d I/O will automatically trigger a fiber
> > switch to a different request while the I/O, purely behind the
> > scenes, completes asynchronously.
> >
> > B. Vibe.d uses a real language instead of requiring your server to
> > be written in a toy language that thinks it's ok to force every
> > f*@^ing variable in a "scalable" program to be an associative array
> > of Variants, whether they need to be or not.
> [...]
>
> Yeah, I remember when I was young and foolish, and believed for a
> short moment the hype about "dynamic typing" languages.

Yup. It's *inherently* slow and resource-intensive to execute no matter
how well optimized[1], and (contrary to popular belief) it's ALSO slow
to write, since you spend half the time debugging the "bugs" that a
grown-up language would have caught and pointed out instantly[2]. And
it's inherently bug-prone.

[1] https://semitwist.com/articles/article/view/quake-shows-javascript-is-slow-not-fast
[2] https://semitwist.com/articles/article/view/why-i-hate-python-or-any-dynamic-language-really

Granted, there are many dynamic coders who *don't* spend half their
time debugging statically-checkable things, but I guarantee those are
the same people writing all those Python scripts that blow up
immediately with a Traceback (i.e., literally at least half of the
Python-written software I've ever tried to use).

So it's slower and more bloated to execute, slower to write, and
bug-riddled. There's literally no benefit. The *claim* is that the
benefits of dynamic are:

A. Faster development: But this is a load of crap, because the only
non-superhuman way to prevent it from being riddled with bugs is to
(remember to) manually perform, or otherwise reimplement, the work
that a real compiler would have done automatically.

B. Easier for non-programmers to use: But non-programmers should NEVER
be writing production code, PERIOD. That's like a car manufacturer
having their accountants and graphic designers design the mechanical
parts: Yea, great f&^@ing idea...

C. More powerful features, more safety, and simpler code: D proves this
is bullshit. Heck, damn near any static language that isn't C++ or Java
proves this is bullshit. But people think "non-dynamic" and they think
"C++ and Java" - the two worst possible examples.

I can't even tell you how many *programmers* I've come across who
actually believe runtime reflection and hotspot optimizations are
theoretically impossible without a VM.
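For the record, here's the sort of thing I mean - a quick throwaway
sketch (the Config type and dumpFields are made up on the spot, not
from anything real). The compiler walks the members of a type and
generates the code itself, at *compile* time, no VM anywhere in sight:

import std.stdio;

// Made-up example type, just to have something to poke at.
struct Config
{
    string host;
    ushort port;
    bool   verbose;
}

// "Reflection" without a VM: the compiler unrolls this loop over the
// members of T at compile time. The only runtime work left is the
// actual printing.
void dumpFields(T)(T value)
{
    foreach (name; __traits(allMembers, T))
        writefln("%s = %s", name, __traits(getMember, value, name));
}

void main()
{
    dumpFields(Config("localhost", 8080, true));
}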
> I quickly discovered that dynamic typing is about the worst plague
> ever to befall the programming world as soon as you do anything more
> than trivial textbook examples. You have to be constantly vigilant of
> what types are being passed around in those variables, because they
> *might* just happen to be a string where you expected an int, or an
> int where you expected an object! So much code comes crashing down as
> soon as you hand it a variable of the wrong type -- which the
> language explicitly allows you to, and in fact, which is the
> language's selling point in the first place!

And then all that careful manual checking becomes completely useless
the moment you accidentally do something on the wrong variable. "Huh?
He's setting the content type, but the GET request doesn't have a
content type. I guess he wants me to make one! Yea, that must be it!"

Me: D and Python, shoot the tin can.
D: Ok, shooting the tin can.
Python: Ok, shooting the tin can.
Me: D and Python, shoot the glass bortle.
D: Wait...what? Do what now?
Python: I don't know what a bortle is, so I'm just gonna spray bullets
everywhere and hope I hit it.

> So as soon as you move beyond trivial toy programs into real-world
> applications, you start feeling the need -- the desperate need -- to
> vet each and every single variable passed into your code for their
> type, and worse, the exact fields present in an object type, because
> so much code depends on variables being, uh, of specific, non-varying
> types? It's either that, or constantly checking if something is equal
> to NULL, or sorry, I mean, ===NULL (as opposed to ==NULL or =NULL or
> <>NULL or ====NULL or whatever stupid UFO operator it is nowadays
> that they use to patch a type system on top of a broken language).
> And you quickly find yourself reimplementing a type system on top of
> the language, which *should have provided one in the first place*.

Exactly. I suspect the root of the problem (or at least one root) is
this:

1. Joe Dumbass decides, "Hey, we should have people OTHER than
programmers do some programming!" (A clearly bad idea in the first
place. It's like "Hey, let's have our receptionists assist in
open-heart surgery! That'll save us lots of time and money!")

2. Joe Dumbass (alias "Rasmus Lerdorf") either creates, or hires
someone else to create, a language specifically for non-programmers.

3. The language, due to the requirement of being usable *specifically*
by those who don't know what they're doing, is *by necessity*
guaranteed to be garbage.

4. Those non-programmers start writing code, and therefore start
believing they're real programmers who know what they're doing. They
*could become* real programmers of course, but not before shedding the
training-wheels language. That only happens in a minority of cases,
though.

That is *known* to literally be how PHP was created. I think there's a
very compelling reason to believe the same is true of JavaScript and
Flash's ActionScript as well.

> This article cinched it for me:
>
> http://existentialtype.wordpress.com/2011/03/19/dynamic-languages-are-static-languages/
>
> (Summary for the tl;dr people: dynamic languages == static languages
> straitjacketed with only a single type.)

Yea, I always liked that explanation of it. It really is absolutely
true.

> To conclude, I have to say that so far, the number of times I've
> actually *needed* to use a Variant type in my entire career is 1 (out
> of the hundreds if not thousands of programs I've written). And that
> one time was when I was writing an interpreter for a DSL.

Much the same here; the only times I've *ever* felt any need for a
Variant are when interfacing with something that's already variant or
nearly-variant, or as a bloated workaround when I'm using a language
with really shitty (or no) template support. Algebraic types are
another matter, though. Those can be useful, although D's Algebraic
still needs some work.
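To put a shape on it, this is about the only situation where I'd reach
for one myself - a throwaway sketch of a literal type for some toy
calculator DSL (Literal and describe are names I just made up, not code
from any real project):

import std.conv    : to;
import std.stdio   : writeln;
import std.variant : Algebraic;

// A literal in the toy DSL really can be one of a few types, but a
// *fixed* few. Algebraic pins that set down at compile time; a bare
// Variant (or a dynamic language, where everything is one) doesn't.
alias Literal = Algebraic!(long, double, string);

string describe(Literal lit)
{
    if (auto p = lit.peek!long)   return "integer: "  ~ to!string(*p);
    if (auto p = lit.peek!double) return "floating: " ~ to!string(*p);
    if (auto p = lit.peek!string) return "text: "     ~ *p;
    assert(0, "unreachable: the set of allowed types is closed");
}

void main()
{
    writeln(describe(Literal(42L)));
    writeln(describe(Literal(3.14)));
    writeln(describe(Literal("hello")));
    // Literal(new Object);  // won't compile: Object isn't in the set
}

The nice part is that last commented-out line: hand it something
outside the set and the compiler complains at the call site, instead of
the spray-bullets-everywhere routine above.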
> How many interpreters are written in Javascript? I rest my case. :)

Don't worry, no doubt it'll happen soon enough. Half the web devs out
there are already convinced that JS is a valid "asm of the web" and
that V8 is "fast" (it's only fast compared to some other JS engines,
not to real languages - it *can't* be fast when literally everything is
guaranteed to be a Variant[string]; see the P.S. below). If text
editors written in JavaScript have become commonplace (<sarcasm>Thanks,
Google!</sarcasm>), I'm sure JS-based interpreters, JS-based codecs and
"F"FTs (rather, SFTs), and other such nonsense aren't far behind.

Just like Google already did with Quake, some jackass will write an MP3
decoder in JS and use it to claim that JS is fast (yea, as fast as a
486, which could decode MP3s just fine, too). "But if it can play quake
and decode mp3s, that's all the power you need!" Then why the fuck did
I just pay hundreds of dollars for what amounts to a brand-new Pentium
1?
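P.S. For anyone who thinks the "everything is a Variant[string]" bit
above is hyperbole, here's roughly what a JS object amounts to when you
spell it out in D. (A deliberately crude sketch, not a claim about how
V8 stores things internally - engines pile heroic optimizations on top,
but they all have to be *able* to fall back to behavior like this.)

import std.stdio   : writeln;
import std.variant : Variant;

// A JavaScript "object", spelled out: string keys mapped to values
// that could be anything at all, resolved at runtime on every access.
alias JsObject = Variant[string];

void main()
{
    JsObject request;
    request["method"]      = Variant("GET");
    request["contentType"] = Variant("text/html"); // nothing stops this
    request["contentType"] = Variant(42);          // ...or this

    // Reading it back means asking, at runtime, what it happens to be.
    if (auto p = request["contentType"].peek!int)
        writeln("content type is... an int? ", *p);
}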
