Garrett Hylltun wrote:
> On Feb 13, 2006, at 12:49 PM, Richard Gaskin wrote:
>> It's a question of productivity, of the tradeoffs between hand-coded tight HTML vs. rapid development. And most of the code is the same, whether generated by hand or by machine. "&lt;p&gt;" is "&lt;p&gt;" whether typed by hand or generated.

> I have to disagree. Just about all WYSIWYG HTML editors are prone to code trashing and adding unnecessary code. Download any of those editors and create a page with them, and of course create the same page by hand. Then compare them in code and file size. You'll see what I mean.

Of course hand-edited code will usually be tighter. The question is whether the greater time required to hand-write the code is really worth a difference that is often marginal if both methods are done well.

Code generators often rely on generalization, and generalization done well will mean more error-checking, which of course means more code.

But generalization is not always a bad thing, and is often a very useful thing.

When you break a block of inline Transcript statements out into a reusable handler, much as the JavaScript libraries in Dreamweaver (DW) or GoLive (GL) do for their code, you introduce a modest performance loss in the lookup for that handler, as well as the need to error-check parameters.

But over the course of the project is the time saved during development and maintenance worth the performance hit? The literature on design patterns suggests it often is.
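To make the tradeoff concrete, here is a minimal sketch of that kind of refactor in JavaScript (the names and the status-line scenario are invented for illustration, not taken from DW or GL): the repeated inline statements become one generalized handler, and the generalization is exactly what forces the extra error-checking code described above.

```javascript
// Before: the same "set the status line" statements repeated inline in
// several places, with no validation.
//
// After: one reusable handler. The generalization costs a function lookup
// plus the parameter checks below -- more code, but written only once.
function setStatus(messages, key) {
  // Error-check the parameters, since a general handler can no longer
  // assume its caller passed something sensible.
  if (typeof key !== "string" || !(key in messages)) {
    throw new Error("setStatus: unknown status key: " + key);
  }
  return "[status] " + messages[key];
}

const messages = { saved: "Saved", error: "Save failed" };
```

Each call site shrinks to a one-liner such as `setStatus(messages, "saved")`, which is where the maintenance savings come from.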

Of course the performance hit with JavaScript will be far greater than with nearly any other scripting language, since it's a half-baked language encumbered with semi-typed data and a structure based on compiler design rather than optimized for interpretation. But still, with even slow machines running at 1 GHz today, the worst-offending sites' load times are more a function of the design than of the underlying code.


> Many will use different code, or even MS-specific-only code.

Using MS-specific code isn't a problem with generation tools as a whole, it simply means that the specific tool in question was created either by idiots or people with a vested interest in Micro$oft's vain attempt to control the Internet.

Good tools reinforce the central premise of the web: that it's platform-independent. Intranets notwithstanding, any deviation from that is simply poor design, whether by hand or by machine. I've seen a good many hand-coded pages that pretend the world is run by Microsoft, more so than the platform-independent code generated by DW and GL.

To my knowledge only Microsoft tools attempt to trick the unwitting developer into believing that Microsoft-specific code is anything other than a stupid thing to force onto one's visitors, even more so as Internet Exploder's market share continues to decline in favor of more standards-compliant alternatives like Firefox.


> Or do something like "&lt;div span&gt;" instead of "&lt;p&gt;", or even several
> "&lt;br&gt;" tags instead.  Many will inject CSS when it's not even needed,
> just adding more to the bulk of the code it will produce.

If you haven't played with the latest versions of these tools, in the hands of an experienced developer you might be surprised to see some of the good work that can be done with them.

Sure, I still consider it essential to know HTML and be prepared to tweak generated code as needed; UI gestures simply don't lend themselves to the full range of decisions one can make with HTML.

But by and large, by the time one adds the browser-checking code needed for good rendering across all browsers and platforms, the net difference in code is more a question of style than size. If it isn't, in some cases it may simply mean the hand-written code isn't as thorough as the generated code designed by teams of some of the most experienced engineers at Adobe and Macromedia.
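As a rough illustration of the browser-checking code in question, here is the kind of user-agent dispatch that era of tools generated (the function name and the particular checks are my own invention, not any tool's actual output):

```javascript
// Classify a browser from its user-agent string so platform-specific
// rendering fixes can be applied. Generated pages carry code like this
// for every supported browser, which is part of why the hand-typed vs.
// generated size difference narrows once both are equally thorough.
function classifyBrowser(userAgent) {
  const ua = userAgent.toLowerCase();
  if (ua.indexOf("msie") !== -1) return "ie";
  if (ua.indexOf("firefox") !== -1) return "firefox";
  if (ua.indexOf("safari") !== -1) return "safari";
  return "other";
}
```

In a page this would typically be called once with `navigator.userAgent` and the result used to pick among rendering workarounds.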


One could argue that all C++ introduced over C was the insertion of orders of magnitude more JSR (jump-to-subroutine) statements into the object code. But of course that's only one view, a view that overlooks the productivity benefits of OOP. One could make a similar comparison of C vs. Assembler, or to bring it back home, of Transcript vs. any lower-level alternative.

> Comparing hand-coded HTML to the use of a WYSIWYG editor with C++ and C doesn't even equate.

True in the purest sense; as Aristotle said, ultimately every metaphor breaks down because it's an approximation of what's being discussed and not the thing itself.

But in spite of the comparative impurity, C++ does introduce a tremendous number of JSRs into one's object code, with the result that equivalent functionality implemented in straight C will often be faster and have a smaller file size.

Users often bemoan that apps are bloated today over what was being turned out 20 years ago, but that misses the bigger point: features.

It would be a special form of hell on earth to try to implement modern application architectures in straight C, and prohibitively expensive in Assembler. Multi-level Undo as part of Apple's Core Data, for example, comes with relative ease for Objective-C developers, but I know of few C-based apps that dared to attempt it profitably, and certainly no one insane enough to try implementing it in Assembler.

Similarly, one can indeed hand-type the JavaScript needed for modern menu systems, but why? Code generators do this well, typo-free, and in a fraction of a second.
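The sort of thing a menu generator emits can be sketched in a few lines (this is a hypothetical miniature, not the output of any particular tool): given a list of label/URL pairs, it produces, typo-free, the markup a hand-coder would otherwise type out link by link.

```javascript
// Generate the HTML for a simple navigation menu from a data structure.
// Real generators add event handlers, nesting, and rollover images on top
// of the same basic pattern: data in, markup out.
function generateMenu(items) {
  const entries = items
    .map(function (it) {
      return '<li><a href="' + it.href + '">' + it.label + "</a></li>";
    })
    .join("");
  return '<ul class="menu">' + entries + "</ul>";
}
```

Changing the menu then means editing the data list, not hunting through markup.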

Most modern HTML generators have options to optimize the output HTML when uploading. I wrote one in Transcript some time ago as an exercise, and even my modest two-hour hack produced leaner pages than most sane people would dare write by hand. The output loaded and rendered more quickly than the original, but was of course almost unreadable for humans.
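The core of such an optimizer is small, which is why a two-hour hack can work; here is a rough JavaScript sketch of the whitespace-and-comment stripping involved (the original was in Transcript, and this toy deliberately ignores the `<pre>`, `<textarea>`, and `<script>` cases a real tool must preserve):

```javascript
// Shrink HTML for transmission: drop comments, collapse whitespace runs,
// and remove inter-tag spaces. The result loads faster but, as noted
// above, becomes nearly unreadable for humans.
function leanHTML(html) {
  return html
    .replace(/<!--[\s\S]*?-->/g, "") // strip HTML comments
    .replace(/\s+/g, " ")            // collapse whitespace runs to one space
    .replace(/> </g, "><")           // remove space between adjacent tags
    .trim();
}
```

Run over hand-indented source, it routinely trims a meaningful fraction of the bytes without changing what the browser renders.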

In short, I believe that the differences between hand-typed code and machine generated code are minor compared to the productivity benefits. With lean output options, perhaps the biggest difference to the viewer is that the machine-assisted pages often include features that the hand-typer is still typing.

My personal feeling is to let machines do any work they profitably can, freeing up human resources for where uniquely human ingenuity is more critically needed.


> And just to note, Rev is not a VPL, nor are languages such as VB or any other MS language available. There are very, very few VPLs around, and the M:Poster I mention is long since gone from the net.

I haven't seen anyone here claim that Rev is a VPL (visual programming language), but it does raise the interesting question of whether it would be fun to make one in Rev, and whether the output could be efficient enough to make it worthwhile. Given Rev's overall speed and the flexibility of Transcript, I'd bet it could be quite doable.

--
 Richard Gaskin
 Managing Editor, revJournal
 _______________________________________________________
 Rev tips, tutorials and more: http://www.revJournal.com
_______________________________________________
use-revolution mailing list
[email protected]
Please visit this url to subscribe, unsubscribe and manage your subscription preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution