Re: [9fans] Fonts

2009-07-09 Thread Ethan Grammatikidis
On Wed, 8 Jul 2009 23:17:22 -0400
erik quanstrom quans...@quanstro.net wrote:

  Which vera font? I just looked up http://google.gr/ in Firefox & pasted the
  text into a terminal using Bitstream Vera Sans Mono & don't see any missing
  letters. Nor do I if I set the font to Bitstream Vera Sans or Serif, or
  Bitstream Charter.
 
 i don't recall.  the date on the original conversion from .ttf is 2005.
 
  Odd thing is if I use hget or curl to get the page I get a lot of nulls. 
  The page is encoded in utf-8 so I don't know what the google.gr servers 
  might be doing.
 
 ; hget  http://google.gr/
 <!doctype html><html><head><meta http-equiv="content-type"
 content="text/html; charset=ISO-8859-7">
 
 i'm pretty sure that ISO-8859-7 != utf-8.

I guess that's server-side mucking about based on user-agent not reporting 
utf-8 capability or something stupid.  Firefox page info feature reports the 
page as utf-8, and on inspection of the source:

<!doctype html><html><head><meta http-equiv="content-type"
content="text/html; charset=UTF-8">

I wonder if there's some 'preferred encoding' message the UA can send to the 
server.
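In the meantime it ought to be possible to just convert locally, something like
; hget http://google.gr/ | tcs -f 8859-7 > google.html
assuming tcs(1) accepts the charset under that name.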

-- 
Ethan Grammatikidis

Those who are slower at parsing information must
necessarily be faster at problem-solving.



Re: [9fans] Fonts

2009-07-09 Thread Ethan Grammatikidis
On Wed, 8 Jul 2009 22:25:17 -0700
Russ Cox r...@swtch.com wrote:

 This conversation reminded me that I have been
 meaning to clean up a program I wrote a while back
 and integrate it into plan9port.  It generates Plan 9
 bitmap fonts on demand using the native window
 system fonts.  Right now it only works on OS X.
 I would gladly accept X11 support and OS X bug fixes.

Nifty idea, but I have to ask: what kind of X11 font support? There is the 
original old-school X font service, but it seems few actively-developed apps 
use that now. Most rely on fontconfig, which only works with direct use of 
freetype by the application, or something like that. The two systems may be 
configured with completely different font paths at any rate. Modern yewnicks is 
_fun_. *shudders*

-- 
Ethan Grammatikidis

Those who are slower at parsing information must
necessarily be faster at problem-solving.



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Eric Van Hensbergen
On Wed, Jul 8, 2009 at 11:10 PM, erik quanstrom quans...@quanstro.net wrote:
  I expect to see code immediately, by the way, finished or not, and you 
  better be
  around to answer my questions.

 You have something here: these are central software-development tenets
 of agile/scrum/xp/lean/kanban du jour, and help the open-source
 community work.  Essentially, done is an elusive illusion, so enlist
 others throughout the process.

 i'm just going to take a guess that you have never had egg
 on your face caused by publishing code before it's time?


We could all do with a bit more egg on our face.
No one here is a super-guru coder who can do without peer review; many
here are just starting out and need lots of guidance.  Within project
teams I always publish my sandbox: if it compiles, it's committed.  The
benefit is not only peer review, but also so folks know what I'm
working on (avoiding duplicate work).  The only things which are not
public have not cleared the IP lawyers (yet).  Within the labs,
everyone's sandboxes were visible (compiling or not).  So to answer...

 name operating systems that develop in this way?

Inferno and Plan 9 were both developed this way - everyone's code was
visible to everyone else working on the project, often in a single
tree.  Integration problems were discovered sooner rather than later,
and it was always easy to get help with a particular problem because
anyone could immediately see the source.

 was under the impression that even, e.g., linux code is submitted
 in fairly complete fashion and tends to get rejected even
 on style grounds.

The possibility of rejection is no reason not to submit for review
(just make sure folks understand the status of the code), or otherwise
make your code visible.

-eric



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread David Leimbach
Indeed, Voltaire had it right.  Better is the enemy...  (of my enemy is my
friend??)

On Wed, Jul 8, 2009 at 9:10 PM, erik quanstrom quans...@quanstro.net wrote:

   I expect to see code immediately, by the way, finished or not, and you
 better be
   around to answer my questions.
 
  You have something here: these are central software-development tenets
  of agile/scrum/xp/lean/kanban du jour, and help the open-source
  community work.  Essentially, done is an elusive illusion, so enlist
  others throughout the process.

 i'm just going to take a guess that you have never had egg
 on your face caused by publishing code before it's time?

 i can't stand my own silly mistakes, unfinished and crap code.
 why should i look at anyone else's?  by the way, can you
 name operating systems that develop in this way?  i
 was under the impression that even, e.g., linux code is submitted
 in fairly complete fashion and tends to get rejected even
 on style grounds.

 i think the idea that is illusory is that there is no difference
 between code that doesn't work and code that does work
 but might be improved.

 part of the craft of programming is to know when something
 is actually finished.  the mistake is to improve things that
 work well enough.  i think one could write quite an interesting
 book critiquing modern software development for failing to
 stop at good enough.  but one would need to be quite a bit
 smarter, more educated and less lazy than i.  i'll satisfy myself
 by quoting some such people.  (oddly #1 and #3 are missing
 from fortune)

 Rule 3.  Fancy algorithms are slow when n is small, and n is usually small.
- rob pike, Notes on Programming in C
 Inside every large problem is a small problem struggling to get out.
- niklaus wirth
 When in doubt, use brute force.
- ken thompson

 - erik




Re: [9fans] Fonts

2009-07-09 Thread Chad Brown

On Jul 9, 2009, at 3:02 AM, Ethan Grammatikidis wrote:

; hget  http://google.gr/
<!doctype html><html><head><meta http-equiv="content-type"
content="text/html; charset=ISO-8859-7">


i'm pretty sure that ISO-8859-7 != utf-8.


I guess that's server-side mucking about based on user-agent not  
reporting utf-8 capability or something stupid.  Firefox page info  
feature reports the page as utf-8, and on inspection of the source:


<!doctype html><html><head><meta http-equiv="content-type"
content="text/html; charset=UTF-8">


I wonder if there's some 'preferred encoding' message the UA can send  
to the server.


Accept-Charset is the http header that you want, but to do it `right'  
you probably want to muck about with http's q-value weighting system.   
The shorter form is that you'll have to tell the server you're ok with  
UTF, or it'll fall back to its best-guess techniques, with the  
default fallback of iso-8859.
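Roughly, something like this in the request ought to do it (the exact  
q-values here are only illustrative):

GET / HTTP/1.1
Host: www.google.gr
Accept-Charset: utf-8;q=1.0, iso-8859-7;q=0.5, *;q=0.1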


*Chad


Re: [9fans] Fonts

2009-07-09 Thread Federico G. Benavento
ie=utf-8 works fine
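That is, tack it onto the query string, something like
; hget 'http://www.google.gr/search?q=plan9&ie=utf-8'
(oe=utf-8 may be wanted too, for the output encoding, if I remember the
parameters right).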

On Thu, Jul 9, 2009 at 12:43 PM, Chad Brown yand...@mit.edu wrote:
 On Jul 9, 2009, at 3:02 AM, Ethan Grammatikidis wrote:

 ; hget  http://google.gr/

 <!doctype html><html><head><meta http-equiv="content-type"
 content="text/html; charset=ISO-8859-7">

 i'm pretty sure that ISO-8859-7 != utf-8.

 I guess that's server-side mucking about based on user-agent not reporting
 utf-8 capability or something stupid.  Firefox page info feature reports the
 page as utf-8, and on inspection of the source:

 <!doctype html><html><head><meta http-equiv="content-type"
 content="text/html; charset=UTF-8">

 I wonder if there's some 'preferred encoding' message the UA can send to the
 server.

 Accept-Charset is the http header that you want, but to do it `right' you
 probably want to muck about with http's q-value weighting system.  The
 shorter form is that you'll have to tell the server you're ok with UTF, or
 it'll fall back to its best-guess techniques, with the default fallback of
 iso-8859.

 *Chad




-- 
Federico G. Benavento



Re: [9fans] troff and ps related

2009-07-09 Thread hiro
Perhaps we should use troff and just convert it to tex?
Because I also hate to write/read tex.

On Thu, Jul 9, 2009 at 4:53 PM, Rudolf Sykora rudolf.syk...@gmail.com wrote:
 2009/7/8 Russ Cox r...@swtch.com:
 I assume you have a non-Plan 9 machine to play with.
 It's worth trying Heirloom troff there to see if the boxes
 are done better.  They probably are.

 as far as I see, they are not better.

 I need to fall back to the Plan 9 troff, because the low-level
 details seem to differ between the two.

 so, unlike TeX, different troffs produce different outputs, right?

 Honestly the box drawing has never really bothered me.
 I don't draw boxes around things because it's too noisy
 for my tastes anyway.

 it's not only boxes; it's tables, as I mentioned, as well.

 Thanks
 Ruda





Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread erik quanstrom
 Why would it take a book?  DMR made the point succinctly in his
 critique of Knuth's literate program, showing how a few command-line
 utilities do the work of the Don's elaborately constructed tries.

because, evidently, one book was not enough.

- erik



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Micah Stetson
 Why would it take a book?  DMR made the point succinctly in his
 critique of Knuth's literate program, showing how a few command-line
 utilities do the work of the Don's elaborately constructed tries.

Do you have a URL for this?

Micah



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Devon H. O'Dell
2009/7/9 Micah Stetson mi...@stetsonnet.org:
 Why would it take a book?  DMR made the point succinctly in his
 critique of Knuth's literate program, showing how a few command-line
 utilities do the work of the Don's elaborately constructed tries.

 Do you have a URL for this?

I looked this up yesterday, and there is some discussion on
literateprogramming.com -- the discussion actually illustrates how to
take the shell script and make *it* a literate program instead. It
wasn't actually DMR, it was McIlroy.

 Micah

--dho



Re: [9fans] troff and ps related

2009-07-09 Thread yy
2009/7/9 hiro 23h...@googlemail.com:
 Perhaps we should use troff and just convert it to tex?
 Because I also hate to write/read tex.


I have an awk script to write latex in plain text, with a syntax
similar to markdown. It is an ad-hoc solution I am using to write my
thesis, but if you are interested drop me a line off-list.
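The core of it is just mapping line prefixes to LaTeX commands, roughly
like this (a trimmed-down illustration, not the real script; the file
names are made up):

awk '
/^# /  { sub(/^# /, "");  print "\\section{" $0 "}";    next }
/^## / { sub(/^## /, ""); print "\\subsection{" $0 "}"; next }
       { print }
' notes.txt > notes.tex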


-- 
- yiyus || JGL .



[9fans] data analysis on plan9

2009-07-09 Thread hugo rivera
Hi,
since I discovered plan 9, about two years ago, I've been constantly
amazed by its simple yet quite powerful design.
For a year now I have been looking to move to plan 9 as my main
OS, but I am not able to do so because it lacks the data analysis
tools available in some other systems, like linux.
Because my work involves dealing with data coming from experiments in
astro-particle physics, I am more or less tied to data analysis
software like the R programming language, Python's Numpy, Cern's ROOT
and even gnuplot. While using them, I realized that most of the time I
deal with text files that go here and there as input or output of
small specific programs that perform a given task (I don't know if
this is the result of my Unix/Plan 9 background or just a
coincidence). Say I have a command 'clean' that removes undesired
points from a body of data, and another command 'four' that performs
the FFT; they are used together as
clean data.txt | four > results.txt
So it occurred to me that one can create small single-purpose commands
that interact with one another to perform some analysis on data, just
like in the original Unix style. Awk can be used as glue among them,
with some other small
glue utilities. Plotting data is another thing that I would like to
integrate into this, since plots are quite frequent while analysing
data, but I am not sure how.
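To make the glue idea concrete, I imagine pipelines looking something like
clean data.txt | awk '$2 < 3.0' | four > results.txt
where the awk step drops rows by some test on the second column (the
column and the threshold here are invented, just to illustrate).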
Also, something similar to GSL (http://www.gnu.org/software/gsl/)
would be invaluable or maybe even indispensable.
Maybe some day I'll start to write some commands for plan 9 to begin
working on it, but I want to convince myself that this is worth the
time spent.
What do you think of this? My main concern is that perhaps the 'do one
thing well' design falls short for data analysis. I've never seen
people work like this on data analysis before (but I do not think I am
the first to do it) because in general, they tend to use large data
analysis frameworks. I'd really appreciate some feedback on this from
people working on data analysis and also from the plan 9 community
(otherwise I wouldn't be writing here :-)
Saludos

-- 
Hugo



Re: [9fans] data analysis on plan9

2009-07-09 Thread Jason Catena
I'd also be interested in knowing whether gnuplot or an equivalent is
yet ported to Plan 9.  Ron Minnich et al. seem to prefer gnuplot, and
reported that they generated data for it and used it in a paper, but
weren't specific whether the gnuplot ran on the same plan9 box or
another *nix.

From http://doc.cat-v.org/plan_9/IWP9/2008/trace.pdf pp. 19-21:

4.1 Visualizing trace device output

Once we had the data, we needed a way to analyse the information. After
working with the data for a while, we realized that the output as
shown in Figure
1 would be very useful. No graphiing [sic] tool available to us in
Plan 9 or Linux
was able to create that output. In the end, we determined that gnuplot was the
most appropriate tool, but even then the data required significant processing to
get it into the proper form.

We wrote a suite of scripts usng rc, the plan 9 shell; acid, the Plan
9 debugger;
awk, and sed to generate data appropriate for plotting with gnuplot.
The createplot
script has the ability to filter out functions which ran for less than
a specified number of clock cycles, which is useful for reducing the amount of
noise in a plot. To generate a plot from the data collected earlier, discarding
functions which completed in less than 4000 cycles, we just ran:

plots/createplot /amd64/9k8pf 4000 ./trace > plotme

and fed the input into gnuplot.

Jason Catena



Re: [9fans] data analysis on plan9

2009-07-09 Thread Federico G. Benavento
 Also, something similar to GSL (http://www.gnu.org/software/gsl/)

gsl-1.6.tbz GNU Scientific Library, native port.

/n/sources/contrib/pac/sys/src/lib/gsl-1.6.tbz
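unpacking it should be something like (from memory, untested):

; 9fs sources
; bunzip2 < /n/sources/contrib/pac/sys/src/lib/gsl-1.6.tbz | tar xv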


hth

-- 
Federico G. Benavento



Re: [9fans] data analysis on plan9

2009-07-09 Thread ron minnich
On Thu, Jul 9, 2009 at 12:26 PM, Jason Catena jason.cat...@gmail.com wrote:
 I'd also be interested in knowing whether gnuplot or an equivalent is
 yet ported to Plan 9.  Ron Minnich et al. seem to prefer gnuplot, and
 reported that they generated data for it and used it in a paper, but
 weren't specific whether the gnuplot ran on the same plan9 box or
 another *nix.


gnuplot on linux. even octave uses gnuplot. Not that it's great, but
there is not much else.

ron



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Jason Catena
 i think one could write quite an interesting
 book critiquing modern software development for failing to
 stop at good enough.

 Why would it take a book?  DMR [sic] made the point succinctly in his
 critique of Knuth's literate program, showing how a few command-line
 utilities do the work of the Don's elaborately constructed tries.

 Do you have a URL for this?

Yes, sorry I didn't look it up earlier.

Bentley, J., Knuth, D., and McIlroy, D. 1986. Programming pearls: a
literate program. Commun. ACM 29, 6 (Jun. 1986), 471-483. DOI=
http://doi.acm.org/10.1145/5948.315654

It is McIlroy (not DMR), but it looks like he focused not on writing
a literate program, but on the engineering benefits of constructing a
pipeline from common tools, vs Knuth's elaborate, single-purpose
program.
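For reference, the pipeline itself was just six stages (quoting from
memory, so treat the details as approximate; $1 is k, the number of
words wanted):

tr -cs A-Za-z '\n' |
tr A-Z a-z |
sort |
uniq -c |
sort -rn |
sed ${1}q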

From the paper:

To return to Knuth’s paper: everything there---even
input conversion and sorting---is programmed
monolithically and from scratch. In particular the
isolation of words, the handling of punctuation, and
the treatment of case distinctions are built in. Even
if data-filtering programs for these exact purposes
were not at hand, these operations would well be
implemented separately: for separation of concerns,
for easier development, for piecewise debugging, and
for potential reuse. The small gain in efficiency from
integrating them is not likely to warrant the resulting
loss of flexibility. And the worst possible
eventuality---being forced to combine programs---is
not severe.

The simple pipeline given above will suffice to get
answers right now, not next week or next month. It
could well be enough to finish the job. But even for
a production project, say for the Library of Congress,
it would make a handsome down payment, useful
for testing the value of the answers and for smoking
out follow-on questions.

Jason Catena



Re: [9fans] data analysis on plan9

2009-07-09 Thread J. R. Mauro





On Jul 9, 2009, at 14:40, hugo rivera uai...@gmail.com wrote:


Hi,
since I discovered plan 9, about two years ago, I've been constantly
amazed by its simple yet quite powerful design.
For a year now I have been looking to move to plan 9 as my main
OS, but I am not able to do so because it lacks the data analysis
tools available in some other systems, like linux.
Because my work involves dealing with data coming from experiments in
astro-particle physics, I am more or less tied to data analysis

software like the R programming language, Python's Numpy, Cern's ROOT


There's a Plan 9 R port, isn't there? If not, there might be an R
implementation done in Python or something.




and even gnuplot. While using them, I realized that most of the time I
deal with text files that go here and there as input or output of
small specific programs that perform a given task (I don't know if
this is the result of my Unix/Plan 9 background or just a
coincidence). Say I have a command 'clean' that removes undesired
points from a body of data, and another command 'four' that performs
the FFT; they are used together as
clean data.txt | four > results.txt
So it occurred to me that one can create small single-purpose commands
that interact with one another to perform some analysis on data, just
like in the original Unix style. Awk can be used as glue among them,
with some other small
glue utilities. Plotting data is another thing that I would like to
integrate into this, since plots are quite frequent while analysing
data, but I am not sure how.


Plan 9 has a plot program that fits well in a pipeline. Even gnuplot can
go in a pipeline.
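Something like
; awk '{print $1, $2}' results.txt | graph | plot
ought to give a quick look, if I remember the pair right: graph(1) turns
x,y pairs into plot(6) output and plot(1) draws it.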




Also, something similar to GSL (http://www.gnu.org/software/gsl/)
would be invaluable or maybe even indispensable.
Maybe some day I'll start to write some commands for plan 9 to begin
working on it, but I want to convince myself that this is worth the
time spent.


I think it's worth it. Parts of this idea are already there (sum,  
sort, join, plot)




What do you think of this? My main concern is that perhaps the 'do one
thing well' design falls short for data analysis. I've never seen
people work like this on data analysis before (but I do not think I am
the first to do it) because in general, they tend to use large data
analysis frameworks. I'd really appreciate some feedback on this from
people working on data analysis and also from the plan 9 community
(otherwise I wouldn't be writing here :-)


I know someone who does astrophysics analysis and visualization  
(including movies) on a special OS he wrote that works entirely like  
Unix pipes and filters. I think developing an analysis framework as a  
pipes-and-filters toolbox is great.




Saludos

--
Hugo





Re: [9fans] data analysis on plan9

2009-07-09 Thread J. R. Mauro





On Jul 9, 2009, at 15:34, ron minnich rminn...@gmail.com wrote:

On Thu, Jul 9, 2009 at 12:26 PM, Jason Catena jason.cat...@gmail.com wrote:

I'd also be interested in knowing whether gnuplot or an equivalent is
yet ported to Plan 9.  Ron Minnich et al. seem to prefer gnuplot, and
reported that they generated data for it and used it in a paper, but
weren't specific whether the gnuplot ran on the same plan9 box or
another *nix.



gnuplot on linux. even octave uses gnuplot. Not that it's great, but
there is not much else.

ron



It wouldn't be too bad to translate gnuplot to plan9 plot/graph, would  
it?




Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread tlaronde
On Thu, Jul 09, 2009 at 02:47:37PM -0500, Jason Catena wrote:
 
 Yes, sorry I didn't look it up earlier.
 
 Bentley, J., Knuth, D., and McIlroy, D. 1986. Programming pearls: a
 literate program. Commun. ACM 29, 6 (Jun. 1986), 471-483. DOI=
 http://doi.acm.org/10.1145/5948.315654

[The article is reproduced in D. E. Knuth, Literate Programming, CSLI
ISBN 0-9370-7380-6]

For the task to be done ("print the k most common words in a file"), the
Unix approach and the Unix tools give everything needed to create a program
far more rapidly than the from-scratch approach adopted by D. Knuth. But
only because the tools exist (are already written... but in what language?
Easily understandable? Maintainable? etc.).

But this does not mean that, _in general_, literate programming does not
have its strengths, especially for complex and woven programs... or even for
writing the tools, the bricks one combines in a pipeline as McIlroy does.

I don't think that TeX and METAFONT could have been written correctly, or
be as understandable, in anything other than WEB (unfortunately not CWEB,
which would greatly simplify porting).

[For another thread: MetaPOST can be used instead of gnuplot, though not
easily for 3D-style plotting. Unfortunately, MetaPOST too is WEB, not
CWEB, and the ad hoc conversion of the Pascal to C (web to C) seems, alas,
the simplest way. Even the Pascal compilers that could be ported to
compile on Plan 9 (if there are any) would probably not allow a
straightforward compilation of the WEB-based programs.]
-- 
Thierry Laronde (Alceste) tlaronde +AT+ polynum +dot+ com
 http://www.kergis.com/
Key fingerprint = 0FF7 E906 FBAF FE95 FD89  250D 52B1 AE95 6006 F40C



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Jason Catena
 But this does not mean that, _in general_, literate programming does not
 have its strengths, especially for complex and woven programs... or even for
 writing the tools, the bricks one combines in a pipeline as McIlroy does.

I'll say amen, especially for a system of many little parts.  My point
wasn't to bash literate programming at all.  Rather I'd say that big
elaborate constructions of many aspects are fragile, hard to
understand and work with, and of limited use.  Instead, let our tools
combine bits of code into a bigger whole, and reuse the tools for
other wholes (cf Lego bricks).

I've used LaTeX and noweb daily since the first quarter of 2007 to
write papers that are literate programs.  They work well to organize
and document collections of many small command-line guide files, shell
scripts, makefiles, data files, C and lex source, and a real-time
gnuplot driver I downloaded from the page of a guy at NASA. (I posted
a few recent simple examples, in the tufte-handout style, at
http://swtools.wordpress.com/papers/ . I plan to ask for a publishing
release for the really interesting ones now behind a corp firewall.)

 Thierry Laronde

Jason Catena



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread erik quanstrom
 For the task to be done ("print the k most common words in a file"), the
 Unix approach and the Unix tools give everything needed to create a program
 far more rapidly than the from-scratch approach adopted by D. Knuth. But
 only because the tools exist (are already written... but in what language?
 Easily understandable? Maintainable? etc.).

the problem i have with literate programming is that it
tends to treat code like a terse and difficult-to-understand
footnote.  it seems to me that literate programs tend to
spend too much time commenting on straightforward code
or code that is easier read than explained.  ironically, the
assumption seems to be that one is illiterate in the computer
language at hand.

- erik



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Jack Johnson
On Thu, Jul 9, 2009 at 1:34 PM, erik quanstrom quans...@quanstro.net wrote:
 the problem i have with literate programming is that it
 tends to treat code like a terse and difficult-to-understand
 footnote.

And thus, we have literate programming meets APL. ;)

-Jack



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread erik quanstrom
 structure, on extremely clever constructions (on the BWK gibe that I
 won't be smart enough to debug it later), and to describe how the code
 segment interacts with others and maps to the problem domain.

it's also interesting to notice that long comments
are often associated with bugs.

- erik



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Jason Catena
On Thu, Jul 9, 2009 at 16:34, erik quanstrom quans...@quanstro.net wrote:
 For the task to be done ("print the k most common words in a file"), the
 Unix approach and the Unix tools give everything needed to create a program
 far more rapidly than the from-scratch approach adopted by D. Knuth. But
 only because the tools exist (are already written... but in what language?
 Easily understandable? Maintainable? etc.).

 the problem i have with literate programming is that it
 tends to treat code like a terse and difficult-to-understand
 footnote.  it seems to me that literate programs tend to
 spend too much time commenting on straightforward code
 or code that is easier read than explained.  ironically, the
 assumption seems to be that one is illiterate in the computer
 language at hand.

I'd guess that depends a great deal on the author's style.  In the
paper I quoted, I wouldn't say that's true at all of Knuth's
discussion.  I personally am very aware of this tendency, and only
comment to introduce a bit of code and place it within the overall
structure, on extremely clever constructions (on the BWK gibe that I
won't be smart enough to debug it later), and to describe how the code
segment interacts with others and maps to the problem domain.

Jason Catena



Re: [9fans] data analysis on plan9

2009-07-09 Thread Roman V Shaposhnik
On Thu, 2009-07-09 at 15:56 -0300, Federico G. Benavento wrote:
  Also, something similar to GSL (http://www.gnu.org/software/gsl/)
 
 gsl-1.6.tbz   GNU Scientific Library, native port.
 
 /n/sources/contrib/pac/sys/src/lib/gsl-1.6.tbz

I've been meaning to ask this for a long time -- do we have a
catalog of things in contrib?

Thanks,
Roman.




Re: [9fans] data analysis on plan9

2009-07-09 Thread erik quanstrom
 http://plan9.bell-labs.com/wiki/plan9/Contrib_index/

the parsing seems a bit odd, at least for my contrib stuff.
contrib/list from contrib(1) does a better job of listing contrib
packages.
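e.g. something like
; contrib/list | grep -i gsl
(assuming fgb's contrib scripts are installed).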

- erik



Re: [9fans] Google finally announces their lightweight OS

2009-07-09 Thread Bakul Shah
On Thu, 09 Jul 2009 13:44:20 -0800 Jack Johnson knapj...@gmail.com  wrote:
 On Thu, Jul 9, 2009 at 1:34 PM, erik quanstrom quans...@quanstro.net wrote:
  the problem i have with literate programming is that it
  tends to treat code like a terse and difficult-to-understand
  footnote.
 
 And thus, we have literate programming meets APL. ;)
 
 -Jack

http://www.youtube.com/watch?gl=GB&hl=en-GB&v=a9xAKttWgP4&fmt=18

*This* is what can happen when a literate programmer meets APL!