Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread John Zabroski
On Jun 15, 2012 2:39 PM, Pascal J. Bourguignon p...@informatimago.com
wrote:

 John Zabroski johnzabro...@gmail.com writes:


  Sorry, you did not answer my question, but instead presented excuses
  for why programmers misunderstand people.  (Can I paraphrase your
  thoughts as, "Because people are not programmers!")

 No, you misunderstood my answer:
 Because people don't pay programmers enough.

In the words of comedian Spike Milligan, "All I ask is for the chance to
prove money can't make me happy."

But my motto comes from pianist Glenn Gould: "the ideal ratio of performers
to audience is one." I have never seen a software team produce better
results with better pay, but most of the great advances in software came
from somebody doing something differently because any other way was simply
wrong.

Having seen millionaires throw their money around to build their dream app
(the Chandler project featured in Scott Rosenberg's book Dreaming in Code
and all of Sandy Klausner's vaporware graphical programming ideas), and
seeing what road blocks still remained, I disbelieve your answer.

Who invented the spreadsheet? One person.
Who invented pivot tables? One person.
Who invented modeless text editing? One person.

How much money is enough, anyway?  In the words of John D. Rockefeller, "A
little bit more."
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread John Zabroski
On Fri, Jun 15, 2012 at 10:52 PM, Miles Fidelman mfidel...@meetinghouse.net
 wrote:

 Pascal J. Bourguignon wrote:

 John Zabroski johnzabro...@gmail.com writes:


  Sorry, you did not answer my question, but instead presented excuses
 for why programmers misunderstand people.  (Can I paraphrase your
 thoughts as, Because people are not programmers!)

 No, you misunderstood my answer:
 Because people don't pay programmers enough.


  I think that might be an inaccurate statement in two regards:

 - programmers make VERY good money, at least in some fields (if you know
 where to find good, cheap coders, in the US, let me know where)

 - (a lot of programmers) do NOT have a particularly user-focused mindset
 (just ask a C coder what they think of Hypercard - you'll get all kinds of
 answers about why end-users can't do anything useful, despite a really long
 track record of good stuff written in Hypercard, particularly by educators)

 Note: This is irrelevant vis-a-vis Jon's question, however.  The answer to
 why he can't find easy ways to upload files is that he isn't looking.



I have probably spent the better part of my life looking for examples of
good design, and my conclusion is that there are very few good designers. Even
fewer non-designers actually notice a good designer's ability to put them
in a position to succeed.  For that reason, we are languishing under the
canopy of a squalid subcultural darkness where I get answers saying "greed
is good and will solve all."

Your answer and Pascal's can only be described as the answers adults would
give.  My question was a thought exercise, to build in the world of
imagination.


Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread BGB

On 6/16/2012 9:19 AM, Randy MacDonald wrote:

On 6/10/2012 1:15 AM, BGB wrote:
meanwhile, I have spent several days on-off pondering the mystery of 
whether there is any good syntax (for a language with a vaguely C-like 
syntax) to express the concept of "execute these statements in 
parallel and continue when all are done".

I believe that the expression in Dyalog APL is:

⍎¨statements

or

{execute}{spawn}{each}statements.



I recently thought about it off-list, and came up with a syntax like:
async! {A}{B}{C}

but, decided that this isn't really needed at the moment, and is a 
but, decided that this isn't really needed at the more moment, and is a 
bit extreme of a feature anyways (and would need to devise a mechanism 
for implementing a multi-way join, ...).


actually, probably in my bytecode it would look something like:
mark
mark; push A; close; call_async
mark; push B; close; call_async
mark; push C; close; call_async
multijoin

(and likely involve adding some logic into the green-thread scheduler...).


ended up basically opting in this case for something simpler which I had 
used in the past:
callback events on timers. technically, timed callbacks aren't really 
good, but they work well enough for things like animation tasks, ...


but, I may still need to think about it.



Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread Randy MacDonald
@BGB, if the braces around the letters defers execution, as my memories 
of Perl confirm, this is perfect.  With APL, quoting an expression 
accomplishes the same end: '1+1'



On another note, I agree with the thesis that OO is just message passing:

  aResult ← someParameters 'messageName' to anObject ⍝⍝ so, once 
'to' is defined, APL does OO.


I was thinking 'new' didn't fit, but

   'new' to aClass

convinced me otherwise.

It also means that 'object oriented language' is a category error.

On 6/16/2012 11:40 AM, BGB wrote:


I recently thought about it off-list, and came up with a syntax like:
async! {A}{B}{C}



--
---
|\/| Randy A MacDonald   | If the string is too tight, it will snap
|\\| array...@ns.sympatico.ca|   If it is too loose, it won't play...
 BSc(Math) UNBF '83  | APL: If you can say it, it's done.
 Natural Born APL'er | I use Real J
 Experimental webserver http://mormac.homeftp.net/
-NTP{ gnat }-



Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread BGB

On 6/16/2012 10:05 AM, Randy MacDonald wrote:
@BGB, if the braces around the letters defers execution, as my 
memories of Perl confirm, this is perfect.  With APL, quoting an 
expression accomplishes the same end: '1+1'




no, the braces indicate a code block (in statement context), and it is 
the "async" keyword which indicates that there is deferred execution. 
(in my language, quoting indicates symbols or strings, as in "this is a 
string", 'a', or 'single-quoted string', where "a" is always a string, 
but 'a' is a character-literal).


in an expression context, the braces indicate creation of an ex-nihilo 
object, as in {x: 3, y: 4}.


the language sort-of distinguishes between statements and expressions, 
but this is more relaxed than in many other languages (it is more built 
on context than on a strict syntactic divide, and in most cases an 
explicit return is optional since any statement/expression in tail 
position may implicitly return a value).



the letters in this case were just placeholders for the statements which 
would go in the blocks.


for example:
if(true)
{
    printf("A\n");
    sleep(1000);
    printf("B\n");
    sleep(1000);
}
printf("Done\n");

executes the print statements synchronously, causing the thread to sleep 
for 1s in the process (so, "Done" is printed 1s after "B").


and, with a plain "async" keyword:
async {
    sleep(1000);
    printf("A\n");
}
printf("Done\n");

will print "Done" first, and then print "A" about 1 second later (since 
the block is folded into another thread).


technically, there is another operation, known as a join.

var a = async { ... };
...
var x = join(a);

where the join() will block until the given thread has returned, and 
return the return value from the thread.
generally though, a join in this form only makes sense with a single 
argument (and would be implemented in the VM using a special bytecode op).


an extension would be to implicitly allow multiple joins, as in:
join(a, b, c);//wait on 3 threads
except, now, the return value doesn't make much sense anymore, and likewise:
join(
async{A},
async{B},
async{C});
is also kind of ugly.

in this case, the syntax would be:
async! {A}{B}{C};
although this could also work:
async! {A}, {B}, {C};

either would basically mean async with join, and essentially mean 
something similar to the 3-way join (basically, as syntax sugar). it may 
also imply we don't really care what the return value is.
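the intended "async with join" sugar (spawn every block, join them all, 
discard the return values) might look like this as a Python sketch; the 
"run_all" name is made up for illustration:

```python
import threading

def run_all(*blocks):
    # sketch of "async! {A}{B}{C}": start every block on its own
    # thread, then join them all; return values are discarded
    threads = [threading.Thread(target=b) for b in blocks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

order = []
run_all(lambda: order.append("A"),
        lambda: order.append("B"),
        lambda: order.append("C"))
print(sorted(order))   # ['A', 'B', 'C']: all three completed before this line
```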


basically, the "!" suffix has ended up on several of my keywords to 
indicate alternate forms, for example: "a as int" and "a as! int" will 
have slightly different semantics (the former will return null if the 
cast fails, and the latter will throw an exception).
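the "as" vs "as!" distinction (null on failure vs. throwing) can be 
illustrated with a toy Python version; both function names are 
hypothetical, and a bare isinstance check stands in for the real cast 
rules:

```python
def soft_cast(value, ty):
    # "a as int": yield None (null) when the cast fails
    return value if isinstance(value, ty) else None

def hard_cast(value, ty):
    # "a as! int": raise instead of yielding None
    if not isinstance(value, ty):
        raise TypeError("cannot cast %r to %s" % (value, ty.__name__))
    return value

print(soft_cast("3", int))   # None
print(hard_cast(3, int))     # 3
```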



but, since I got to thinking about it again, I started writing up more 
of the logic for this (adding multiway join logic, ...).





On another note, I agree with the thesis that OO is just message passing:

  aResult ← someParameters 'messageName' to anObject ⍝⍝ so, once 
'to' is defined, APL does OO.


I was thinking 'new' didn't fit, but

   'new' to aClass

convinced me otherwise.

It also means that 'object oriented language' is a category error.



my language is a bit more generic, and loosely borrows much of its 
current syntax from JavaScript and ActionScript.


however, a fair number of non-JS features and semantics exist as 
well.
it is hardly an elegant, cleanly designed, or minimal language, but it 
works, and is a design more based on being useful to myself.




On 6/16/2012 11:40 AM, BGB wrote:


I recently thought about it off-list, and came up with a syntax like:
async! {A}{B}{C}



--
---
|\/| Randy A MacDonald   | If the string is too tight, it will snap
|\\| array...@ns.sympatico.ca|   If it is too loose, it won't play...
  BSc(Math) UNBF '83  | APL: If you can say it, it's done.
  Natural Born APL'er | I use Real J
  Experimental webserver http://mormac.homeftp.net/
-NTP{ gnat }-




Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread BGB

On 6/16/2012 11:36 AM, Randy MacDonald wrote:
@BGB, by the 'same end' I meant transforming a statement into something 
that a flow control operator can act on, like if () {...} else {}.  The 
domain of the execute operator in APL is quoted strings.  I did not 
mean that the same end was allowing asynchronous execution.




side note:
a lot of how this is implemented came from how it was originally 
designed/implemented.


originally, the main use of the call_async opcode was not for async 
blocks, but rather for explicit asynchronous function calls:
foo!(...);//calls function, doesn't wait for return (return value is 
a thread-handle).

likewise:
join(foo!(...));
would call a function asynchronously, and join against the result 
(return value).


"async" was also later added as a modifier:
async function bar(...) { ... }

where the function will be called asynchronously by default:
bar(...);//will perform an (implicit) async call

for example, it was also possible to use a lot of this to pass messages 
along channels:

chan!(...);//send a message, don't block for receipt.
chan(...);//send a message, blocking (would wait for other end to join)
join(chan);//get message from channel, blocks for message
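the three channel operations could be approximated in Python with a 
queue plus an event for the rendezvous-style blocking send; this 
"Channel" class is an invented sketch, not the 2004 VM's mechanism:

```python
import queue
import threading

class Channel:
    # send_async ~ chan!(msg): queue the message, return immediately
    # send       ~ chan(msg):  block until the receiver takes it
    # recv       ~ join(chan): block for the next message
    def __init__(self):
        self.q = queue.Queue()
    def send_async(self, msg):
        self.q.put((msg, None))
    def send(self, msg):
        done = threading.Event()
        self.q.put((msg, done))
        done.wait()           # wait for the other end to pick it up
    def recv(self):
        msg, done = self.q.get()
        if done is not None:
            done.set()        # release a blocking sender, if any
        return msg

chan = Channel()
chan.send_async("hello")      # chan!(...) -- doesn't block
print(chan.recv())            # join(chan) -> prints hello
```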

a lot of this though was in the 2004 version of the language (the VM was 
later re-implemented, twice), and some hasn't been fully reimplemented 
(the 2004 VM was poorly implemented and very slow).


the async-block syntax was added later, and partly built on the concept 
of async calls.



but, yeah, probably a lot of people here have already seen stuff like 
this before.





On 6/16/2012 1:23 PM, BGB wrote:

On 6/16/2012 10:05 AM, Randy MacDonald wrote:
@BGB, if the braces around the letters defers execution, as my 
memories of Perl confirm, this is perfect.  With APL, quoting an 
expression accomplishes the same end: '1+1'




no, the braces indicate a code block (in statement context), and it 
is the "async" keyword which indicates that there is deferred 
execution. (in my language, quoting indicates symbols or strings, as 
in "this is a string", 'a', or 'single-quoted string', where "a" is 
always a string, but 'a' is a character-literal).


in an expression context, the braces indicate creation of an ex-nihilo 
object, as in {x: 3, y: 4}.


the language sort-of distinguishes between statements and 
expressions, but this is more relaxed than in many other languages 
(it is more built on context than on a strict syntactic divide, and 
in most cases an explicit return is optional since any 
statement/expression in tail position may implicitly return a value).



the letters in this case were just placeholders for the statements 
which would go in the blocks.


for example:
if(true)
{
    printf("A\n");
    sleep(1000);
    printf("B\n");
    sleep(1000);
}
printf("Done\n");

executes the print statements synchronously, causing the thread to 
sleep for 1s in the process (so, "Done" is printed 1s after "B").


and, with a plain "async" keyword:
async {
    sleep(1000);
    printf("A\n");
}
printf("Done\n");

will print "Done" first, and then print "A" about 1 second later 
(since the block is folded into another thread).


technically, there is another operation, known as a join.

var a = async { ... };
...
var x = join(a);

where the join() will block until the given thread has returned, 
and return the return value from the thread.
generally though, a join in this form only makes sense with a 
single argument (and would be implemented in the VM using a special 
bytecode op).


an extension would be to implicitly allow multiple joins, as in:
join(a, b, c);//wait on 3 threads
except, now, the return value doesn't make much sense anymore, and 
likewise:

join(
async{A},
async{B},
async{C});
is also kind of ugly.

in this case, the syntax would be:
async! {A}{B}{C};
although this could also work:
async! {A}, {B}, {C};

either would basically mean async with join, and essentially mean 
something similar to the 3-way join (basically, as syntax sugar). it 
may also imply we don't really care what the return value is.


basically, the "!" suffix has ended up on several of my keywords to 
indicate alternate forms, for example: "a as int" and "a as! int" 
will have slightly different semantics (the former will return null 
if the cast fails, and the latter will throw an exception).



but, since I got to thinking about it again, I started writing up 
more of the logic for this (adding multiway join logic, ...).





On another note, I agree with the thesis that OO is just message 
passing:


  aResult ← someParameters 'messageName' to anObject ⍝⍝ so, once 
'to' is defined, APL does OO.


I was thinking 'new' didn't fit, but

   'new' to aClass

convinced me otherwise.

It also means that 'object oriented language' is a category error.



my language is a bit more generic, and loosely borrows much of its 
current syntax from JavaScript and ActionScript.


however, it has a fair number of non-JS features and 

Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread Wesley Smith
 If things are expanding then they have to get more complex, they encompass
 more.

Aside from intuition, what evidence do you have to back this statement
up?  I've seen no justification for this statement so far.  Biological
systems naturally make use of objects across vastly different scales
to increase functionality with a much less significant increase in
complexity.  Think of how early cells incorporated mitochondria whole
hog to produce a new species.

Also, I think talking about minimum bits of information is not the
best view onto the complexity problem.  It doesn't account for
structure at all.  Instead, why don't we talk about Gregory Chaitin's
[1] notion of a minimal program.  An interesting biological parallel
to compressing computer programs can be found in looking at bacteria
DNA.  For bacteria near undersea vents where it's very hot and genetic
code transcriptions can easily go awry due to thermal conditions, the
bacteria's genetic code has evolved into a compressed form that reuses
chunks of itself to express the same features that would normally be
spread out in a larger sequence of DNA.
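The bacteria example can be made concrete with a quick compression
experiment: a sequence that reuses chunks of itself needs far fewer bits
than a scrambled sequence of the same length (a rough stand-in for
Chaitin-style minimal-program size; the sequences below are invented):

```python
import random
import zlib

random.seed(0)

# a "genome" that reuses chunks of itself, vs. a scrambled sequence
# drawn from the same alphabet and of the same length (700 bytes each)
repetitive = b"GATTACA" * 100
scrambled = bytes(random.choice(b"ACGT") for _ in range(700))

print(len(zlib.compress(repetitive)))   # far below 700: reuse is exploited
print(len(zlib.compress(scrambled)))    # much larger: little reuse to exploit
```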

wes

[1] http://www.umcs.maine.edu/~chaitin/


Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread Miles Fidelman

Wesley Smith wrote:

If things are expanding then they have to get more complex, they encompass
more.

Aside from intuition, what evidence do you have to back this statement
up?  I've seen no justification for this statement so far.


As I recall, there was a recent Nobel prize that boiled down to: 
Increase the energy flowing into a system, and new, more complex, 
behaviors arise.

Biological
systems naturally make use of objects across vastly different scales
to increase functionality with a much less significant increase in
complexity.  Think of how early cells incorporated mitochondria whole
hog to produce a new species.


Encapsulating complexity (e.g., in mitochondria) doesn't eliminate 
complexity.  Encapsulation and layering MANAGE complexity, allowing new 
layers of complexity to be constructed (or emerge) through combinations 
of more complicated building blocks.


Miles Fidelman


--
In theory, there is no difference between theory and practice.
In practice, there is.  -- Yogi Berra



Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread BGB

On 6/16/2012 1:39 PM, Wesley Smith wrote:

If things are expanding then they have to get more complex, they encompass
more.

Aside from intuition, what evidence do you have to back this statement
up?  I've seen no justification for this statement so far.  Biological
systems naturally make use of objects across vastly different scales
to increase functionality with a much less significant increase in
complexity.  Think of how early cells incorporated mitochondria whole
hog to produce a new species.


in code, the latter example is often called copy / paste.
some people demonize it, but if a person knows what they are doing, it 
can be used to good effect.


a problem is partly how exactly one defines "complex":
one definition is in terms of visible complexity, where basically 
adding a feature causes code to become harder to understand, more 
tangled, ...


another definition, apparently more popular among programmers, is to 
simply obsess on the total amount of code in a project, and just 
automatically assume that a 1 Mloc project is much harder to understand 
and maintain than a 100 kloc project.


if the difference is that the smaller project consists almost entirely 
of hacks and jury-rigging, it isn't necessarily much easier to understand.


meanwhile, building abstractions will often increase the total code size 
(IOW: adding complexity), but consequently make the code easier to 
understand and maintain (reducing visible complexity).


often the code using an abstraction will be smaller, but usually adding 
an abstraction will add more total code to the project than that saved 
by the code which makes use of it (except past a certain point, namely 
where the redundancy from the client code will outweigh the cost of the 
abstraction).
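that break-even point is simple arithmetic: an abstraction pays for 
itself once the per-call-site savings outweigh its own size. a toy 
calculation (all numbers invented for illustration):

```python
def net_change(abstraction_loc, saved_per_site, call_sites):
    # change in total project size from introducing an abstraction:
    # positive means the project grew, negative means it shrank
    return abstraction_loc - saved_per_site * call_sites

# a 200-line helper layer that trims 10 lines from each call site
print(net_change(200, 10, 5))    # few users: project grows (+150)
print(net_change(200, 10, 50))   # many users: project shrinks (-300)
```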



for example:
MS-DOS is drastically smaller than Windows;
but, if most of what we currently have on Windows were built directly on 
MS-DOS (with nearly every app providing its own PMode stuff, driver 
stack, ...), then the total wasted HD space would likely be huge.


and, developing a Windows-like app on Windows is much less total effort 
than doing similar on MS-DOS would be.




Also, I think talking about minimum bits of information is not the
best view onto the complexity problem.  It doesn't account for
structure at all.  Instead, why don't we talk about Gregory Chaitin's
[1] notion of a minimal program.  An interesting biological parallel
to compressing computer programs can be found in looking at bacteria
DNA.  For bacteria near undersea vents where it's very hot and genetic
code transcriptions can easily go awry due to thermal conditions, the
bacteria's genetic code has evolved into a compressed form that reuses
chunks of itself to express the same features that would normally be
spread out in a larger sequence of DNA.


yep.

I have sometimes wondered what an organism which combined most of the 
best parts of what nature has to offer would look like (an issue being 
that most major organisms seem to be more advanced in some ways and 
less advanced in others).




wes

[1] http://www.umcs.maine.edu/~chaitin/


Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread David Barbour
On Fri, Jun 15, 2012 at 1:38 PM, Paul Homer paul_ho...@yahoo.ca wrote:

 there is some underlying complexity tied to the functionality that
 dictates that it could never be any less than X lines of code. The system
 encapsulates a significant amount of information, and stealing from Shannon
 slightly, it cannot be represented in any less bits.


A valid question might be: how much of this information should be
represented in code? How much should instead be heuristically captured by
generic machine learning techniques, indeterminate STM solvers, or
stability models? I can think of much functionality today for control
systems, configurations, UIs, etc. that would be better (more adaptive,
reactive, flexible) achieved through generic mechanisms.

Sure, there is a minimum number of bits to represent information in the
system, but code is a measure of human effort, not information in general.



 If things are expanding then they have to get more complex, they encompass
 more.


Complexity can be measured by number of possible states or configurations,
and in that sense things do get more complex as they scale. But they don't
need to become more *complicated*. The underlying structure can become
simpler, more uniform, especially compared to what we have today.

Regards,

David

-- 
bringing s-words to a pen fight


Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread Miles Fidelman

BGB wrote:


a problem is partly how exactly one defines complex:
one definition is in terms of visible complexity, where basically 
adding a feature causes code to become harder to understand, more 
tangled, ...


another definition, apparently more popular among programmers, is to 
simply obsess on the total amount of code in a project, and just 
automatically assume that a 1 Mloc project is much harder to 
understand and maintain than a 100 kloc project.


And there are functional and behavioral complexity - i.e., REAL 
complexity, in the information theory sense.


I expect that there is some correlation between minimizing visual 
complexity and lines of code (e.g., by using domain specific languages), 
and being able to deal with more complex problem spaces and/or develop 
more sophisticated approaches to problems.


Miles



--
In theory, there is no difference between theory and practice.
In practice, there is.  -- Yogi Berra



Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread BGB

On 6/16/2012 2:20 PM, Miles Fidelman wrote:

BGB wrote:


a problem is partly how exactly one defines complex:
one definition is in terms of visible complexity, where basically 
adding a feature causes code to become harder to understand, more 
tangled, ...


another definition, apparently more popular among programmers, is to 
simply obsess on the total amount of code in a project, and just 
automatically assume that a 1 Mloc project is much harder to 
understand and maintain than a 100 kloc project.


And there are functional and behavioral complexity - i.e., REAL 
complexity, in the information theory sense.


I expect that there is some correlation between minimizing visual 
complexity and lines of code (e.g., by using domain specific 
languages), and being able to deal with more complex problem spaces 
and/or develop more sophisticated approaches to problems.




a lot depends on what code is being abstracted, and how much code can be 
reduced by how much.


if the DSL makes a lot of code a lot smaller, it will have a good effect;
if it only makes a little code only slightly smaller, it may make the 
total project larger.



personally, I tend not to worry too much about total LOC, and concern 
myself more with how much personal effort is required (to 
implement/maintain/use it), and how well it will work (performance, 
memory use, reliability, ...).


but, I get a lot of general criticism for how I go about doing things...




Re: [fonc] The Web Will Die When OOP Dies

2012-06-16 Thread Pascal J. Bourguignon
John Zabroski johnzabro...@gmail.com writes:

 On Jun 15, 2012 2:39 PM, Pascal J. Bourguignon p...@informatimago.com 
 wrote:

 John Zabroski johnzabro...@gmail.com writes:


  Sorry, you did not answer my question, but instead presented excuses
  for why programmers misunderstand people.  (Can I paraphrase your
  thoughts as, "Because people are not programmers!") 

 No, you misunderstood my answer:
 Because people don't pay programmers enough.

 In the words of comedian Spike Milligan, "All I ask is for the chance to 
 prove money can't make me happy."

 But my motto comes from pianist Glenn Gould: the ideal ratio of performers to 
 audience is one. I have never seen a software team produce better results 
 with better pay, but most of
 the great advances in software came from somebody doing something differently 
 because any other way was simply wrong.

 Having seen millionaires throw their money around to build their dream app 
 (the Chandler project featured in Scott Rosenberg's book Dreaming in Code and 
 all of Sandy Klausner's
 vaporware graphical programming ideas), and seeing what road blocks still 
 remained, I disbelieve your answer.

 Who invented the spreadsheet? One person.
 Who invented pivot tables? One person.
 Who invented modeless text editing? One person.

 How much money is enough, anyway?  In the words of John D. Rockefeller, "A 
 little bit more."

I wasn't speaking of the work of art programmers would do anyway.

I was speaking of what the customers want.  If they want to have the
same services as offered by plumbers (you don't hold the spanner for the
plumber, and you don't bring your own pipes; you don't get wet; you just
call him, and let him deal with the leak: a simple and nice user
interface, a good end-result, including the hefty bill), then you'll have
to pay the same hourly rates as what you pay to plumbers.  Just google
some statistics.


-- 
__Pascal Bourguignon__ http://www.informatimago.com/
A bad day in () is better than a good day in {}.