Re: [fonc] About the reduce of complexity in educating children to program

2014-09-19 Thread Pascal J. Bourguignon
Iliya Georgiev  writes:

> Hello,
> I am addressing this letter mainly to Mr. Alan Kay and his fellows at
> VPRI. I have an idea for how to reduce complexity in educating children
> to program. This seems to be part of a goal of VPRI "to improve
> 'powerful ideas education' for the world's children".
>
> But in case my idea turns into a success, a moral hazard emerges. If
> the children (6-14 years old) understand things better and can even
> program, can they become victims of labor exploitation? Up to now
> they could be exploited physically. From now on they could be
> exploited mentally. OK, in the north, in so-called developed countries,
> they may be protected, but in the south...
>
> On the other side, don't we owe to the tomorrow people the
> possibility to understand the world we leave to them? Or they will be
> savages that use tools, but do not know how they work.
>
> So if you want to bear the burden of the moral hazard, I will send
> the description of my idea to you and help with what I can. You will
> judge whether it is worth doing. It would be easier if people
> worked cooperatively. That is a lesson children should learn too. The
> software could be made by one person, but there may be
> more challenges than one thinks. In case you agree to do it, I will
> want you to publish the results of the experiment online. And if
> possible, to make the program run in a web browser and
> to release it freely too, just as you did in some of your recent
> experiments.
>
> It is strange that, unlike most scientists, I will be equally happy
> with the success and failure of my idea.

This is a choice only you can make (or a trusted friend who could keep
it secret, but anything known by more than one person is not a secret
anymore).

In my experience, even dangerous ideas that you refrain from
communicating are soon discovered and published by others.

Or rather, there are some people whose sole purpose is to find and
exploit anything and any idea they can, and they may well have
outguessed you already.

So if your idea can do some good, and if there are some good people
who may use it for good instead of wielding it as an evil weapon,
perhaps it would be worth sharing.

-- 
__Pascal Bourguignon__ http://www.informatimago.com/
“The factory of the future will have only two employees, a man and a
dog. The man will be there to feed the dog. The dog will be there to
keep the man from touching the equipment.” -- Carl Bass CEO Autodesk
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] About the reduce of complexity in educating children to program

2014-09-19 Thread Casey Ransberger
Hello Iliya.

While you directed your inquiry to the people at VPRI (and I'm not there), I 
hope you'll forgive my curiosity. 

Questions inline, and hopefully not too many of them. 

> On Sep 19, 2014, at 2:16 AM, Iliya Georgiev  wrote:
> 
> Hello,
> I am addressing this letter mainly to Mr. Alan Kay and his fellows at VPRI. I 
> have an idea for how to reduce complexity in educating children to program. 
> This seems to be part of a goal of VPRI "to improve 'powerful ideas 
> education' for the world's children".
> 
> But in case my idea turns into a success, a moral hazard emerges. If the 
> children (6-14 years old) understand things better and can even program, can 
> they become victims of labor exploitation?

All stop. Anyone can be exploited. It just takes a big enough gang of 
exploitive people. Is this really a question, or is it more of a statement?

> Up to now they could be exploited physically. From now on they could be 
> exploited mentally. OK, in the north, in so-called developed countries, they 
> may be protected, but in the south...
> 
> On the other side, don't we owe to the tomorrow people the possibility to 
> understand the world we leave to them? Or they will be savages that use 
> tools, but do not know how they work. 

I made a half-assed promise at the top to only ask questions, but your phrasing 
here struck me as stunningly beautiful. "The tomorrow people." If I'd written 
that, I'd be inclined to capitalize and underline the words. It'd be a great 
title for a science fiction novel. 

> So if you want to bear the burden of the moral hazard, I will send the 
> description of my idea to you and help with what I can.

All stop. Why must I decide to bear a burden before I read your words? What 
risk is there in sharing them without contract?

> You will judge whether it is worth doing. It would be easier if people worked 
> cooperatively. That is a lesson children should learn too. The software could 
> be made by one person, but there may be more challenges than one thinks. In 
> case you agree to do it, I will want you to publish the results of the 
> experiment online. And if possible, to make the program run in a web browser 
> and to release it freely too, just as you did in some of your recent experiments. 

Isn't that asking a bit much? Wouldn't asking for permission to publish your 
results yourself be enough?

> It is strange that, unlike most scientists, I will be equally happy with the 
> success and failure of my idea.

Ouch. Yeah, I see what you're trying to say. Once again, I've failed to ask a 
question. What I'll say instead: I've never known a good scientist who would 
not be happy to know that her hypothesis was incorrect, because in doing so, 
she's learned something about the universe and our place in it, which is what 
she set out to do in the first place. 

> Best regards,
> 
> Iliya Georgiev

Hope I haven't created unnecessary noise with this post. Once again, I hope 
you'll forgive my curiosity!

--Casey


[fonc] About the reduce of complexity in educating children to program

2014-09-19 Thread Iliya Georgiev
Hello,
I am addressing this letter mainly to Mr. Alan Kay and his fellows at VPRI.
I have an idea for how to reduce complexity in educating children to program.
This seems to be part of a goal of VPRI "to improve 'powerful ideas
education' for the world's children".

But in case my idea turns into a success, a moral hazard emerges. If the
children (6-14 years old) understand things better and can even program,
can they become victims of labor exploitation? Up to now they could be
exploited physically. From now on they could be exploited mentally. OK, in
the north, in so-called developed countries, they may be protected, but in
the south...

On the other side, don't we owe to the tomorrow people the possibility to
understand the world we leave to them? Or they will be savages that use
tools, but do not know how they work.

So if you want to bear the burden of the moral hazard, I will send the
description of my idea to you and help with what I can. You will judge
whether it is worth doing. It would be easier if people worked cooperatively.
That is a lesson children should learn too. The software could be made by
one person, but there may be more challenges than one thinks. In case you
agree to do it, I will want you to publish the results of the experiment
online. And if possible, to make the program run in a web browser and to
release it freely too, just as you did in some of your recent experiments.

It is strange that, unlike most scientists, I will be equally happy with the
success and failure of my idea.

Best regards,

Iliya Georgiev