On 26/07/2011, at 1:43 AM, Igor Stasenko wrote:

> (quotes are broken)
> 
> On 25 July 2011 16:26, Julian Leviston <[email protected]> wrote:
>> 
>> On 26/07/2011, at 12:03 AM, Igor Stasenko wrote:
>> 
>>> In contrast, as you mentioned, the TCP/IP protocol, which is the
>>> backbone of today's internet, has a much better design.
>>> But I think this is a general problem of software evolution. No matter
>>> how hard you try, you cannot foresee all the kinds of interactions,
>>> features and use cases for your system when you are designing it from
>>> the beginning. Twenty years ago, systems had completely different
>>> requirements compared to today's ones, so what was good enough 20
>>> years ago is not very good today.
>> 
>> That makes no sense to me at all. How were the requirements radically
>> different?
>> I still use my computer to play games, communicate with friends and family,
>> solve problems, author text, make music and write programs. That's what I
>> did with my computer twenty years ago. My requirements are the same. Of
>> course, the sophistication and capacity of the programs have grown
>> considerably... so has the hardware... but the actual requirements haven't
>> changed much at all.
>> 
> 
> If the capacity of programs has grown, then there was a reason for it
> (read: requirements)?
> Because if you are stating that you have the same requirements as 20
> years ago, then why aren't you using those old systems,
> but instead using today's ones?
> 

Well, Igor, if something more efficient comes along, I will use it, and it will 
*probably* work just fine on 20-year-old hardware... because *my* requirements 
haven't changed much. I will grant you that it's probably going to be quite 
hard to get a Commodore 64 connected to a router, because it's not very 
compatible, but what I'm trying to say here is that most of the "requirements" 
you're talking about are actually self-imposed by our computing systems. Having 
something that can do 2.5 million instructions per second is ludicrous if all I 
want to do is type my document, isn't it? Surely any machine should be able to 
handle typing a document. ;-) (Note here, I'm obviously ignoring the fact that 
nowadays we have Unicode.)

What I'm getting at is that *MY* requirements haven't changed much. I still 
want to send a communication to my mother every now and then, and I still want 
to play games. In fact, for some of my favourite games I actually use 
emulators... emulators of 20-year-old hardware, so I can play the games that 
will not run on today's machines ;-)

One of my favourite games is Tetris Attack, which my friend and I play on his 
XBOX (the original, not the 360) in a Super Nintendo emulator...

Do you find that amusing? I sure as hell do. :)

But I digress - my intentions are relatively similar to what they were 20 years 
ago... I like to write programs, and I like to use programs to draw, and I like 
to listen to music, solve problems, create texts, make music... etc. The 
IMPLEMENTATIONS of how I go about this are vastly different, and so if you 
like you can bend "requirements" towards a systems view of requirements... and 
then I will agree with you: the requirements I have of my computer today in 
terms of TECHNICAL requirements are vastly different, but in terms of 
interpersonal requirements they're not at all different - maybe slightly...

Making music satisfies a creative impulse in me, and I can make it using my 
$10,000 computer system that I have today, or I can satisfy it using a 
synthesizer from the 80's. One of them does a vastly better job for me, but 
this is a qualitative issue, not a requirements issue ;-)

> Speaking of requirements, today's browser (Firefox) running on my
> machine takes more than 500 MB of system memory.
> I have no idea why it consumes that much... the fact is that you
> cannot run it on any 20-year-old personal computer.
> 
> 

Well, this is the point of the STEPS project and the like - get rid of the 
cruft, and we will have an optimized system that runs like lightning on our 
current-day processors, with all their amazing amounts of memory.

>>> And here is the problem: it is hard to radically change software,
>>> especially core concepts, because everyone is using it and has got
>>> used to it, because it has been made a standard.
>>> So you have to maintain compatibility and invent workarounds, patches
>>> and fixes on top of existing things, rather than radically change the
>>> landscape.
>> 
>> I disagree with this entirely. Apple manage to change software radically...
>> by tying it to hardware upgrades (speed/capacity in hardware) and other
>> things people want (new features, ease of use). Connect something people
>> want with shifts in software architecture, or make the shift painless and
>> give some kind of advantage, and people will upgrade - so long as the
>> upgrade doesn't somehow detract from the original, that is. Of course, if
>> you don't align something people want with the software, people won't
>> generally upgrade.
>> 
> 
> Apple can do whatever they want with their own proprietary hardware
> and software, as long as it's their own.
> Now try to repeat the same in the context of the Web.
> Even if Apple rewrites Safari 5 times per year, they will
> still have to support HTTP, HTML, JavaScript, etc.
> So, you miss my point.

Yes, Apple can, and to a large degree they ARE doing this. Their iOS platform 
is their best attempt yet at building an infrastructure of code that runs 
across the internet but isn't the web, doesn't rely on the web, and yet uses 
the internet for its communications mechanism (i.e. not necessarily the web).

I'm not really missing your point. ;-) I turned Adobe Flash off in my main 
browser a while back, and that's been an interesting experience... seeing how 
many people have put all their "data" into that technology (for example, 
ordering a pizza from Pizza Hut in Australia is impossible without Flash) ;-) I 
can do it with an iPhone, though ;-) And yeah, I'm aware they both use HTTP.

I guess my question is... what's stopping an alternative, replacement, 
backward-compatible protocol from taking over where HTTP and HTTPS leave off? 
And what would that protocol do? One of the issues is surely the way our 
router-system structure is in place... if there were going to be a replacement 
for the web, it would *have* to end up being properly web-based (down to the 
packet level), surely... because I simply hate the fact that if three people in 
my house request the front page of the Financial Times, our computers all have 
to go and fetch it separately. Why don't the other two get it from the first 
one, or at the very least, from the router?
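To make the household-cache idea concrete, here's a minimal sketch: the first 
request for a URL goes over the network, and every later request is answered 
locally. (The `fetch_from_network` helper and the URL are hypothetical 
stand-ins, purely for illustration - a real shared cache would also have to 
deal with freshness, expiry and privacy.)

```python
# Minimal sketch of a shared, household-level cache: the first request
# for a URL hits the network; later requests are answered locally.

cache = {}
network_fetches = 0  # counts how often we actually go out to the network

def fetch_from_network(url):
    """Hypothetical stand-in for a real HTTP fetch."""
    global network_fetches
    network_fetches += 1
    return "<html>front page of %s</html>" % url

def request(url):
    if url not in cache:                 # cache miss: go to the network once
        cache[url] = fetch_from_network(url)
    return cache[url]                    # every later request is local

# Three people in the house request the same front page...
for _ in range(3):
    page = request("ft.com/")

# ...but the network is only consulted once.
print(network_fetches)  # → 1
```

This is essentially what a caching proxy at the router would do; the hard part 
in practice is deciding when a cached copy is still valid.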

I really think it's necessary to have a disconnect between intention and 
implementation. That would let us be clear about best practices for 
implementation in connection with a particular intention. As computer 
programmers, we rarely focus on the separation between the two, but they're 
intimately related and yet also quite definitely separate. Separating them 
lets us allow for differences and accept varying methods.

For example... my intention, perhaps, is to make coffee. I have my way of 
making it with my espresso machine, which I really love. I have a housemate who 
loves his coffee a certain way. I don't like it when it's made like that - 
that's his implementation of his intention to enjoy a cup of coffee. But I know 
how to make it so he loves it, possibly better than HE can make it (wow, that's 
an interesting thing, isn't it?), because I understand his intention: he loves 
his coffee with just a certain balance of coffee, sugar, water and milk, and I 
know this from making him many cups of coffee. We simply have different 
implementations of making him a cup of coffee: I use an espresso machine, he 
uses a filter machine. If he asked me to make it using his filter machine, I 
could easily do that... and still have the same intention - to make him a cup 
of excellent coffee... :)

Do you see? Now, we don't have this in computing. We need it desperately, 
because computers can actually mostly handle implementations quite well (see 
LLVM) so long as they have their intentions carefully communicated to them.

We don't even have languages of intention - just languages of implementation. 
We're left to "abstract out" the intention from reading the implementation.

WHAT A FUCKING JOKE. (Please excuse the swearing - it's there for extreme 
emphasis, not rudeness.)
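One way to glimpse the gap: the same intention - "the names of everyone 18 or 
over" - written once as a step-by-step implementation and once as a declarative 
expression that stays closer to the intention. (A toy illustration of the 
distinction, not anyone's proposed intention language.)

```python
# The same intention - "the names of everyone 18 or over" - two ways.
people = [("Ada", 36), ("Ben", 12), ("Cleo", 64)]

# Implementation-first: we spell out HOW (iterate, test, append).
names_imperative = []
for name, age in people:
    if age >= 18:
        names_imperative.append(name)

# Closer to intention: a declarative expression says WHAT we want;
# the iteration strategy is left to the language runtime.
names_declarative = [name for name, age in people if age >= 18]

print(names_imperative)   # → ['Ada', 'Cleo']
print(names_declarative)  # → ['Ada', 'Cleo']
```

Even here, of course, both are still implementations - the reader has to 
abstract the intention back out, which is exactly the complaint above.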

Here's another interesting thing... If I go to a hotdog vendor, that's one 
method of satisfying my hunger... I can call it an implementation of satisfying 
my hunger. But I can also have different implementations of going to a hotdog 
vendor... (do you see where I'm going with this?) Intention and implementation 
are recursive, overlapping, enclosing concerns.

If my intention is to satisfy my hunger and nothing more, then a hotdog vendor 
will do just as well as a five-star restaurant. Perhaps even better, because I 
don't have to follow any protocols on HOW I eat... If my intention is to 
satisfy my hunger AND to give my body a wonderful set of nutrients, then 
perhaps I can't go to the hotdog vendor anymore, and a different set of options 
(implementations) arises...

I "throw" this stuff out there, but this is the most important thought work 
I've done in my entire life. I think it's vastly important, and most humans 
ignore it entirely.

Julian.



_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
