I have always regarded Ursula Le Guin's *The Dispossessed* 
<http://en.wikipedia.org/wiki/The_Dispossessed> as one of the greatest SF 
novels ever written. In her depiction of the anarchist society of Anarres, 
the whole administration of practical organisation is carried out by 
computers. This serves to take a major component of the exercise of power 
out of the area of human relations.

"Rule" is basically the exercise of power. The will to power seems to be 
one of the strongest human urges - indeed, it's wider than just human - 
take the constant jostling for rank and status in a wolf-pack, for example. 
I suspect most of us involved in this forum are freaks, as we don't seem 
to possess much of it. Personally I don't get it, but I must 
acknowledge that it seems to be (and always has been) an immensely strong 
driving force for a lot of people.

Our concepts of freedom and autonomy make me instinctively and immediately 
suspicious of the idea of "rule by machine". But then, on 
reflection, I'm already being "ruled" by all sorts of shadowy 
people/groups/elites, who daily make all sorts of decisions which have huge 
effects on the life I live and who certainly don't have any sense of my 
well-being in mind (apart from that portion of my material assets which is 
part of a pension fund/savings/investment fund/life insurance - which then 
has the notionally privileged status of being the object of 
shareholder-value). Could machines fuck things up any worse than humans do 
at the moment?

I wonder if there aren't some deep neurotic guilt/fear things at work here. 
There's the old story of the Sorcerer's Apprentice, whom Goethe has 
despairingly crying out: "Herr, die Noth ist groß! Die ich rief, die 
Geister, Werd’ ich nun nicht los. [Master, I'm in deep shit here! I can't 
get rid of the fucking spirits I summoned!]" Or the idea that when the 
Singularity <http://en.wikipedia.org/wiki/Technological_singularity> comes, 
the first thing the machine intelligences will do is get rid of us for 
being hopelessly corrupt and imperfect.

It's the feeling that we're giving over control to something else - 
something we may try to programme so that it is benevolent towards us - but 
where there are no guarantees. But what guarantees do we have right now? 
And who controls?
 

On Monday, 9 March 2015 at 08:44:34 UTC+1, archytas wrote:
>
> Human leadership is corrupt.  The history is clear.  We form empires of 
> violence.  Even before WW1, around 1911 with the Italian invasion of 
> part of the declining Ottoman Empire, we had a population the planet could 
> manage, new technologies that could have released us from work serfdom and 
> the potential to grow green and surpass our libidinal-violent biology. 
>  Instead we went to war and have over-populated like a bacterial colony 
> poisoning itself.  This war to end all war led to another one, largely 
> about exhausting the Wehrmacht on Soviet forces.  I have no idea how these 
> wars started, interesting given how much education I've had.  The Americans 
> won and everyone else lost, but Americans generally wanted no part of the 
> stuff.  Various fables on cause make no sense.  Much can be said on this, 
> yet we evade the fairly obvious reality that human society is generally 
> dire.  About 250,000 of the 400,000 inhabitants of the zenith of the 
> Athenian democracy were slaves, and slaving was the major Black Sea 
> industry from then until 1870.
>
> Machines could help us get over ourselves and establish a rational 
> society.  This would be a rebellion to remove the allocation class that 
> owns nearly everything a monetary value can be put on.  We would embody 
> knowledge in the machines (we already do) and rely on their genuine 
> rationality instead of our faux version, corrupted by our libidinal-violent 
> biology. Most people are very scared of intelligent machines and rather 
> like the idea humans are superior because we can remove their plugs.  We 
> worry they will destroy us in a world with 8,000 nuclear weapons in safe 
> human hands that are not problematic.  Genghis Khan killed about a third of 
> his known world's population.
>
> Why do we hate machines so much?  Do we fear their rationality shames us? 
>  We are all now chronically ignorant compared with extra-somatic databases. 
>  Maybe we fear control by machines operating in the interests of a small 
> group or police state - yet this 'machine' is already in place as a 
> socio-technical human endeavor: the allocation class with real power that 
> we can't vote out.  We could change a lot if we weren't so naff about this. 
>  Anyone here even think about it?
>
> In terms of data, what we chatter about changes as data when we are not 
> actually interested in large-scale human change. 
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"Minds Eye" group.