On 3/20/2013 4:04 PM, meekerdb wrote:
> On 3/20/2013 10:59 AM, Stephen P. King wrote:
>> On 3/20/2013 6:26 AM, Bruno Marchal wrote:
>>>> On 19 Mar 2013, at 23:40, meekerdb wrote:
>>>> I think it likely that the first applications will be providing
>>>> soldiers with augmented senses and communication. Just as AI
>>>> research has been funded by the military. Threats of war are often
>>>> used to justify bypassing ethical considerations and rushing into ill
>>>> considered projects.
>>> Sadly very plausible.
>> I would claim that it is not only plausible but inevitable, given
>> the current reluctance in the West to commit humans in person to
>> the task of destroying its enemies.
> You write 'current reluctance' as though it were different in the past
> and might change in the future. The obvious reason for this
> reluctance is that if you commit humans to the task then they are more
> exposed to risk. Only an irrational society would risk its members.
I am trying to be optimistic. You make a good point, as it shows the
irrationality of current policies. My argument is that severing the
immediate physical connection between actions and actors leads
inevitably to objectification of 'the enemy' and a general reduction in
the reluctance to take extreme measures against them. Warfare becomes
indistinguishable from playing an FPS game. We see a very real example of
this in the current US policy of drone usage. Are we training our
children to be 'remote control killers' by allowing them to play FPS games?
What happens when we implement full synthetic sapience in drones?
You received this message because you are subscribed to the Google Groups
"Everything List" group.