BK> But more importantly, as the people who build the tools that make this
BK> possible, what is our role in deciding what is and isn't too far? I
BK> don't want to drag in the emotional intensity of this comparison, but
BK> it is similar to the scenario of arms dealing. Guns can be used to
BK> liberate or to tyrannize, and so can big data. Don't we have some
BK> level of obligation to at least consider the consequences of the
BK> technology we are providing and who we are providing it to?
I'd say: If you help someone do something, then you share some of the
responsibility for what they do.
If you think that what they're doing is sort of dubious, or not so good,
or very wrong, or really awful, then you certainly need to think about
whether you're ok with that.
And to consider your alternatives: If you can't afford to lose your job,
you may sometimes have to do things you'd rather not. If that's not
acceptable to you, then you've got a really hard choice to make.
I don't think you can say "hey man, I just run the computers". If someone
uses those computers to do something horrible without your knowledge, then
sure, you can't be expected to have done anything about that. But if you
know that the computers you run are being used to do horrible things, then
you need to either be ok with saying (at least to yourself) "I am helping
do horrible things", or stop doing that.
Which way you go is a very personal choice, and I'm not going to tell
anyone that it's always right to quit a job that doesn't 100% align with
your moral sense, or vice versa. But I do think it's always right to
be honest with yourself (and, ideally, other people) about what you're doing.
-Josh ([email protected])