On Tue, Apr 27, 2021 at 09:04:53AM -0400, Nico Kadel-Garcia wrote:
> The movie is, itself, profoundly biased. It didn't explore at all why
> a public housing project might benefit from cameras on the door of a
> densely populated building with numerous poor, old, or unhealthy
> tenants.
On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson <[email protected]> wrote:
> Data does not remove bias. And one can and should both read the article and
> watch the movie.

I imagine the camera was helpful for young white men entering the
building. But how does that help the old non-white women who are
locked out of their apartments because the software fails 30% of the
time for them? An automated camera makes sense if the software is
perfect, but the point of the film and the paper is that the software
is not perfect.

An automated camera should work BETTER than an adept (and unbiased)
human being paying attention to a monitor, not just CHEAPER. "Dancing
bearware" that only pretends to do the job should result in
prosecution and hefty penalties for the software designers and
decision makers if their "cost saving" replacement of trained human
security guards results in a crime ... letting a criminal in, or
locking a tenant out to be robbed on the doorstep.

I too was bothered by the film's seeming "lefty bias", but "my side"
is *human achievement* ... engaging all 8 billion of us. Leaving
people out is economically suboptimal, but most organizations are
insensitive to the costs they impose outside their organization. We
create (often bad) laws to internalize those costs so the
organizations MUST pay attention. Sadly, laws usually just make
organizations pay attention to loopholes.

If the automated cameras are redesigned to do their job perfectly, I
would love that. If diligent security guards are trained and employed
for the task, I'm for that as well. What I am not for is replacing
quality human effort with slapdash "cost cutting", which often means
"job cutting". In this case, that means putting human security guards
on the dole, or not hiring enough competent software designers to
properly design and properly TEST recognition software that works for
everyone, not just software designers and ethnically similar product
purchasers.

I imagine a room full of $10/hour Chinese programmers designing this
software for Chinese customers. I bet their software would do a good
job recognizing blacks in the US if there were enough blacks in China
to test their software with. Offshoring has its costs as well, and the
point of the film is that the costs are imposed on those least able to
pay them.

I also imagine devolving software purchasing decisions downwards to
the people who are affected by them. In my ideal world, some of those
tenants would be involved in testing and selecting the software. Or
tenants could be trained to look at security cameras as a part-time
job, perhaps for a rent reduction. Software might be used to ensure
that those "informal employees" are doing the task they are paid for,
but that could have bias as well.

Quality is hard work. Nobody is perfect, and some folks are quite
imperfect - thugs in Armani suits. Automating imperfection is the
opposite of quality, while designing to compensate for imperfection is
the path to continuous quality improvement.

But hey, I'm a chip designer. I invented a circuit that is used to
ultra-cheaply identify individual electronic devices, WITHOUT
identifying the individuals using them. I wrote A LOT of Linux
software to test and improve those designs; we got the failure rate
down below 30 parts per million, and we designed fallbacks for the
unhappy 30. The customers would have accepted worse - but I would not.

At the end of the day, our professional satisfaction rests on what we
have accomplished, not on what we are paid for it.
If you are just doing it for the money, please go into investing, not
into engineering.

Keith

--
Keith Lofstrom  [email protected]
