The science behind automated behavior control: https://captology.stanford.edu/
On 1/15/21, Stefan Claas <[email protected]> wrote:
> Thanks for the info, much appreciated!
>
> I also raised concerns publicly a while ago on 'social' media platforms,
> remembering Professor Weizenbaum's (R.I.P.) ELIZA ...
>
> Best regards
> Stefan
>
> On Fri, Jan 15, 2021 at 9:08 PM Karl <[email protected]> wrote:
>>
>> https://www.ajl.org/
>>
>> In today's world, AI systems are used to decide who gets hired, the
>> quality of medical treatment we receive, and whether we become a
>> suspect in a police investigation. While these tools show great
>> promise, they can also harm vulnerable and marginalized people, and
>> threaten civil rights. Unchecked, unregulated and, at times, unwanted,
>> AI systems can amplify racism, sexism, ableism, and other forms of
>> discrimination.
>>
>> The Algorithmic Justice League's mission is to raise awareness about
>> the impacts of AI, equip advocates with empirical research, build the
>> voice and choice of the most impacted communities, and galvanize
>> researchers, policy makers, and industry practitioners to mitigate AI
>> harms and biases. We're building a movement to shift the AI ecosystem
>> towards equitable and accountable AI.
