And so the “Alexa” conversation now goes to a whole new level. This is a solid, sober piece of reporting that needs to be taken seriously by anyone who cares about the welfare of human beings as our life world becomes increasingly infiltrated by Giant Tech. I am especially looking at the many, many people who are calling themselves “AI Ethicists.” Here’s your opportunity to do some concrete problem solving.
Personally, I am hoping that this story and the Alexa conversation will push us to stop naively ceding to Giant Tech decisions that affect our lives in very deep ways. We need to get a lot more involved, rather than remain in our current infantile state as passive receivers of what a very small number of people decide — sometimes with little thought, or no thought at all, as the “Alexa” naming example shows — on matters that cause harm to a large number of flesh-and-blood human beings.
Think: we have evolved to protect ourselves from physical harm in products (asbestos, tobacco, mercury, lead, the absence of seat belts, etc.). It is time we took our mental health as seriously as our physical health. This is, in my view, the next step in our evolution as members of a society that doesn’t tolerate easy, self-serving bamboozlement by heavily monied interests for whom the suffering and death of human beings is just “the cost of doing business.” If we accept this arrangement, then we have only ourselves to blame.
Last point: This is NOT a call to back off the wonderful technology of smart speakers. Smart speakers and voice assistants deliver tremendous value. (And my livelihood depends on their success too.) It is instead a call to wake up and push for a new paradigm in which these Giant IT companies — which know so much about us and affect us in so many ways — are engaged by us, the consumers, in a way that is not paternalistic.