Artificial Intelligence is a ruse, a red herring, a canard, an elaborate diversion from more pressing and contested issues. It is used to justify decisions that have already been made, or that don’t matter. It is the modern version of manifest destiny, the white man’s burden or the three-fifths clause of the Constitution: an excuse for the continuation of privilege, exploitation and dispossession of the weak, poor and marginalized. Ownership of people’s content and data, even data of dubious integrity and economic value, provides access to vast sums of capital for private (usually privileged white male) interests, which use it to build systems that trap those same people in Sisyphean mazes of their own construction, much as the large media conglomerates once did. Meanwhile the powerful forces that shape our world (materially, socially and emotionally) continue unabated.
What important decisions does AI really make? Which advertisements you will see? AI or not, you will still see them, as you always have. Which social services you are eligible for? AI or not, they will still be woefully inadequate and difficult to access. Whether you are considered for a job, a loan, or as a suspect in a crime? AI or not, minorities and the poor will always be more likely to end up in jail than in a corner office. All this concern about fairness raises a deeper question: has the world ever been fair? Was slavery fair? Colonialism? The Holocaust? The caste system? AI and big data are just the modern logic used to justify decisions that have always been made by the powerful in the service of unfair and unequal interests.
Meanwhile, Rome burns. The most important questions of our time do not require sophisticated algorithms or big data to fathom. The ice is melting, the forests are burning, the seas are rising, economic divides are widening, and our political system is not up to the task of putting competent people in charge. Black people continue to be killed, at least when they are not economically useful or complacent. Women are harassed and treated as objects, especially by those in power. We are becoming mentally ill (especially our youth), slowly and alone, or in bursts of violent, misdirected anger. Drugs provide solace and diversion, at least for the few who have access to legal therapies. The rest must make do with other options, until they are addicted or dead. We are running out of fuel, and our infrastructure is crumbling and dated. Our food and transportation systems are inefficient, resource-intensive and unhealthy. Our health systems are wasteful and byzantine, difficult to navigate even for the reasonably wealthy, never mind the poor and unemployed.
The root cause is also not difficult to understand. Our winner-take-all economic system (i.e. capitalism) is designed to grow at all costs, without concern for social, emotional or environmental implications. Our political system does not provide enough checks on this growth, and is in fact captured by the same interests that stand to benefit from its continuation. Both take advantage of superstition and bias to divide, alienate and conquer. Why should AI be any different, when the social and economic forces that create it are one and the same?
Others have questioned whether large-scale AI systems should be built at all, given their ambiguous moral, social, political and economic implications. While that is certainly a worthy question, in my mind there is one that is even more fundamental: wouldn’t we all be better off doing something else with our time and energy, instead of building these systems, or hand-wringing about their consequences? Ever more students at all levels spend their time learning to code, when they could be learning about history and diverse political and economic systems. Vast amounts are spent on systems to monitor and classify, rather than to connect us across our disparate heritages and memories, to arrive at some kind of reconciliation with the injustices of our past, and to collectively envision a better future. Those with a critical consciousness spend their time worrying about the advertisements we are seeing, how our images are being classified, or how algorithms reinforce existing racial, social, political and economic divisions, rather than questioning the very basis of our society and exploring alternative ways of living and being.
It’s not that these issues are unimportant, or undeserving of our attention. I just wonder whether a disproportionate focus on AI progressivism crowds out more fundamental (and radical) approaches to addressing societal challenges. In general, we do not have enough shared context or a vision of the future to counter the powerful and ubiquitous forces that are driving the world. Will big data be the glue that binds us, or the explosive that drives us further apart? Will it be our downfall, or our salvation? Neither, most likely. AI is no doubt complicit in many of these trends, but they did not start with AI, and will not end with it either.
AI is just another symptom of the affliction that pervades our society, an intellectual parlor game to play while the key questions continue to go under-asked. When we look back through the lens of history, I believe that we will see AI as the equivalent of Nero’s fiddle: an expensive and not-as-entertaining diversion that distracted us from the real issues of our time. If we are genuinely interested in using technology for social good, more of us need to start asking the right questions about it: whose interests does it serve? How can it challenge rather than reinforce existing sources of power? And most importantly, how can its resources and attention be marshaled to support alternative social and economic arrangements, and to help us develop a shared context and vision for the future at various scales? The rest is mere implementation detail.
Note: Sebastian Benthall has a smart take on this topic that I saw while writing this, although I disagree with his suggestion that a renewed focus on the quantitative social sciences is the answer. But more on that in a future post.