Code Dependent by Madhumita Murgia: human cost of AI’s data colonialism – review


Unveiling the shadows: a critical exploration of AI's impact on society

by Suswati Basu

We’re all inundated with information about this profound shift in the digital world. From exciting advances in automation to the dark side of job insecurity and deepfakes, artificial intelligence has set the globe alight. Among the various essays, articles, and commentaries on AI travelling the interweb, Women’s Prize for Non-Fiction finalist Madhumita Murgia illuminates a sobering perspective on the digital era’s newest economic engine in her book “Code Dependent”. Labelling this burgeoning landscape as a form of “data colonialism,” the Financial Times AI editor looks into the myriad ways tech giants use human-generated data, often at the cost of those less privileged.

Rating: 5 out of 5.
Code Dependent by Madhumita Murgia, alongside some other recent notable AI books

The book examines an aspect of this shift that has been overlooked in recent years: the immense value extraction by tech behemoths such as Google and Meta, which “have applied machine learning to target advertising as narrowly as possible and grow their worth up to $1tn” – a model seen by some as a new form of empire-building. Identified by Harvard Business School professor emerita Shoshana Zuboff as ‘surveillance capitalism’, it is central to the discourse on data exploitation, illustrating how personal details are commodified under the guise of technological advancement.

“Our blindness to how AI systems work means we can’t properly comprehend when they go wrong or inflict harm – particularly on vulnerable people. And conversely, without knowledge of our nature, ethical preferences, history and humanity, AI systems cannot truly help us all.”

Who is Madhumita Murgia?


Madhumita Murgia is the Artificial Intelligence Editor at the Financial Times. Born in Mumbai, she pursued a degree in biology at Oxford University and contributed to AIDS vaccine research before transitioning to journalism.

Madhumita Murgia. Credit: Matt Round Photography

She earned her Master’s in science journalism from New York University and started her writing career at WIRED magazine. Over the past eight years at the Financial Times, she has focused on the human consequences of technological advancements, consistently making front-page headlines.

In 2017, Murgia was awarded the prestigious Stern-Bryan Fellowship, created by literary agent Felicity Bryan and the Washington Post. She recently won the Science & Technology Journalist of the Year award at the 2024 UK Press Awards for her insightful reporting on generative AI and its impact on business. She has also been highly commended and repeatedly shortlisted in this category.

Her debut book, “Code Dependent,” was released in the UK by Picador in March 2024, and it has been shortlisted for the first-ever Women’s Prize for Non-Fiction.

What is data and cyber colonialism?

Quoting artist James Bridle, Murgia underlines how tech companies “made their money by inserting themselves into every aspect of everyday life.” This intrusion reaches into the most intimate corners of individual creativity and expression, a pervasive overreach that commodifies even our dreams and private conversations.

Generative AI – technology capable of producing human-like text, images and sounds – is presented as part of a historical continuum of exploitation. The technology, as described, isn’t just a marvel: exemplified by tools like ChatGPT, it ushers in an era in which machines can mimic human creativity, yet that innovation rests on a “bedrock of human creativity,” raising questions about the ethics of such systems.

The author discusses the concept of ‘data colonialism’ as explored by sociologists Nick Couldry and Ulises Mejias in their paper “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject”. They compare the relentless data extraction to historical colonial practices, arguing that just as land and resources were historically seized, now data is extracted without equitable compensation to those it is taken from.

This theme resonates particularly in the depiction of gig workers whose lives are tightly controlled by algorithms, with little regard for their welfare. Murgia, who is a former immunologist, discusses the paradox of data-labelling workers unwittingly paving the way for their own obsolescence: “Even the objective of data-labelling work felt extractive: it trains AI systems, which will eventually replace the very humans doing the training.”

What is data colonisation in Africa?

“To build AI, Silicon Valley’s most illustrious companies are fighting over the limited talent of computer scientists in their backyard, paying hundreds of thousands of dollars to a newly minted Ph.D. But to train and deploy them using real-world data, these same companies have turned to the likes of Sama, and their veritable armies of low-wage workers with basic digital literacy, but no stable employment.”

She focuses on the individuals behind the scenes—those whose labour powers these AI systems. She takes us into the lives of data annotators like Ian from Nairobi, who, despite contributing to complex AI models (like those used in self-driving cars), face a precarious future with minimal job security and recognition. Their work, vital yet undervalued, showcases the new age of digital labour, mirroring the exploitative labour practices of earlier industrial revolutions.

This is juxtaposed with the plight of content moderators employed by Sama to work for Meta, who suffered psychological trauma from their work, challenging the narrative that such jobs offer upliftment rather than exploitation. She writes that she “later discovered that many of them had nightmares for months and years, some were on antidepressants, others had drifted away from their families, unable to bear being near their own children any longer.”


Murgia questions the moral boundaries of outsourcing such work to low-wage countries, especially in light of a lawsuit against Meta and Sama that encapsulates the human rights debates entangled with the tech industry. In one of the biggest cases of its kind, a group of nearly 200 petitioners sued both Sama and its client Meta for alleged human rights violations and wrongful termination of their contracts.

These workers have agency, and the fact that they are fighting against this level of exploitation suggests that there is something inherently wrong with the AI work model. Daniel is among those suing the company, telling Murgia: “When you are poor and hungry […] you basically don’t have a choice, and you don’t have a voice if you are exploited. The only thing they did was to give people something to eat. That’s it.”

“‘All revolutions are built on the backs of slaves. So if AI is the next industrial revolution, then those who are working in AI training and moderation, they are the slaves for this revolution.’”

Mercy Mutemi, lawyer quoted in ‘Code Dependent’

Kenyan lawyer Mercy Mutemi, who is representing Daniel, argues that this is an extension of existing labour inequalities that have long played out in traditional outsourcing businesses such as fashion and IT. Yet people see this work as unique because of what she calls the ‘illusion of AI’. For Mutemi, the only way to fix the pay structure – and to know whether data workers are being fairly remunerated at a global scale – is to view them as part of the process, as part of the AI industry, and to benchmark their pay against workers doing similar jobs inside Western companies.

Murgia’s analysis extends to how these practices affect broader societal structures. The reliance on AI systems, described as “black boxes,” often perpetuates existing societal biases rather than eliminating them, thereby reinforcing inequalities.

The troubling gendered slant of deepfakes

The term ‘deepfake’ is scrutinised for its implications in creating hyper-realistic images and videos that blur the line between reality and fabrication. The technology’s potential for misuse is most starkly demonstrated by its application in generating non-consensual pornography, predominantly targeting women. The distressing statistic from deepfake detection company Sensity AI that “roughly 95 per cent of online deepfake videos were non-consensual pornography” underlines the severity and gender bias inherent in this misuse of the technology.

The legal landscape, or lack thereof, regarding deepfakes is another critical point Murgia addresses. She notes the slow pace at which laws are adapting to the challenges posed by AI, with only a few countries taking steps to criminalise the non-consensual distribution of AI-generated intimate imagery. This lack of regulation leaves many victims without recourse, further exacerbated by the inconsistent enforcement of platform policies against such content.

New legislation in the European Union that came into effect in 2023 obliges social platforms to demonstrate how people can request takedowns of illegal material and to action those requests – but this doesn’t work if the material in question, such as deepfakes, isn’t illegal in the first place. In an anonymous poll posted to one such channel and seen by Sensity AI, 63 per cent of users said they used photos of ‘familiar girls, whom I know in real life’. I recently wrote about the UK’s Ministry of Justice announcing that creating sexually explicit deepfake images will become a criminal offence in England and Wales under a new law.

In the first three quarters of 2023 alone, 143,733 new deepfake porn videos were uploaded online – more than in all previous years combined.

“The telltale mark of data ‘colonialism’, the exploitation of vulnerable communities by powerful tech companies, is in how the impacts of an algorithmic system are distributed. Advantages conferred by the technology, because of its statistical nature, are often enjoyed by the majority – whether through race, geography or sex.”

AI’s racial and socio-economic biases: the danger of facial recognition

Murgia explores the broader societal implications of AI through examples like the misuse of facial recognition technology, which often misidentifies women and people of colour. This misidentification is not merely a technical failure; it reflects deeper systemic biases that are perpetuated and magnified by AI technologies.

She also reveals, through the lens of the UK’s use of the technology, how facial recognition alters human behaviour in public spaces. As cited, “facial recognition changes how people express themselves in public – from displaying their political and religious affiliations, to what they wear and how they act.” This technology, she argues, enforces a kind of visibility that strips individuals, particularly vulnerable groups like activists and journalists, of their anonymity, pushing them “into the shadows, or locked away in our homes.”

Emily’s experience in Stratford, East London, an area where the technology has been deployed, offers a tangible example of how communities feel the encroaching gaze of surveillance. Murgia describes a scene at the Stratford Centre, where the Dazzle Walks co-founder reflected on what “certain types of people become when these spaces are being surveilled by facial-recognition cameras.”

In China, the situation escalates to an even more dystopian level, with the government’s extensive use of this technology for control and repression, especially against protesters and the Uyghur community. This severe application “forms part of an extensive technological system used to restrict the Muslim Uyghur community in the province of Xinjiang – a portrait of a province caged by tech-enabled authoritarianism.”


The implications are global, too, as the example of Uganda shows, where Chinese companies are exporting surveillance technology. Murgia captures the anxieties of Dorothy Mukasa, a Ugandan activist, who views the $126m surveillance deal with Huawei as a form of neocolonialism, fearing that “black, Ugandan faces would be used to further improve Chinese surveillance systems.”

Meanwhile, the story of Masood from Hyderabad personalises the impact of surveillance. After he was wrongly targeted by local police using such technologies in Delhi, Masood turned to civic activism. Despite retreating from public religious and political activities, he sued the government over privacy violations, showing the personal and societal repercussions of unchecked surveillance.

When systems of power work against you

In the chapter called “Your Safety Net”, Murgia turns to Salta in Argentina, where, alongside the affluent descendants of European colonists, marginalised indigenous communities live amid poverty and serious sexual violence. Against this backdrop, she examines a chilling application of AI in predicting teenage pregnancies among the poor, primarily targeting young indigenous girls.

One of the most striking criticisms presented in the chapter is the algorithm’s focus on these female teenagers, which Murgia argues reinforces harmful gender stereotypes and overlooks the broader societal issues contributing to the problem, such as sexual violence and lack of education.

This initiative, though seemingly grounded in a desire to address social issues, exemplifies what the UN’s Special Rapporteur criticises as a system designed to “predict, identify, surveil, detect, target and punish” the impoverished rather than genuinely uplift them. Many of the women and girls have been victims of a decades-long racist colonial practice known as ‘el chineo’: gang rape, usually by white men, of indigenous girls and young women.


Murgia reveals that the algorithm’s approach is woefully simplistic and lacks a human-centred perspective, leading to interventions that are not only ineffective but also potentially harmful: “The data collection process treated the young women as passive.”

When the AI system was disclosed, the public’s reaction was one of alarm and protest, particularly as it coincided with the national campaign for legal abortion in Argentina. Critics, including tech and human rights activist Paz Peña, lambasted the system for adopting a deterministic view of poor women’s futures, thus undermining their agency and contributing to a fatalistic narrative about their lives.

Murgia also explores the technical deficiencies of the AI system, noting how researchers found significant flaws in the model’s design that compromised its effectiveness. These issues reiterate the risks of relying too heavily on AI for social governance, especially when the underlying data and model construction are flawed.

Minority Report: criminalising vulnerable teens of colour

‘Code Dependent’ author Madhumita Murgia talks about Philip K. Dick’s ‘Minority Report’ being used as a blueprint in real life

At the Jaipur Literature Festival this year, Murgia discussed the disturbing instances of a real-life “Minority Report” system being used in the Netherlands. She exposes the unsettling impacts of predictive policing and data-driven surveillance systems through the harrowing story of Damien and his family.

“Every cry for help was an admission of poor parenting, causing the system to plant a red flag, a scarlet letter, by their names. The data was shared amongst authorities – between police, youth workers and schools – and held against them for several years. It felt like there was no way to scrub clean of an algorithmic stain. Mothers like Diana lost any purchase they’d had in their lives.”

Damien and his brother Nafayo were caught in an incessant loop of surveillance and criminal profiling, which underscores a critical ethical crisis. Murgia describes how the brothers were unfairly tagged and monitored as potential criminals based solely on predictive data. The AI systems, as Murgia notes, were not just fallible but demonstrably biased, echoing broader societal prejudices and amplifying them through official actions and records: “Each incident was logged in the boys’ records as a ‘police contact’, data that would become a part of their profiles for years to come”.

Predictive policing is described in ‘Code Dependent’

She introduces the idea of ‘necropolitics‘, a term coined by philosopher Achille Mbembe, to explain how political power can determine the vulnerability of citizens, particularly how they are perceived and treated in society. This is seen through the targeted families caught in an algorithm-driven net that views them through a lens of suspicion and potential criminality.


The constant police checks, the unannounced visits by social workers, and the inability to remove themselves from prejudicial lists depict a disturbing scenario where data not only misguides justice but perpetuates cycles of disadvantage and discrimination. “It felt like there was no way to scrub clean of an algorithmic stain,” Murgia observes, underlining the inescapability and the branding effect of such surveillance.

Ethical considerations and human rights

In essence, Murgia’s “Code Dependent” serves as a critique of the so-called “data colonialism” in which state and corporate powers use technology to exert control over vulnerable populations. Her discussion aligns with broader concerns about the ethical use of AI, homing in on the need for technology that genuinely understands and integrates the human experience rather than merely attempting to predict it. In many cases, these AI systems enact and enforce systemic racial and socio-economic biases under the guise of efficiency and public safety.

It forces us to reconsider who benefits from AI and at whose expense. The book is a call to action: a plea for a more equitable tech industry that recognises and compensates all contributors fairly. Murgia both criticises the industry and invites dialogue on how we might reshape the future of technology so that it serves humanity as fully as it profits from it. “Code Dependent” shows the urgency of addressing these ethical dilemmas to prevent a new form of digital and data-driven imperialism.


