
When the Algorithm Gets It Wrong: Why Empathy Must Power the Future of A.I 

  • Anna Simmonds
  • Jul 25
  • 5 min read

The action figure trend… remember that? A couple of months ago, Cephas Williams shared a post on LinkedIn about this trend that seriously caught our attention. The post showed a joyful photo of Cephas and his son, Zion, in a moment full of warmth, connection and pride.


Cephas wanted to see the moment brought to life in a new way, so he asked ChatGPT to turn them into Pixar-style cartoons, using a prompt he had seen circulating online. But what came back didn’t reflect them at all.


Instead, the image replaced him - a proud Black father - with a white man.


Still optimistic, Cephas tried again. This time, he used a prompt to turn them into action figures, imagining something playful his son might enjoy. But what came back was even more disturbing.


The image showed Cephas and his four‑year‑old son holding assault rifles.


Cephas put it plainly: “My Black identity was either removed or replaced with violence - this wasn’t just a glitch. It’s a reflection of how invisible and misrepresented we can be in systems not designed with us in mind.”


What started as a light, creative exercise became something far more revealing, exposing what happens when systems are built without awareness, empathy, or care. This isn’t just about technology getting it wrong; it’s about what (and who) gets overlooked when the people behind the systems don’t see or represent the full picture of humanity - a reflection of the systemic issues that, if left unaddressed, get amplified by A.I rather than solved by it.


A heartwarming moment captured between a father, Cephas Williams, and his smiling son, Zion, radiating love and happiness in a joyful embrace. Photo by Cephas Williams.

More Than a Glitch: A Mirror of Deeper Bias


Moments like this force us to confront hard questions about visibility, representation and the biases that exist in people - and are therefore baked into the systems increasingly shaping our lives.


It’s not just a tech failure. It’s a human one.


A.I models don’t emerge in isolation. They are trained on vast pools of data collected from the internet - data shaped by the same stereotypes, inequalities and exclusions that exist in the world around us. Without intentional care, these systems don’t just reflect bias; they reproduce it and, most worryingly, help it scale.


We often hear A.I described in superlatives - smarter, faster, more efficient - but we rarely stop to ask: “Smarter for whom? Faster toward what end? Efficient at what cost?”


Even leaders at the very heart of A.I’s development have sounded similar alarms. In 2024, Geoffrey Hinton - often called the “godfather of A.I” - accepted the Nobel Prize in Physics for his pioneering work on neural networks, and used his speech not to celebrate progress alone, but to warn humanity about the dangers ahead. Having left Google so he could speak freely on the topic, Hinton cautioned that the risks posed by A.I are far from science fiction, even if many still treat them that way.


The Consequences Are Already Here


We’re already seeing the real-world consequences. Joy Buolamwini, founder of the Algorithmic Justice League, has spent years uncovering the harm. Through her research at the MIT Media Lab and in the documentary Coded Bias, she showed how facial recognition technologies consistently struggle to detect darker skin tones, especially those of Black women. This is what happens when we build tools without thinking about who they truly serve and who they leave behind.


Her research project, Gender Shades, found that commercial facial-recognition systems had error rates of up to 34.7% for dark-skinned women, compared with 0.8% for light-skinned men. And the pattern repeats across the industry: facial recognition tools misidentify Black faces at far higher rates than white ones, job-screening algorithms downgrade CVs with traditionally female names, and predictive policing software disproportionately targets minority communities. The list goes on.


Yet these systems are being used everywhere now - in classrooms, courtrooms, and beyond - often without asking the people they affect what they think.


A speaker presents insights on facial recognition technology, with an image alluding to A.I and ethical considerations on a large screen behind her. Credit: John Werner.

Is There Time for Empathy?


Empathy is a starting point, a foundational skill that helps developers and decision-makers ask better questions: “Who might this harm? What perspectives are missing? Whose stories aren’t being represented?” Empathy means having a genuine understanding and awareness of other people’s lived experiences, perspectives and cultural contexts. It’s what encourages us to dig deeper, to slow down and consider: what might we be missing, and who hasn’t yet got a seat at this table?


 Empathy creates the chance for better action: changes in structures, processes, and accountability. 


The lack of empathy in A.I development isn’t simply an ethical oversight; it’s a fundamental flaw in design. Without empathy, we end up creating systems that serve some people well while causing harm to others, and when those harmed are voices already pushed to the margins, the consequences only grow deeper and more damaging.


That’s why diversity matters, not just as a moral imperative but as a practical necessity. Studies consistently show that diverse teams outperform homogeneous ones, especially when tackling complex challenges. A widely cited Harvard Business Review study found that companies with diverse leadership are 70% more likely to report capturing a new market. Similarly, McKinsey’s Diversity Wins report found that organisations in the top quartile for ethnic diversity on executive teams are 36% more likely to outperform their peers financially.


Diversity brings a wider range of perspectives, and more perspectives lead to richer, more complete solutions. Empathy ensures those perspectives are heard, valued and embedded into the systems we build. When developers are encouraged to seek out unfamiliar experiences and test with a truly representative range of users, the result is tech that works better for everyone.


An image of an empty round table with a question mark above one chair, symbolising the question "Who doesn't have a seat at the table?".

What Change Could Look Like


To build A.I that truly serves humanity, we need more than innovation. We need empathy woven into every layer. That means: 


✨ Building diverse teams, where everyone is heard.

✨ Embedding ethics into STEM education.

✨ Testing and building systems with a truly representative range of users.

✨ Treating historically marginalised communities not just as users, but as co‑creators of the future.


People like Cephas Williams are already showing us what this looks like - using projects like The Black Network and Letter to Zion to reimagine tech through the lens of care, equity and accountability.


Behind every algorithm is a choice. And empathy helps us make better ones. If we want A.I to serve all of us, empathy can’t be an afterthought. It must be the foundation.




 
 