SmartyGrants: a case study in keeping AI tools human

Posted on 24 Mar 2026

By Matthew Schulz, journalist, Institute of Grants Management

SmartyGrants headquarters at Our Community House, with the exterior artwork "Hope", reflecting a desire to do good at the tech-focused social enterprise.

SmartyGrants is ensuring its artificial intelligence tools are fair and useful by focusing on how they affect people, not just how well they work.

The project is being led by Our Community’s director of data, AI and analytics, Dr Paola Oliva-Altamirano, in collaboration with Dr Lorenn Ruster, a leading AI ethics researcher who developed the “dignity lens” framework.

Speaking in a webinar hosted by the International Humanistic Management Association (IHMA), a global network focused on human-centred management and ethical leadership, the pair explored the topic “Leading with dignity in the age of AI”.

Dr Paola Oliva-Altamirano

Ethical ‘muscle’ at the heart of good AI practice

The conversation examined how ethical frameworks can be applied in day-to-day practice, using SmartyGrants as a case study.

Oliva-Altamirano said the work entailed “building ethical muscle” within the SmartyGrants team. The effort began in 2021 and led to the publication of an academic whitepaper in 2022. Ethics has since become part of the team’s day-to-day work.

As the team develops artificial intelligence tools, it conducts human-centred ethical reviews at every stage.

“We see how organisations struggle for funding, and we see how decisions are made,” Oliva-Altamirano said.

“The way we design tools can influence real-world outcomes for communities.”

The reviews require teams to document purpose, intended users, potential failures and human impacts.


“At the beginning, you go through the framework carefully and document everything. After a few cycles, the principles are baked into how you think. It becomes the way you work.”

“When the team questions purpose and impact, we usually end up with clearer communication and more inclusive design,” she said. “It’s not just about avoiding harm. It’s about building better tools.”

She warned that time constraints were a factor in ethical failures by other tech providers.

“In tech, everyone is on the clock,” Oliva-Altamirano said. “Ethics can become superficial, not because people don’t care, but because they run out of time.”

“Most teams are focused on getting the infrastructure right, making sure it scales, fixing bugs,” she said. “Very few have space to step back and ask, ‘What is the impact of this tool on the people who will use it?’”

A key change at SmartyGrants has been giving staff permission to delay deployment if necessary.

“You need leadership that treats dignity as a product requirement, not an optional extra,” Oliva-Altamirano said.

“Many harms don’t come from the original intent. They come from how tools are used once they are out in the world.”

Dr Lorenn Ruster

The “dignity lens” developed through Ruster’s research provides a structured way to assess potentially harmful effects.

“Protecting people from harm is essential,” Ruster told the webinar. “But it’s only half the job. We also need to ask what becomes possible if we take our responsibility to promote dignity seriously.

“The fact that dignity is complex is not a weakness. It’s a strength.

“We need to keep asking not just ‘Can we build this?’ but ‘What does dignity require of us here?’”

How does a dignity lens work?

Dr Ruster’s dignity lens framework guides the design and governance of AI systems.

It goes beyond assessing fairness and risk to examine effects on people’s autonomy, safety, recognition and working lives, and on communities and the environment.

Dignity is treated not as an abstract concept but as something assessed through practical questions at each stage of development, helping teams consider not just whether a system works, but how it shapes human experience.
