Cash transfer and social assistance programs have become increasingly common worldwide. Governments are using algorithms to track indicators that identify and rank people living in poverty so that assistance can be targeted to them. However, a report by Human Rights Watch has found that such algorithms are often flawed and can exclude people in need from vital aid programs.
Jordan’s Takaful program, launched in 2019, is an example of a system that uses an algorithm to assess eligibility for assistance. Takaful is responsible for allocating over a billion dollars provided by the World Bank and currently enrolls 220,000 Jordanian families. The program uses 57 socio-economic indicators to rank households by their eligibility for aid. Human Rights Watch found, however, that the Takaful algorithm is biased and inaccurate because it fails to account for the realities faced by people living below the poverty line.
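To make the mechanism concrete: systems like this are typically a form of proxy means testing, in which observable indicators are weighted, summed into a score, and used to rank households against one another. The sketch below is a hypothetical illustration only; Takaful's actual model, its 57 indicators, and its weights are not public, and every name and number here is invented.

```python
# Hypothetical sketch of a proxy-means-test ranking. The real Takaful
# model, its 57 indicators, and its weights are not public; the
# indicator names and weights below are invented for illustration.
from dataclasses import dataclass

WEIGHTS = {
    "household_size": -0.4,    # larger households rank as poorer
    "monthly_income_jd": 0.8,  # higher income lowers priority
    "owns_car": 5.0,           # asset ownership counts against need
    "electricity_kwh": 0.3,    # consumption as a proxy for wealth
}

@dataclass
class Household:
    id: str
    indicators: dict

def welfare_score(h: Household) -> float:
    """Lower score = ranked as poorer = higher priority for aid."""
    return sum(w * h.indicators.get(k, 0.0) for k, w in WEIGHTS.items())

def rank_applicants(households: list[Household]) -> list[Household]:
    # Households compete for a fixed budget: only the lowest-scoring
    # families are enrolled; the rest are cut off.
    return sorted(households, key=welfare_score)

applicants = [
    Household("A", {"household_size": 6, "monthly_income_jd": 150,
                    "owns_car": 0, "electricity_kwh": 90}),
    Household("B", {"household_size": 6, "monthly_income_jd": 150,
                    "owns_car": 1, "electricity_kwh": 90}),
]
for h in rank_applicants(applicants):
    print(h.id, round(welfare_score(h), 2))
```

In a ranking like this, a single binary indicator such as car ownership can push an otherwise identical household down the list and out of the program, which is exactly the kind of crude flattening the report describes.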
For instance, asset ownership and electricity consumption are indicators that, although taken into account by the algorithm, do not capture poverty in all its complexity. Other factors, such as recently inherited property or expenses that exceed income, are ignored entirely. Mariam, a member of one of the families Human Rights Watch interviewed, was dropped from the Takaful program because her household's car ownership counted against her eligibility.
Jordan is not alone in delegating decisions that shape ordinary people's lives to algorithms. Rotterdam's algorithm for ranking welfare recipients was also found to be biased, as was crime-prediction software used by police departments in the United States. An algorithm used by the Australian government wrongly accused about 400,000 welfare recipients of having been overpaid.
Human Rights Watch argues that such algorithms strip away the nuance needed to identify those most deserving of assistance. Toh, a lead researcher at Human Rights Watch, explains that these rigid measurements and indicators produce a crude ranking of poverty that pits households against one another for aid, generating social tension.
While the National Aid Fund offers a complaint and appeal process, some of the algorithm's problems stem from inaccuracies in how the data is compiled. Data collected through surveys and mass registration exercises is inconsistent and does not necessarily reflect social realities. In addition, the Takaful program rejects applications in which an individual's reported expenses exceed their income by 20%. To work around this, applicants are effectively forced to misreport their income or expense figures arbitrarily.
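To see why this rule pressures applicants into misreporting, here is a minimal sketch of how such a hard cutoff might be implemented, assuming the threshold is applied directly to self-reported figures; the actual implementation is not public, and this reconstruction is an assumption.

```python
def passes_expense_check(income: float, expenses: float,
                         threshold: float = 0.20) -> bool:
    """Hypothetical reconstruction of the 20% rule: reject any
    application whose reported expenses exceed reported income
    by more than the threshold."""
    return expenses <= income * (1 + threshold)

# A family earning 200 JD but spending 260 JD, e.g. because it borrows
# to cover food and rent, is rejected outright:
print(passes_expense_check(200, 260))  # False: 260 > 200 * 1.2 = 240
# Underreporting expenses as 240 JD flips the outcome, which is the
# arbitrary misreporting the article describes:
print(passes_expense_check(200, 240))  # True: 240 <= 240
```

A hard boolean cutoff like this treats the very condition that signals hardship, spending more than one earns, as grounds for rejection rather than a flag for closer review.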
Despite these flaws, the World Bank continues to provide loans for similar projects in other countries, including Egypt, Iraq, Lebanon, Morocco, Yemen, and Tunisia. Amnesty International has raised concerns that using algorithms to determine poverty deprives people of the support they need.
The World Bank says the purpose of these algorithms is to direct aid to those who need it most within a constrained fiscal space. However, the lack of nuance in their deployment could mean that millions of people around the world continue to miss out on vital services. A more comprehensive approach is needed to ensure that algorithmic systems help those in need live a dignified life.