
Many people see an algorithm as the devil

Posted: Tue Dec 17, 2024 6:07 am
by arzina566
Solving social issues: policymakers are 'data woke'
As much as a self-driving car appeals to the imagination, I honestly find artificial intelligence in consumer products less interesting. I prefer to think about the social opportunities of data-driven work. The government sees them too. More and more policymakers are 'data woke'. They believe in the value of data for policy development, service provision, the implementation of government tasks, supervision and enforcement, and business operations. And perhaps even more importantly: for social innovation and solving tough social issues. Think of the energy transition and the fight against subversive crime.

This leads to initiatives such as the Data Agenda Government, intended to “promote the sharing, combining and analysis of data in collaboration”. A quick look at the agenda's five pillars shows that the government realizes there are preconditions for this. Ethical considerations and 'public values' receive attention, alongside improving the quality of data. The government also invests in people, organizational development and cultural change around data-driven work.


Despite this awareness, data-driven work does not always run smoothly. The use of algorithms in particular regularly makes the news, and usually not in a positive way. Various (government) organisations have been widely reported to have failed in detecting fraud. They linked data and predictive models to 'risk profiles' and drew incorrect conclusions about possible abuses.

Because of these kinds of reports, algorithms and data-driven work still carry a whiff of sulphur for many people. Whether that is justified is the question. After all, it is not the developments around artificial intelligence and digitalisation themselves that are dangerous by definition. The poison lies in how organisations deal with them. Ultimately, they are the driving force behind undesirable situations, by acting clumsily or sometimes downright unethically. That often turns out, as in the poignant example of the Tax Administration and the childcare allowance affair, to be the result of decisions made by identifiable people.

Fiction reflects reality (spoiler alert!)
After reading Ewoud Kieft's brilliant, dystopian novel The Imperfect (2020), you get the feeling that the writer has been sitting like a fly on the wall in just such a boardroom. He shows that it is indeed people who derail, and how naturally that can happen.


In a society led by 'the Conglomerate', people are guided by an algorithm: Gena. This also applies to the main character, Casimir, better known as 'Cas'. From puberty onwards, it is Gena who nudges him where he needs it or curbs his destructive tendencies. When Cas turns out to be a revolutionary and calls for resistance, Gena has to answer to the Conglomerate's 'Supervisory Board'. And then human powerlessness, and human power, become apparent.

For where clear 'parameters' guide the algorithm's behaviour, the supervisors resolutely call those principles into question. They immediately reach for the dark possibilities of the data. Cas' listeners must be assessed through analyses of their conversations. That way, the 'doubters' can be approached in a targeted manner and the damage limited. One of the supervisors also wants to know from Gena whether concrete names are already known of people who 'belong to the risk group'.