The use of algorithms and Big Data Analytics is becoming a matter of course in more and more application areas. Not least the AMS model from Austria [1], which will be used to determine the need for support measures for unemployed people seeking a job, makes clear that not only the opportunities and challenges of this technology should be discussed, but that concepts for the regulation of algorithms must be established. This applies in particular when algorithms and Big Data Analytics are used to assess people in social contexts, as in the given example. At this point well-known key issues, such as data protection and data security, double subjectivity, and the problem of false positives and false negatives, are relevant and useful. How the desire for efficiency on the one hand can be reconciled with the danger of stigmatisation and discrimination by such systems on the other is debated at length in Germany. One proposed solution is the demand for the establishment and implementation of a so-called “Algorithm TÜV” or “TÜV for Artificial Intelligence”. The acronym TÜV (Technischer Überwachungsverein) stands for a technical inspection organisation that performs safety controls required by state laws or regulations. This demand, raised for example by the Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband), usually refers to algorithms in car software and to automated decision systems, such as scoring algorithms [2; 3]. In this way, societal principles and standards, such as data protection compliance, the prevention of discriminatory effects, and, last but not least, transparency could be ensured.
In the proposed contribution I will discuss the above solution using the MAEWIN project as an example. The project is part of the Digital Society research program funded by the Ministry of Culture and Science of the German State of North Rhine-Westphalia and aims, among other things, to develop a prototype of a decision support system for use in Social Work. Because the effects of this technology in the context of welfare state institutions, and especially in the field of social work, have hardly been researched, the “Algorithm TÜV” does not, for the time being, appear to be a suitable solution to the problem of possibly transferring social disparities into the digital world.
References
[1] Holl, J.; Kernbeiß, G.; Wagner-Pinter, M. (2018): Das AMS-Arbeitsmarktchancen-Modell. Dokumentation zur Methode, Konzeptunterlage vom Oktober 2018, Synthesis Forschung, www.forschungsnetzwerk.at/downloadpub/arbeitsmarktchancen_methode_%20dokumentation.pdf, last visited 17/12/2018.
[2] Verbraucherzentrale Bundesverband (2017): Algorithmen-TÜV für Autosoftware. Pressemitteilung, 20/06/2017, https://www.vzbv.de/pressemitteilung/algorithmen-tuev-fuer-autosoftware, last visited 20/12/2018.
[3] SVRV (2018): Verbrauchergerechtes Scoring. Gutachten des Sachverständigenrats für Verbraucherfragen. Berlin: Sachverständigenrat für Verbraucherfragen, http://www.svr-verbraucherfragen.de/dokumente/verbrauchergerechtes-scoring/, last visited 20/12/2018.