Volume 42 (2021): Issue 2 (July 2021)
Journal Details
License
Format
Journal
eISSN
2001-5119
First Published
01 Mar 2013
Publication timeframe
2 times per year
Languages
English
Access type: Open Access

Fostering the data welfare state: A Nordic perspective on datafication

Published Online: 05 Dec 2021
Volume & Issue: Volume 42 (2021) - Issue 2 (July 2021)
Page range: 207 - 223
Abstract

Digital tools facilitating everything from health to education have been introduced at a rapid pace to replace physical meetings and allow for social distancing measures as the Covid-19 pandemic has sped up the drive to large-scale digitalisation. This rapid digitalisation enhances the already ongoing process of datafication, namely turning ever-increasing aspects of our identities, practices, and societal structures into data. Through an analysis of empirical examples of datafication in three important areas of the welfare state – employment services, public service media, and the corrections sector – we draw attention to some of the inherent problems of datafication in the Nordic welfare states. The analysis throws critical light on automated decision-making processes and illustrates how the ideology of dataism has become increasingly entangled with welfare provision. We end the article with a call to develop specific measures and policies to enable the development of the data welfare state, with media and communication scholars playing a crucial role.

Keywords

Introduction

The Covid-19 pandemic has accelerated the drive to large-scale digitalisation, which has been underway for decades. Throughout the global emergency, digital tools for everything from health to education have been rapidly introduced, replacing physical meetings and supporting social distancing measures. As a consequence of this “digitalisation on steroids”, an increasing number of aspects of our identities, practices, and societal structures have been transformed into data. From 2018 to 2021, we led a Nordic exploratory research network on “Datafication, Data Inequalities and Data Justice”, under NordForsk NOS-HS (Joint Committee for Nordic Research Councils for the Humanities and the Social Sciences), which aimed to outline the specifics of datafication in the Nordic countries. In this article, we ask whether we are witnessing the emergence of a data welfare state in the Nordic countries, and if so, how it might be characterised.

Through the analysis of empirical examples in three important areas of the welfare state – namely automated decision-making within employment services, data-driven methods within public service media, and the digitalisation of the corrections sector – we aim to draw attention to some of the problems inherent in increased datafication in the Nordic welfare states, including the blurring of the public and private sectors, lack of transparency, lack of diversity, and bias in data. We conclude with a call to develop specific measures and policies to enable the development of the data welfare state.

Any quotations and excerpts cited in this article not originally in English have been translated by us.

A Nordic version of datafication?

In the digitalised world, vast amounts of data are gathered automatically from our everyday activities, including shopping, travel, media consumption, and engagement with social media. We are living in an age of “infoglut” and “datafication” (Andrejevic, 2013; van Dijck, 2014), in which our feelings, identities, and affiliations are tracked and analysed. Datafication refers to the process of using data – mainly from digital environments – to understand sociality and social behaviour. Our increasing ability to generate and make sense of ever-larger quantities of data has also been described as the “industrial revolution of data” (Milan & van der Velden, 2016).

While the digital refers to numbers, and data that can be communicated via numbers, datafication describes the process whereby numbers are turned into datasets that, collectively, provide information about behaviour. van Dijck (2014: 198) describes datafication as a “transformation of social action into online quantified data”; Andrejevic (2020) frames it as an automation of subjectivity and knowledge; and Couldry and Mejias (2019: 1) call it “a quantification of the social”. While many scholars laud datafication as a new field of social science, authors like van Dijck, Andrejevic, and Couldry and Mejias offer more critical views. Specifically, they underline that datafication is supported by the ideology that data can provide more accurate and nuanced information about human behaviour, and therefore can define and predict behaviours. van Dijck (2014) calls this ideological assumption dataism. Central to dataism is the belief that data is neutral, quantifications are objective, and there is “a self-evident relationship between data and people, subsequently interpreting aggregated data to predict individual behaviour” (van Dijck, 2014: 199). Metcalfe and Dencik (2019) note that this (ideological) understanding of a correlation between data and behaviour stokes the belief that data can effectively predict future activities, consumption, health, and risk, and thereby defines prediction as the primary goal of data collection.

In this context, the Nordic welfare states are characterised by a high degree of public trust in institutions, underscoring citizens’ general acceptance of a very high level of data collection. In the Nordic countries, a large amount of data is already available on all citizens – from newborns to seniors. Specifically, information on health, education, employment, tax, crime, and other matters is linked to individuals via their CPR number (Denmark), personal number (Sweden), and social security number (Finland), registering and documenting their engagement with both public and private sectors (Ustek-Spilda & Alastalo, 2020). While the data are not cross-referenced – as it is (currently) illegal for one institution (e.g., a health authority) to share data with another (e.g., the police) – the vast and growing mass of data holds promise for large-scale digitalisation.

Currently, the Nordic countries are eager to use digitalisation to streamline public administration and the provision of welfare. This involves not only digitising physical files, but also digitally automating decision-making. In the following, we illustrate how the ideology of dataism is appropriated in different datafication initiatives related to the Nordic welfare states. As Alfter (2020) notes, Denmark aims to become a global leader in digitalisation; to achieve this end, it intends to incorporate digitalisation into all new legislation. In 2019, the Danish Agency for Digitalisation [Digitaliseringsstyrelsen], within the Ministry of Finance, launched a national strategy for artificial intelligence (AI) [National strategi for kunstig intelligens] with the following aims: “The public sector will use artificial intelligence to offer world-class service [and] artificial intelligence [will be used] to support a faster and more efficient handling of cases” (The Danish Government, 2019: 10). The Danish strategy, aiming to “[lead] in Europe in the implementation of data and artificial intelligence to improve and target the public service”, is built on optimism and hope (The Danish Government, 2019: 10). The mentioned benefits of AI include: “more personal treatment” of citizens; better support for citizens’ cases; higher quality administration of resources; faster and more accurate diagnosing; more efficient and effective administration; and effective systems to fight tax fraud and social benefits fraud (The Danish Government, 2019: 11). In other words, the strategy aims to improve the quality and efficiency (i.e., cost) of public services.

Similarly, in 2017, Finland launched a national AI strategy, the AuroraAI programme, to improve public services and competitiveness. AuroraAI aims to combine all public organisations under one network, facilitating interaction and data exchange between services and platforms. According to the national AI strategy, Finland is in an excellent position to produce “the world's best services in the age of artificial intelligence” (Ministry of Economic Affairs and Employment, Finland, 2017: 14). The goals of AuroraAI are similar to the ones described in the Danish strategy: the programme will provide “smoothly running daily life” as it automatically interconnects different services, breaks down silos in the service sector, and promotes cost-efficiency. The vision entails that AI operates smoothly and efficiently across the whole public service sector (Ministry of Finance, Finland, 2021).

Like Denmark and Finland, Sweden has set an ambitious agenda to become world-leading in AI development and use (Government Offices of Sweden, 2018). The National Approach to Artificial Intelligence, published in 2018, proclaims in the introduction that “Sweden aims to be the world leader in harnessing the opportunities offered by digital transformation. By international standards, Sweden is in the vanguard” (Government Offices of Sweden, 2018: 4). At the same time, living up to these expectations and maintaining this leading position will require considerable resources and effort. The Swedish approach includes the idea that “there is a great potential in the public sector to develop activities and public services in the citizens’ interest with the help of AI. It is therefore in Sweden's interest to stimulate innovative applications and use of AI in society in various ways” (Government Offices of Sweden, 2018: 8).

For all three countries, these stated aims are quite telling of the ways in which dataism is frequently entangled with the ideals of efficiency and improved decision-making. One possible basis for the agencies’ optimism towards the potential for AI to improve public services is their (naïve) understanding of algorithms as neutral and objective, mirroring dataism's ideology of data as neutral and objective (van Dijck, 2014). As illustratively phrased in the Danish strategy, algorithms are believed to safeguard justice: “The algorithms will secure equal treatment by being objective, unbiased and independent from personal conditions” (The Danish Government, 2019: 7).

However, technological designs, including algorithms, are not neutral. Critical design theory (e.g., Drucker, 2011; Kannabiran & Petersen, 2010; Sun & Hart-Davidson, 2014), critical data studies (e.g., Andreassen, 2020; Eubanks, 2017; Iliadis & Russo, 2016; Noble, 2018; van Dijck, 2014; van Dijck et al., 2018), and data justice scholars (e.g., Andreassen, 2021; Dencik et al., 2018; Metcalfe & Dencik, 2019) argue that design and programming are always intertwined with values and ideologies. Context, norms, and values not only influence designs, but also the affordances and algorithms that go into – and constitute – those designs (see Buolamwini & Gebru, 2018; Costanza-Chock, 2020). In their analysis of design biases, Kofoed-Hansen and Søndergaard (2017) describe how a designer's wish to improve existing conditions is always influenced by the ideological trends of their contemporaries, even when the designer is not conscious of these trends. Having outlined the specific imaginaries connected with datafication in the Nordic countries, we now move to consider the concept of the Nordic data welfare state.

From the media welfare state to the data welfare state?

The extent to which scholars should contextualise processes of digitalisation and datafication (which are often described in universal terms, following the logic of large, global corporations) remains an unanswered question. Nonetheless, in this article, we enquire into the specificities of the Nordic welfare states, highlighting the legal frameworks and historical trajectories of institutional trust that must be considered in any exploration of datafication in the Nordics. In order to do so, we develop the notion of the data welfare state, drawing on media scholars Syvertsen and colleagues’ (2014) conception of the media welfare state. The media welfare state refers to a special model of media systems in the Nordic countries, with four pillars: 1) universal access to information through communication systems such as postal services, telecommunication networks, and printed and audiovisual media; 2) editorial freedom, referring to a range of measures used to safeguard editorial independence from state interference; 3) content diversity, with extensive cultural policies seeking to ensure the provision of alternative (domestic and minority) content, while diminishing the influence of global market forces; and 4) durable and consensual policy solutions. It has been discussed whether the media welfare state should be considered an ideal or a reality, and whether political and social changes have instead led to the emergence of a neoliberal media welfare state (Jakobsson et al., 2021). Furthermore, in increasingly datafied Nordic societies, it is reasonable to ask whether the pillars of the media welfare state are changing or need to be adapted. Accordingly, instead of the media welfare state, we are interested in how the contours of a data welfare state might look. Importantly, the media welfare state consolidates democracy and trust in the state.
If a data welfare state were to do the same, the four pillars outlined by Syvertsen and colleagues (2014) would require adaptation: 1) justice and non-bias in processes of datafication; 2) decommodification, that is, freedom from commercial logic; 3) data diversity acknowledging different needs of citizens and residents; and 4) transparency on the datafication process providing sustainable and meaningful information for citizens and residents (see Table 1).

Table 1 Adaptation of the four pillars of the media welfare state to the data welfare state

Media welfare state → Data welfare state
1) universal access to information → 1) justice and non-bias in processes of datafication
2) editorial freedom → 2) decommodification
3) content diversity → 3) data diversity
4) durable and consensual policy solutions → 4) transparency and sustainability

Comments: The pillars of the media welfare state were formulated by Syvertsen and colleagues (2014).

Coming from a slightly different viewpoint, with a capabilities approach in mind, Taylor (2017) has introduced three pillars of data justice that partly coincide with the four pillars above. Drawing on theorisations of data justice (Dencik et al., 2018; Heeks & Renken, 2016; Johnson, 2014), Taylor's three pillars address (in)visibility, digital (dis)engagement, and countering data-driven discrimination. Although framed differently, these point to similar issues as our data welfare state framework: visibility refers to access to representation and diversity; engagement indicates autonomy in technological choices, covering issues of decommodification and transparency; and countering discrimination is compatible with seeking non-bias. While Taylor offers her framework in a broad global setting, our approach is grounded specifically in the context of the Nordic welfare state. In what follows, we look more closely at different areas of society in three Nordic countries to discuss how the pillars of the data welfare state are emerging or are being challenged.

Data-driven welfare in the Nordic welfare states: Danish unemployment, Finnish public service media, and Swedish smart corrections

As the Nordic AI strategies above illustrate, the welfare state is considered an important site for datafication and algorithmic automated decision-making (ADM) systems. Below, we discuss three Nordic projects that illustrate how the ideology of dataism can become entangled with welfare provision. Although there are important differences between the three Nordic countries considered here, they are each considered representative of the social-democratic welfare regime, according to Esping-Andersen's (1990) seminal classification. Social-democratic welfare states are characterised by the principles of universalism and decommodified social rights, as well as the promotion of equality and high living standards for all. In the following, we draw on three examples to analyse the status of the Nordic data welfare state. Based mainly on documentary analysis, as well as background interviews, we first discuss ADM in the welfare sector, specifically employment services; second, we engage with the datafication of public service media; and third, we explore the datafication and subsequent automation of the corrections sector.

Automated decision-making in the Danish employment services

A relatively new Danish labour and unemployment law [Beskæftigelsesindsats], passed in 2019, involves an ADM system that uses data profiling to assist state social workers in their efforts to find work for unemployed citizens. Although the system is not explicitly labelled ADM or AI, but rather “a national digital tool for clarification and dialogue” (Retsinformation, 2019: §8.2), the tool nevertheless represents an attempt to digitally assess job-seekers and predict their success on the job market. The tool is based on an ADM pilot project that ran from 2016–2018 (Motzfeldt, 2019). While it is not clear whether the current tool is identical to that of the pilot project, or whether it represents a new and updated version, scholars agree that the pilot project served as the primary source for this new “ADM unemployment system” (Andersen, 2019; Motzfeldt, 2019).

The goal of the pilot project was to identify unemployed citizens at risk of long-term unemployment via a “profile clarification tool” [afklaringsværktøj]. This tool operated in two stages: In the first stage, unemployed citizens filled out a survey indicating their own evaluation of their situation – including their expectations of when they would find employment, as well as descriptions of their personal situation – where relevant to their search for work. In the second stage, ADM was used to evaluate each job-seeker based on a number of predefined categories, gathered from different institutions and public platforms (relating to, e.g., education, work experience, age, ethnicity, and welfare benefit history). These categories were defined as “objective” and “observable”, while participants’ own evaluations were considered “subjective” (The Danish Parliament, 2019: 212; Mploy, 2018: 7). The combination of data across both stages generated a score, indicating whether the job-seeker was at risk of long-term unemployment. According to this score, the job-seeker was assigned services from a local job centre.
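The two-stage logic described above can be sketched schematically. The following is a purely illustrative sketch: all function names, category weights, and the risk threshold are our hypothetical inventions, since the actual model behind the Danish tool is not publicly documented.

```python
# Purely illustrative sketch of a two-stage unemployment risk profiler.
# All weights, categories, and the threshold are hypothetical; the real
# Danish tool's internal model is not public.

def survey_score(expects_job_within_months: int, self_rated_barriers: int) -> float:
    """Stage 1: the job-seeker's own ('subjective') evaluation."""
    return 0.1 * expects_job_within_months + 0.2 * self_rated_barriers

def register_score(years_since_last_job: float, benefit_spells: int,
                   has_degree: bool) -> float:
    """Stage 2: 'objective' categories drawn from public registers."""
    score = 0.3 * years_since_last_job + 0.25 * benefit_spells
    if not has_degree:
        score += 0.5
    return score

def risk_of_long_term_unemployment(survey: dict, registers: dict,
                                   threshold: float = 1.5) -> bool:
    """Combine both stages into a single flag used to assign job-centre services."""
    total = survey_score(**survey) + register_score(**registers)
    return total >= threshold

# Example: a job-seeker with a degree and a short unemployment history,
# who expects to find work within three months.
flagged = risk_of_long_term_unemployment(
    {"expects_job_within_months": 3, "self_rated_barriers": 1},
    {"years_since_last_job": 0.5, "benefit_spells": 1, "has_degree": True},
)
```

The sketch makes one point from the text concrete: once the weights are fixed, the score (and thus the level of assistance) follows mechanically from a handful of categorical inputs, with no room for the contextual judgement a social worker would apply.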

In her book Automating Inequality, Eubanks (2017) discusses the consequences of grounding social welfare policies on data prediction, on the basis of her analyses of ADM-driven social services in the US. Echoing van Dijck's (2014) conceptualisation of dataism, Eubanks criticises the assumed relationship between large population data and individual behaviour, and she describes how algorithms aimed at predicting individual needs are designed according to social group. As a result, scores used to predict individual needs are not based on individuals and their personal histories, but rather on the data mining of (pre-defined) social groups related to ethnicity, gender, civil status, neighbourhood, and other demographic factors. In other words, individuals are not impacted by their own actions or personal situations, but by the previous actions of “members” of their “categorical belonging”. While Danish law defines social categories (e.g., age, ethnicity, citizenship status, education, etc.) as “objective” (The Danish Parliament, 2019), Eubanks demonstrates how the history of poverty and ideas of marginalised groups as less valuable citizens play into the design of algorithm-based social policies; this is especially clear in the design of the social categories used to predict individual risks or needs, and thus the level of assistance granted by the state. She argues that, rather than facilitating new and neutral social policies, “digital tools are embedded in old systems of power and privileges” (Eubanks, 2017: 178). Such embeddedness might explain the mixed results of the Danish unemployment services pilot project: Of the sixteen municipalities that participated, nine found the ADM screening system to be effective in identifying those at risk of long-term unemployment (Mploy, 2018); the remaining seven municipalities experienced the ADM system as stigmatising, and thus undesirable for facilitating dialogue with unemployed citizens (Mploy, 2018: 32).
These ambiguous results indicate a challenge in fulfilling the first pillar (justice and non-bias in processes of datafication) in the data welfare state.

All of the participating municipalities in the pilot project concluded that the ADM tool could not determine the risk of long-term unemployment independently; rather, output from the tool needed to be interpreted alongside social workers’ professional judgements. In other words, in order to maintain a transparent and sustainable data welfare state (the fourth pillar), as well as a just and non-biased data welfare state (the first pillar), the ADM tool needs to be continuously combined with professional judgements provided by “real people”. Importantly, some job centres pointed to situations in which the ADM disagreed with social workers’ evaluations (Mploy, 2018). In these cases, the social workers problematised that they could not locate the basis for the discrepancy; that is to say, they could not see the calculation behind the ADM algorithm. This points to the “black box” of ADM programming, whereby users do not understand or see the programming that feeds into a particular score or prediction. While we, as consumers, are used to ADM recommendations that do not fit our desires (e.g., suggestions from streaming services that do not capture our interest) (Motzfeldt, 2019), the same “mal matching” can have severe consequences when incorporated into social services. As consumers, we can choose to not accept ADM recommendations; but as citizens in a social system, we have fewer options to reject ADM-determined suggestions or solutions. The lack of transparency and sustainability (the fourth pillar) in the data welfare state is therefore much more critical when considering the determination of welfare provisions than it is in the provision of entertainment options.

Another reason why the ADM prediction differed from social workers’ evaluations in some instances might be that, in line with the underlying assumptions of dataism, the ADM tool equated correlations with causality (Cheney-Lippold, 2017). In the pilot project, knowledge about unemployed citizens was reduced to correlations, whereas knowledge held by social workers was based on an observed understanding of causal relationships between various social categories (Antczak & Birkholm, 2019). Accordingly, we argue that the term “artificial intelligence” should be replaced with that of “automated decision-making”. While AI and machine learning have become the assumed hallmarks for a prosperous future, we question the assumed intelligence of datafication and stress that AI is best characterised as a future imaginary.
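The correlation-as-causation problem can be illustrated with a deliberately minimal, hypothetical example. The data and the "postcode" variable below are invented for illustration: a naive group-based model flags every resident of a postcode with a high historical unemployment rate, regardless of their individual situation.

```python
# Hypothetical illustration of correlation treated as causation in
# group-based scoring. The data is invented: postcode "A" happens to
# correlate with long-term unemployment in the historical record.

historical_cases = [
    {"postcode": "A", "long_term": True},
    {"postcode": "A", "long_term": True},
    {"postcode": "A", "long_term": False},
    {"postcode": "B", "long_term": False},
    {"postcode": "B", "long_term": False},
    {"postcode": "B", "long_term": True},
]

def group_rate(postcode: str) -> float:
    """Share of past cases from this postcode that became long-term unemployed."""
    group = [c for c in historical_cases if c["postcode"] == postcode]
    return sum(c["long_term"] for c in group) / len(group)

def naive_prediction(postcode: str) -> bool:
    """Flags an individual purely on their group's historical rate."""
    return group_rate(postcode) > 0.5

# Two otherwise identical job-seekers receive different predictions
# solely because of where they live.
```

This is the mechanism the text describes: the model encodes a correlation at the group level and applies it causally to individuals, whereas a social worker reasons about the individual's actual circumstances.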

Motzfeldt (2019) warns that the use of ADM tools in the welfare state will most likely be expensive. While such systems are often initiated to save time and replace (salary-demanding) human staff, the pilot project described above underlines that ADM cannot be a complete substitute for human staff, and it is therefore likely to add cost on top of existing salaries. Importantly, experiences with ADM-driven social services point to how professional staff tend to prioritise the ADM-score instead of their own judgement (Antczak & Birkholm, 2019; Eubanks, 2017; see also Oak, 2015); this risk seems heightened in contexts where resources are few, and there is little time to determine the reason behind the discrepancy between an ADM-score and one's own evaluation.

The datafication of public service media

Public service media (PSM) represent an important component of the welfare state. PSM are largely financed through public funding, with the remit to serve the public interest. They follow and express the ideals of the welfare state – namely equality and universality – in terms of access to information and representation. Changes in media logics related to datafication are aptly captured by Andrejevic (2020), who describes a shift from mass media to automated media. Automated media operate through ADM, platforms, and utilisation of data, with a variety of consequences for mutual recognition, collective deliberation, and judgment. At their core, automated media infrastructures based on datafication propel and shape media logics and content.

The datafication of media can be understood as a “political economic regime”, whereby the accumulation and analysis of data determines new ways of doing business and governance (Sadowski, 2019). This has several consequences for media operation, including the “platformisation” (Gillespie, 2010; Helmond, 2015) of the media environment, dominated by commercial platforms such as Google, YouTube, Twitter, Facebook, and Instagram. According to van Dijck and colleagues (2018), a platform is fuelled by data, automated and organised by algorithms and interfaces, formalised through ownership, and governed through user agreements. Platformisation is but one example of how communication and sociality have become formatted to enable datafication. As Andrejevic (2020) argues, automated systems promise to augment or displace the human role in communication, information processing, and decision-making. The implication of this is that mental labour, thought processes, evaluation, and judgement must be standardised and formatted, similar to how physical labour in factories was replaced by machines.

As a result of their datafied infrastructure, media now operate in drastically different ways, using detailed audience data to customise services. While data-driven media promise more accurate services and efficient operations, they are also highly problematic in terms of the normalisation and capitalisation of surveillance and the de-skilling of comprehension (Turow, 2011; Zuboff, 2019). Additionally, the datafication of media endangers a fundamental role of media in society: to foster solidarity and civic mindedness (Andrejevic, 2020; Couldry, 2012; Nikunen, 2019). In what follows, we discuss how the datafication of the media environment affects the principles and pillars of the Nordic data welfare state.

Nordic PSM have traditionally been a strong and important part of the so-called media welfare state (Syvertsen et al., 2014). Nordic PSM prided themselves on their early move towards digitalisation; however, they now find themselves surrounded by commercial platforms owned by tech companies. Though sharing PSM content on social media platforms appears necessary to reach audiences – particularly young audiences (Andersen & Sundet, 2019) – Nordic PSM reluctantly do so, as a side effect of this sharing is the involuntary support of commercial platforms, which blurs the boundaries of the public and private sector and constitutes a threat to the second pillar (decommodification). For this reason, and to maintain ownership and control, some Nordic PSM have adapted to datafication by creating their own platforms, collecting data, and using customised services (Hokka, 2018; Moe, 2013).

Most Nordic PSM employ audience metrics, user profiling, or algorithms to create content, customise distribution channels, and generate recommendation systems. In line with their early adoption of digital technologies, YLE (Finland), DR (Denmark), SVT (Sweden), and NRK (Norway) considered data-driven personalisation services highly relevant to their strategy (Andersson Schwarz, 2016; Van den Bulck & Moe, 2018). But how does data-driven personalisation affect the non-bias of media content – in other words, the universal principle of PSM? PSM justify the use of data-driven customised services as a “new universality”. However, personalised recommendation systems may also erode universalism and the idea of a shared public sphere (the first pillar of the media welfare state) by categorising and profiling audiences. In particular, recommendation algorithms allow audiences to filter out content and viewpoints they are not interested in, thereby creating “filter bubbles” (Pariser, 2011). The resulting customised – and therefore fragmented – media environment erodes the experience of an imagined community (Anderson, 1983), which is central to solidarity, trust, and civic mindedness. In other words, the infrastructures and architectures of digital media emphasise individual tastes and personalised brands over collectiveness and community (the first and third pillars of the media welfare state).

To counter these polarising tendencies, PSM have integrated “public service algorithms” into their recommendation processes (Beckett, 2020; Bennett, 2018; Nikunen & Hokka, 2020). For example, YLE uses algorithms that seek to diversify, rather than polarise, media use (Nikunen & Hokka, 2020).
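The diversifying intent of such a "public service algorithm" can be illustrated with a simple re-ranking step. This is a generic sketch, not YLE's actual system: the scoring scheme, penalty weight, and example data are our assumptions. The idea is to penalise candidate items from topic categories the user already consumes heavily, so the final list mixes familiar and unfamiliar content.

```python
# Generic sketch of a diversity-aware re-ranker; not the actual YLE system.
# Each candidate has a relevance score (e.g., from a recommender model) and
# a topic; items from topics the user already consumes heavily are
# down-weighted so unfamiliar topics surface in the recommendations.

from collections import Counter

def rerank_for_diversity(candidates, user_history, penalty=0.5):
    """Sort candidates by relevance minus an exposure-based penalty."""
    exposure = Counter(item["topic"] for item in user_history)
    total = max(sum(exposure.values()), 1)

    def adjusted(item):
        seen_share = exposure[item["topic"]] / total  # 0..1
        return item["relevance"] - penalty * seen_share

    return sorted(candidates, key=adjusted, reverse=True)

# Hypothetical user who watches mostly sports:
history = [{"topic": "sports"}] * 8 + [{"topic": "culture"}] * 2
candidates = [
    {"title": "Derby highlights",  "topic": "sports",  "relevance": 0.9},
    {"title": "Poetry special",    "topic": "culture", "relevance": 0.7},
    {"title": "Climate explainer", "topic": "science", "relevance": 0.55},
]

ranked = rerank_for_diversity(candidates, history)
```

In this sketch, the sports item has the highest raw relevance but drops to the bottom once the user's heavy sports exposure is penalised, which is the opposite of a pure engagement-maximising recommender.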

There are some indications that data-driven personalisation can also be used to better foster the diversity of audiences. Through personalisation, PSM have been able to recognise new audiences (Hokka, 2018) instead of addressing the abstract average citizen, who is unlikely to belong to a minority; however, data-driven personalisation may also create new forms of marginalisation and discrimination (Mann & Matzner, 2019). Commercial platforms such as Netflix have found a global niche in audiences who are interested in content produced by or representing ethnic and racial minority groups as well as gendered and sexual minorities. Nationally, these audiences may be small; considered globally, they are substantial. While catering to these audiences is important, the fact that it is driven by commercial interests, rather than by public values, tends to put emphasis on sensational and trendy elements that further marginalise those already in the margins (Saha, 2018). These challenges exemplify the entangled nature of the first and third pillars of the data welfare state in the context of media, where the question remains of how to ensure non-bias in data-driven systems while taking into account the different needs and aspects of a diverse society.

Andrejevic (2020: 49) suggests that, instead of focusing on how algorithms affect content discovery on media platforms, we should attend to the ways in which the “combination of platform logics and communicative practices with broader social policies undermines the conditions for democratic deliberation” (see also Baum & Groeling, 2008; Campbell, 2018; Pariser, 2011). For many, such a consideration culminates in a vision of the public sphere guided by commercial profit rather than public values. Indeed, datafication has intensified the entanglement of public and private media infrastructures, which takes us to the data welfare state's second pillar of decommodification: The domination of global platforms has given rise to a media ecosystem in which virtually all media platforms are dependent on the infrastructural services of global tech giants (van Dijck et al., 2018). As this applies to PSM, the entanglement with data-driven media platforms risks undermining the public's trust. Such entanglements also constitute a risk to sustainability (the fourth pillar) – as management and responsibility for the platform are outside the control of individual countries – as well as data diversity (the third pillar), as the data (and control over data) is gathered on very few platforms. While many European PSM allow for some private funding and advertising on their websites, Nordic PSM have tried to maintain independence (Sørensen & Van den Bulck, 2020). As discussed, some have developed their own platforms (YLE Areena in Finland, SVT Play in Sweden, NRK TV in Norway, and DR TV in Denmark); however, to be discovered by potential users, they must also appear on set-top box interfaces and third-party software (e.g., Apple TV, PlayStation, ElisaViihde [Finland], Strim [Norway], Tv Hub [Sweden], and YouSee [Denmark]).
Whenever PSM content and platforms become embedded in these interfaces, they must submit to their hosts’ datafied logics, marketing, and commercialisation; this challenges the second pillar (decommodification) of the data welfare state.

As very little is known about the data traffic in these contexts, transparency (the fourth pillar) is seriously challenged. Moreover, some Finnish, Swedish, and Danish PSM are connected to third-party servers that collect user data for advertising or data management purposes (Sørensen & Van den Bulck, 2020). While there is no evidence that audience data is gathered by these servers, it is nonetheless clear that PSM are significantly integrated into digital business networks, and it would be highly problematic for public trust if PSM user data were tracked and sold to third parties. In addition, data gathering on PSM remains modest compared with that of the tech giants; thus, PSM must seek new collaborations to access more data. These collaborations with commercial data-driven platforms further blur the line between public and private media, directly affecting the second pillar (decommodification) of the data welfare state.

In Sweden, public service radio has integrated the commercial music streaming service Spotify – a phenomenon referred to as the “Spotification” of public service media (Burkart & Leijonhufvud, 2019). Furthermore, some PSM (e.g., NRK) co-produce content with Netflix (Sundet, 2017). While datafication is embraced in PSM news services as an efficient way to reach and serve audiences, it is simultaneously seen as highly problematic for the PSM principles of universalism and equality (the first pillar). Growing dependence on and entanglement with commercial digital platforms generates uncertainty over data practices and trust. In recent years, Nordic PSM have paid more attention to the transparency of their data practices and the independence of their platforms. This reflects their strong investment in data-driven technologies and the challenges that these systems entail for PSM. As illustrated by the ADM unemployment services case, algorithmic operations represent trade secrets, and data collection practices are often hidden from the public. To serve public values, PSM must maintain transparency (the fourth pillar) in their operations and ethics with regard to their data practices.

It seems PSM experience various challenges in upholding the four pillars of the data welfare state. The second pillar of decommodification, in particular, appears difficult to foster in the current platformed media ecosystem. The effects of blurring the boundaries between the public and private sectors are illustrated in the ways in which PSM have adopted the logics of commercial platforms, with practices that challenge the non-bias and diversity principles (the first and third pillars) and undermine sustainability and transparency (the fourth pillar) through increasing dependency on powerful platforms.

The smart prison: Datafication of corrections

In the Nordic setting, corrections are part of the public sector – aiming to rehabilitate and resocialise individuals – and are often viewed as an example of Scandinavian exceptionalism (Pratt, 2008; Pratt & Eriksson, 2012). Similar to other areas of the welfare state, the corrections sector is increasingly implementing digital technology in order to become more efficient, particularly with respect to decision-making. In the Swedish context, the aim to “smartify” corrections – in part, through datafication – has materialised in the Krim:Tech initiative, which the Swedish Prison and Probation Service [Kriminalvården] launched in 2018. The main aim of Krim:Tech is to gather and recruit technology developers to renew and digitalise work with incarcerated individuals (Kaun & Stiernstedt, 2020). The Swedish Prison and Probation Service describes this in the following terms:

Krim:Tech is the new digitalisation initiative by The Swedish Prison and Probation Service. With the help of the latest technology and research, the initiative will support the development of new and improved digital solutions within the authority. Krim:Tech is an inventor's workshop and test bed for digital technology. Does an ankle monitor actually have to be an ankle monitor or could it be something else instead? How can we use IT to keep our security class 1 facilities calm? How can we prevent children and families who are visiting their father or mother in the prison from becoming afraid? Can we do security scans with a toy instead of metal detectors and full body scanners?

(The Swedish Prison and Probation Service, 2018)

This description of Krim:Tech reinforces the idea of renewing the entire organisation – including incarcerated individuals – with the help of smart data-based technology, while simultaneously considering the prison context as a test bed for new technologies (Kaun & Stiernstedt, 2021). This idea of renewal and even reinvention is emphasised in an unpublished policy document, shared with the authors, which carves out a digital agenda for The Swedish Prison and Probation Service. According to this agenda, there are five ambitions for smart technology within the sector: leveraging digital resources to overcome social isolation, preparing prisoners for life in the digital society, increasing efficiency through digital resources, preempting recidivism through digital resources, and striking an appropriate balance between security and the use of digital resources.

Besides the larger visions expressed by Krim:Tech and the digital agenda, there are also specific projects and implementations of digital technology in The Swedish Prison and Probation Service that illustrate particular aspects of datafication. One example, which is used to support probation services, is an application called Utsikt [View, Prospect, or Outlook] (Kaun & Stiernstedt, 2020). This tool, the first of its kind, was developed by Krim:Tech in 2015, with support from the state agency for innovation (Vinnova) (The Swedish Prison and Probation Services, 2015). During a trial period in early 2017, a group of 19 individuals on probation tested the application and provided feedback for improved functionality (The Swedish Prison and Probation Services, 2017). According to the project leader, Lena Lundholm, the application was designed to improve attendance at probation meetings and provide clients with preemptive exercises derived from cognitive training in challenging situations. Accordingly, the application includes breathing exercises for stressful encounters and links to hotlines for support during critical episodes. Furthermore, the tool enables users to track their moods and provides them with different scenarios for problem-solving.

Descriptions of the application emphasise that it is merely meant to prevent recidivism and not to control or supervise clients. Use of the application, which is available as a free download, is voluntary, but it is intended to complement the work of probation officers. The application requires an iPhone 5 or Android version 5 (or a later operating system), as well as an App Store or Google Play account. These preconditions are potentially challenging for some clients, especially those who have recently been released from long prison sentences (Jewkes & Reisdorf, 2016); this lack of access potentially challenges the first pillar of justice and non-bias in the processes of datafication.

Together with a probation officer, users input data on their rehabilitation process and support, as well as control measures, into the application. All data stored in the application are inaccessible to Prison and Probation Service officers. In fact, the only link between The Swedish Prison and Probation Service and the application is the automatic synchronisation of meetings via the calendar function. Beyond this, no data are saved, and it is the responsibility of the users themselves to back up their personal information. However, it is conceivable that, in the future, data stored in the application could be used to predict users’ recidivism or critical moments. This threatens transparency and sustainability (the fourth pillar) if users do not know how the information they enter into the application is used to predict future risks of recidivism.

Concern for the secure treatment of data is made very explicit in the application's promotional material. However, datafication concerns not only the treatment and usage of data, but also the transformation of complex processes – such as rehabilitation – into data; in the case of this application, this transformation involves the differentiation of rehabilitation into distinct periods, tasks, and risks to be mitigated, leaving little room for the user or probation officer to navigate. This highlights a similarity with automated decision-making in the employment services and the use of data-driven methods in PSM discussed above, in which human-centred approaches were also sidelined.

Following the principles of a data welfare state, datafication in the corrections sector needs to follow the four pillars of non-bias, decommodification, diversity, and transparency. More specifically, if data-based prison technologies are to avoid compromising the principles of the data welfare state, they would need to strive for nondiscrimination by avoiding specific biases (the first pillar), for example, by not predicting recidivism based on risk-scoring and historical data, instead allowing for rehabilitation and social mobility of incarcerated individuals. Additionally, datafied corrections within the welfare framework should strive for decommodification (the second pillar) by reducing the presence and influence of commercial actors in the corrections sector, including going beyond the problematic discourse of technological backwardness currently dominating and justifying datafication projects in public–private partnerships. In order to uphold the third pillar (data diversity), the corrections sector should strive to acknowledge diversity and work against algorithmic standardisation that overrides individual needs in, for example, rehabilitation. Lastly, the corrections sector should strive for transparency (the fourth pillar) in terms of meaningful information for incarcerated individuals on how decisions impacting their everyday life – for example, about their placement, work assignments, and programme activities – are made, and in which ways algorithmic systems influence these decisions.

Concluding remarks: Towards a data welfare state

While the three cases of datafication examined here – automated decision-making within employment services, data-driven methods within public service media, and the digitalisation of the corrections sector – might seem very different, they all represent central domains of the welfare state in which citizens must have a certain amount of trust. The Nordic countries have a long tradition of trust in social and political institutions, including a strong public belief that these institutions support and underpin general social equality. What further unites the three cases is that the automated and data-driven methods employed in each build on the ideology of dataism (van Dijck, 2014). As cited by the Danish Agency for Digitalisation, there is a general belief that ADM can deliver “higher quality” welfare state services (The Danish Government, 2019: 11), and that “algorithms will secure equal treatment by being objective, unbiased and independent from personal conditions” (The Danish Government, 2019: 7). As illustrated in the analyses above, the three cases highlight various risks that come with the embrace of datafication.

Furthermore, it is common for all three cases – as it is for datafication in the Nordic welfare states in general – that they must be understood in the wider context of financial pressure. Indeed, the fantasy that datafication will lead to faster and more efficient handling of public services goes hand in hand with the desire to reduce costs.

We have pointed out that justice and non-bias in processes of datafication constitute one of the pillars of the data welfare state. Our analysis has shown that automated decision-making comes with risks of injustice and bias. Another pillar discussed was decommodification. Our analysis of data-driven methods in the PSM domain shows that datafication processes often follow a commercial logic and hence, instead of contributing to decommodification, rather enhance commercialisation. Furthermore, datafication approaches within the welfare sector rarely explicitly enhance data diversity – meaning approaches that nuance difference – and instead reinforce the standardisation and flattening of identities. Lastly, datafication of welfare provision is often connected with issues of black-boxing and opacity regarding how, for example, automated decisions were reached. Furthermore, many automation and datafication projects do not explicitly take sustainability into consideration.

While the three examples reveal specificities of current datafication processes in the Nordic countries, they also illustrate how the digital imperative is intertwined with the ideology of dataism in the Nordic setting. While all the Nordic countries have explicit AI strategies and aim to be digital societies, the ideal of the digital and data-based welfare state has not yet been reached. We end this article by outlining principles, based on the four pillars, which must be met in order to create a data welfare state that corresponds with democratic welfare state ideals.

First, such principles must include the nondiscrimination of citizens affected by digital welfare technologies; this would imply the prevention of biases and discrimination encoded in digital infrastructures. Second, noncommercial forms of data capturing and the development of nonproprietary systems, guaranteeing fair use of citizen data, would be essential. Third, clear legal frameworks should be enacted to regulate datafication and data usage in emerging technologies, such as ADM and machine learning. Hitherto, Nordic governments have emphasised ethical guidelines and recommendations and have only just begun work on legal frameworks. Furthermore, when engaging in ethics, Nordic countries have focused on securing privacy, rather than engaging in broader issues of social justice and nondiscrimination. In addition, transparency is crucial for the ethical use and evaluation of data, and thus essential for any welfare state institution or operation. Fourth, policies to support and regulate datafication in the welfare state should be durable and consensual, as proposed for the media welfare state. As our case studies have demonstrated, there is a clear need for more sustainable, human-centred approaches.

The four pillars of non-bias, decommodification, diversity, and transparency should be seriously considered in any new digitalisation or automation project within the Nordic welfare states. Furthermore, any long-term datafication process for the public good should involve the active contribution of media and communication scholars, who are uniquely positioned to provide concrete suggestions based on critical, empirical research integrating the perspectives of citizens and vulnerable groups.

Adaptation of the four pillars of the media welfare state to the data welfare state

Media welfare state → Data welfare state
universal access to information → justice and non-bias in processes of datafication
editorial freedom → decommodification
content diversity → data diversity
durable and consensual policy solutions → transparency and sustainability

Altfer, B. (2020). Denmark: Research. In F. Chiusi, S. Fischer, N. Kayser-Bril, & M. Spielkamp (Eds.), Automating society report 2020 (pp. 52–61). Algorithm Watch, Berletsmann Stiftung. https://automatingsociety.algorithmwatch.org AltferB. 2020 Denmark: Research In ChiusiF. FischerS. Kayser-BrilN. SpielkampM. (Eds.), Automating society report 2020 52 61 Algorithm Watch, Berletsmann Stiftung https://automatingsociety.algorithmwatch.org Search in Google Scholar

Andersen, A. N. (2019). Danmark er ved at blive Europas Kina (når det handler om overvågning) [Denmark is becoming Europe's China (when it concerns surveillance)] [blog post]. [SDU blog]. https://holderdetibyretten.wordpress.com/2019/05/15/danmark-er-ved-at-blive-europas-kina-nar-det-handler-om-overvagning/ AndersenA. N. 2019 Danmark er ved at blive Europas Kina (når det handler om overvågning) [Denmark is becoming Europe's China (when it concerns surveillance)] [blog post]. [SDU blog] https://holderdetibyretten.wordpress.com/2019/05/15/danmark-er-ved-at-blive-europas-kina-nar-det-handler-om-overvagning/ Search in Google Scholar

Andersen, M., & Sundet, V. (2019). Producing online youth fiction in a Nordic public service context. Journal of European Television History and Culture, 8(16), 1–16. https://doi.org/10.18146/2213-0969.2019.jethc179 AndersenM. SundetV. 2019 Producing online youth fiction in a Nordic public service context Journal of European Television History and Culture 8 16 1 16 https://doi.org/10.18146/2213-0969.2019.jethc179 10.18146/2213-0969.2019.jethc179 Search in Google Scholar

Anderson, B. (1983). Imagined communities: Reflections on the origin and spread of nationalism. Verso. AndersonB. 1983 Imagined communities: Reflections on the origin and spread of nationalism Verso Search in Google Scholar

Andersson Schwarz, J. (2016). Public service broadcasting and data-driven personalization: A view from Sweden. Television & New Media, 17(2), 124–141. https://doi.org/10.1177/1527476415616193 Andersson SchwarzJ. 2016 Public service broadcasting and data-driven personalization: A view from Sweden Television & New Media 17 2 124 141 https://doi.org/10.1177/1527476415616193 10.1177/1527476415616193 Search in Google Scholar

Andreassen, R. (2020). digitale sorteringskategorier, affordances og interfaces: Online shopping of digitalt forbrug af sex og sæd [Digital sorting categories, affordances and interfaces: Online shopping of digital consumption of sex and semen]. In R. Andreassen, R. Rex, & C. Svabo (Eds.), Digitale liv. Brugere, Platforme og Selvfremstillinger [Digital life: Users, platforms and self-portrayals] (pp. 161–183). Roskilde Universitetsforlag. AndreassenR. 2020 digitale sorteringskategorier, affordances og interfaces: Online shopping of digitalt forbrug af sex og sæd [Digital sorting categories, affordances and interfaces: Online shopping of digital consumption of sex and semen] In AndreassenR. RexR. SvaboC. (Eds.), Digitale liv. Brugere, Platforme og Selvfremstillinger [Digital life: Users, platforms and self-portrayals] 161 183 Roskilde Universitetsforlag Search in Google Scholar

Andreassen, R. (2021). Social media surveillance, LGBTQ refugees and asylum: How migration authorities use social media profiles to determine refugees as ‘genuine’ or ‘fraud’. First Monday, 26(1–4). https://doi.org/10.5210/fm.v26i1.10653 AndreassenR. 2021 Social media surveillance, LGBTQ refugees and asylum: How migration authorities use social media profiles to determine refugees as ‘genuine’ or ‘fraud’ First Monday 26 1–4 https://doi.org/10.5210/fm.v26i1.10653 10.5210/fm.v26i1.10653 Search in Google Scholar

Andrejevic, M. (2013). Infoglut: How too much information is changing the way we think and know. Routledge. https://doi.org/10.4324/9780203075319 AndrejevicM. 2013 Infoglut: How too much information is changing the way we think and know Routledge https://doi.org/10.4324/9780203075319 10.4324/9780203075319 Search in Google Scholar

Andrejevic, M. (2020). Automated media. Routledge. https://doi.org/10.4324/9780429242595 AndrejevicM. 2020 Automated media Routledge https://doi.org/10.4324/9780429242595 10.4324/9780429242595 Search in Google Scholar

Antczak, H. & Birkholm, B. (2019). Når borgeren bliver til data… fordufter den etiske fordring [When the citizen becomes data… the ethical demand evaporates]. Uden for nummer, 39(19), 4–19. https://social-raadgiverne.dk/wp-content/uploads/2019/11/39-UdenForNummer.pdf AntczakH. BirkholmB. 2019 Når borgeren bliver til data… fordufter den etiske fordring [When the citizen becomes data… the ethical demand evaporates] Uden for nummer 39 19 4 19 https://social-raadgiverne.dk/wp-content/uploads/2019/11/39-UdenForNummer.pdf Search in Google Scholar

Baum, M., & Groeling, T. (2008). New media and the polarization of American political discourse. Political Communication, 25(4), 345–365. https://doi.org/10.1080/10584600802426965 BaumM. GroelingT. 2008 New media and the polarization of American political discourse Political Communication 25 4 345 365 https://doi.org/10.1080/10584600802426965 10.1080/10584600802426965 Search in Google Scholar

Beckett, C. (2020). An algorithm for empowering public service news [blog post]. The London School of Economic and Political Science. https://blogs.lse.ac.uk/polis/2020/09/28/this-swedish-radio-algorithm-gets-reporters-out-in-society/ BeckettC. 2020 An algorithm for empowering public service news [blog post] The London School of Economic and Political Science https://blogs.lse.ac.uk/polis/2020/09/28/this-swedish-radio-algorithm-gets-reporters-out-in-society/ Search in Google Scholar

Bennett, J. (2018). Public service algorithms. In D. Freedman, & V. Goblot (Eds.), The future of public service television (pp. 111–120). Goldsmiths Press. BennettJ. 2018 Public service algorithms In FreedmanD. GoblotV. (Eds.), The future of public service television 111 120 Goldsmiths Press 10.7551/mitpress/9781906897710.003.0013 Search in Google Scholar

Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, PMLR 81, 77–91. http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf BuolamwiniJ. GebruT. 2018 Gender shades: Intersectional accuracy disparities in commercial gender classification Proceedings of the 1st Conference on Fairness, Accountability and Transparency, PMLR 81 77 91 http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf Search in Google Scholar

Burkart, P., & Leijonhufvud, S. (2019). The Spotification of public service media. The Information Society, 35(4), 173–183. https://doi.org/10.1080/01972243.2019.1613706 BurkartP. LeijonhufvudS. 2019 The Spotification of public service media The Information Society 35 4 173 183 https://doi.org/10.1080/01972243.2019.1613706 10.1080/01972243.2019.1613706 Search in Google Scholar

Campbell, J. E. (2018). Polarized: Making sense of a divided America. Princeton University Press. CampbellJ. E. 2018 Polarized: Making sense of a divided America Princeton University Press Search in Google Scholar

Cheney-Lippold, J. (2017). We are data: Algorithms and the making of our digital selves. NYU Press. Cheney-LippoldJ. 2017 We are data: Algorithms and the making of our digital selves NYU Press 10.2307/j.ctt1gk0941 Search in Google Scholar

Constanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. MIT Press. Constanza-ChockS. 2020 Design justice: Community-led practices to build the worlds we need MIT Press 10.7551/mitpress/12255.001.0001 Search in Google Scholar

Couldry, N. (2012). Media, society, world: Social theory and digital media practice. Polity Press. CouldryN. 2012 Media, society, world: Social theory and digital media practice Polity Press Search in Google Scholar

Couldry, N., & Mejias, U. (2019). The costs of connection: How data is colonising human life and appropriating it for capitalism. Stanford University Press. CouldryN. MejiasU. 2019 The costs of connection: How data is colonising human life and appropriating it for capitalism Stanford University Press Search in Google Scholar

Dencik, L., Jansen, F., & Metcalfe, P. (2016, August 30). A conceptual framework for approaching social justice in an age of datafication. DATAJUSTICE Project. https://datajusticeproject.net/2018/08/30/a-conceptual-framework-for-approaching-social-justice-in-an-age-of-datafication/ DencikL. JansenF. MetcalfeP. 2016 August 30 A conceptual framework for approaching social justice in an age of datafication DATAJUSTICE Project. https://datajusticeproject.net/2018/08/30/a-conceptual-framework-for-approaching-social-justice-in-an-age-of-datafication/ Search in Google Scholar

Drucker, J. (2011). Humanities approaches to interface theory. Culture Machine, 12(1), 1–20. https://culturemachine.net/wp-content/uploads/2019/01/3-Humanities-434-885-1-PB.pdf DruckerJ. 2011 Humanities approaches to interface theory Culture Machine 12 1 1 20 https://culturemachine.net/wp-content/uploads/2019/01/3-Humanities-434-885-1-PB.pdf Search in Google Scholar

Esping-Andersen, G. (1990). The three worlds of welfare capitalism. Polity Press. Esping-AndersenG. 1990 The three worlds of welfare capitalism Polity Press Search in Google Scholar

Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. St Martin's Press. EubanksV. 2017 Automating inequality: How high-tech tools profile, police, and punish the poor St Martin's Press Search in Google Scholar

Gillespie, T. (2010). The politics of ‘platforms’. New Media & Society, 12(3), 347–364. https://doi.org/10.1177/1461444809342738 GillespieT. 2010 The politics of ‘platforms’ New Media & Society 12 3 347 364 https://doi.org/10.1177/1461444809342738 10.1002/9781118321607.ch28 Search in Google Scholar

Government Offices of Sweden. (2018). National approach to artificial intelligence. Ministry of Enterprise and Innovation. https://www.regeringen.se/4aa638/contentassets/a6488ccebc6f418e9ada18bae40bb71f/national-approach-to-artificial-intelligence.pdf Government Offices of Sweden 2018 National approach to artificial intelligence Ministry of Enterprise and Innovation https://www.regeringen.se/4aa638/contentassets/a6488ccebc6f418e9ada18bae40bb71f/national-approach-to-artificial-intelligence.pdf Search in Google Scholar

Heeks, R., & Renken, J. (2016). Data justice for development: What would it mean? Information Development, 34(19), 90–102. https://doi.org/10.1177/0266666916678282 HeeksR. RenkenJ. 2016 Data justice for development: What would it mean? Information Development 34 19 90 102 https://doi.org/10.1177/0266666916678282 10.1177/0266666916678282 Search in Google Scholar

Helmond, A. (2015). The platformization of the web: Making web data platform ready. Social Media + Society (July–December), 1–11. https://doi.org/10.1177/2056305115603080 HelmondA. 2015 The platformization of the web: Making web data platform ready Social Media + Society July–December 1 11 https://doi.org/10.1177/2056305115603080 10.1177/2056305115603080 Search in Google Scholar

Hokka, J. (2017). Making public service under social media logics. International Journal of Digital Television, 8(2), 221–237. https://doi.org/10.1386/jdtv.8.2.221_1 HokkaJ. 2017 Making public service under social media logics International Journal of Digital Television 8 2 221 237 https://doi.org/10.1386/jdtv.8.2.221_1 10.1386/jdtv.8.2.221_1 Search in Google Scholar

Hokka, J. (2018). Towards nuanced universality: Developing a concept bible for public service online news production. European Journal of Communication, 34(1), 74–87. https://doi.org/10.1177/0267323118810862 HokkaJ. 2018 Towards nuanced universality: Developing a concept bible for public service online news production European Journal of Communication 34 1 74 87 https://doi.org/10.1177/0267323118810862 10.1177/0267323118810862 Search in Google Scholar

Iliadis, A., & Russo, F. (2016). Critical data studies: An introduction. Big Data & Society, 3(2), 1–7. https://doi.org/10.1177/2053951716674238 IliadisA. RussoF. 2016 Critical data studies: An introduction Big Data & Society 3 2 1 7 https://doi.org/10.1177/2053951716674238 10.1177/2053951716674238 Search in Google Scholar

Jakobsson, P., Lindell, J., & Stiernstedt, F. (2021, September 29). A neoliberal media welfare state? The Swedish media system in transformation. Javnost – the Public. Advance online publication. https://doi.org/10.1080/13183222.2021.1969506 JakobssonP. LindellJ. StiernstedtF. 2021 September 29 A neoliberal media welfare state? The Swedish media system in transformation Javnost – the Public Advance online publication. https://doi.org/10.1080/13183222.2021.1969506 10.1080/13183222.2021.1969506 Search in Google Scholar

Jewkes, Y., & Reisdorf, B. C. (2016). A brave new world: The problems and opportunities presented by new media technologies in prisons. Criminology & Criminal Justice, 16(5), 534–551. https://doi.org/10.1177/1748895816654953 JewkesY. ReisdorfB. C. 2016 A brave new world: The problems and opportunities presented by new media technologies in prisons Criminology & Criminal Justice 16 5 534 551 https://doi.org/10.1177/1748895816654953 10.1177/1748895816654953 Search in Google Scholar

Johnson, J. (2014). From open data to information justice. Ethics and Information Technology, 16(4), 263–274. https://doi.org/10.1007/s10676-014-9351-8 JohnsonJ. 2014 From open data to information justice Ethics and Information Technology 16 4 263 274 https://doi.org/10.1007/s10676-014-9351-8 10.1007/s10676-014-9351-8 Search in Google Scholar

Kannabiran, G., & Petersen, M. G. (2010, October). Politics at the interface: A Foucauldian power analysis. Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, NordiCHI 2010, 695–698. https://doi.org/10.1145/1868914.1869007 KannabiranG. PetersenM. G. 2010 October Politics at the interface: A Foucauldian power analysis Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, NordiCHI 2010 695 698 https://doi.org/10.1145/1868914.1869007 10.1145/1868914.1869007 Search in Google Scholar

Kaun, A., & Stiernstedt, F. (2020). Doing time, the smart way? Temporalities of the smart prison. New Media and Society, 22(9), 1580–1599. https://doi.org/10.1177/1461444820914865 KaunA. StiernstedtF. 2020 Doing time, the smart way? Temporalities of the smart prison New Media and Society 22 9 1580 1599 https://doi.org/10.1177/1461444820914865 10.1177/1461444820914865 Search in Google Scholar

Kaun, A., & Stiernstedt, F. (2021, June 25). Prison tech: Imagining the prison as lagging behind and as a test bed for technology advancement. Communication, Culture & Critique, 1–15. Advance online publication. https://doi.org/10.1093/ccc/tcab032 KaunA. StiernstedtF. 2021 June 25 Prison tech: Imagining the prison as lagging behind and as a test bed for technology advancement Communication, Culture & Critique 1 15 Advance online publication. https://doi.org/10.1093/ccc/tcab032 10.1093/ccc/tcab032 Search in Google Scholar

Kofoed-Hansen, L., & Søndergaard, M. L. J. (2017). Designing with bias and privilege? Nordes 2017, (7). http://www.nordes.org Kofoed-HansenL. SøndergaardM. L. J. 2017 Designing with bias and privilege? Nordes 2017 7 http://www.nordes.org Search in Google Scholar

Mann, M., & Matzner, T. (2019). Challenging algorithmic profiling: The limits of data protection and anti-discrimination in responding to emergent discrimination. Big Data & Society, (July–December), 1–11. https://doi.org/10.1177/2053951719895805 MannM. MatznerT. 2019 Challenging algorithmic profiling: The limits of data protection and anti-discrimination in responding to emergent discrimination Big Data & Society July–December 1 11 https://doi.org/10.1177/2053951719895805 10.1177/2053951719895805 Search in Google Scholar

Metcalfe, P., & Dencik, L. (2019). The politics of big borders: Data (in)justice and the governance of refugees. First Monday, 24(4). http://dx.doi.org/10.5210/fm.v24i4.9934 MetcalfeP. DencikL. 2019 The politics of big borders: Data (in)justice and the governance of refugees First Monday 24 4 http://dx.doi.org/10.5210/fm.v24i4.9934 10.5210/fm.v24i4.9934 Search in Google Scholar

Milan, S., & van der Velden, L. (2016). The alternative epistemologies of data activism. Digital Culture & Society, 2(2), 57–74. https://doi.org/10.14361/dcs-2016-0205

Ministry of Economic Affairs and Employment, Finland. (2017). Finland's age of artificial intelligence: Turning Finland into a leading country in the application of artificial intelligence: Objective and recommendations for measures. Publications of the Ministry of Economic Affairs and Employment 47/2017. https://julkaisut.valtioneuvosto.fi/bitstream/handle/10024/160391/TEMrap_47_2017_verkkojulkaisu.pdf

Ministry of Finance, Finland. (2021). National artificial intelligence programme AuroraAI. https://vm.fi/en/national-artificial-intelligence-programme-auroraai

Moe, H. (2013). Public service broadcasting and social networking sites: The Norwegian Broadcasting Corporation on Facebook. Media International Australia, 146(1), 114–122. https://doi.org/10.1177/1329878X1314600115

Motzfeldt, H. M. (2019). Socialrådgivere og fagforeninger: Grib dog redningskransen! [Social workers and trade unions: Grab the lifebuoy!]. Uden for nummer, 39, 32–40.

Mploy. (2018, September 26). Evaluering af projekt ”Samtaler og indsats der modvirker langtidsledighed” [Evaluation of the project “Conversations and efforts that counteract long-term unemployment”]. Styrelsen for Arbejdsmarked og Rekruttering [The Danish Agency for Labour Market and Recruitment]. https://star.dk/media/8004/evaluering-af-projekt-samtaler-og-indsats-der-modvirker-langtidsledighed.pdf

Nikunen, K. (2019). Media solidarities: Emotions, power and justice in the digital age. Sage. https://doi.org/10.4135/9781529715019

Nikunen, K., & Hokka, J. (2020). Welfare state values and public service media in the era of datafication. Global Perspectives, 1(1), 12906. https://doi.org/10.1525/gp.2020.12906

Noble, S. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.

Oak, E. (2015). A minority report for social work? The predictive risk model (PRM) and the Tuituia assessment framework in addressing the needs of New Zealand's vulnerable children. The British Journal of Social Work, 46(5), 1208–1223. https://doi.org/10.1093/bjsw/bcv028

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Viking.

Pratt, J. (2008). Scandinavian exceptionalism in an era of penal excess: Part I: The nature and roots of Scandinavian exceptionalism. The British Journal of Criminology, 48(2), 119–137. https://doi.org/10.1093/bjc/azm072

Pratt, J., & Eriksson, A. (2012). In defence of Scandinavian exceptionalism. In T. Ugelvik, & J. Dullum (Eds.), Penal exceptionalism? Nordic prison policy and practice (pp. 235–260). Routledge. https://doi.org/10.4324/9780203813270

Retsinformation [legal information]. (2019). Lov om aktiv beskæftigelsesindsats [Law on active support for employment]. https://www.retsinformation.dk/eli/lta/2019/548#idefaafed5-e8e7-4062-abf9-c62a4c18d707

Sadowski, J. (2019). When data is capital: Datafication, accumulation, and extraction. Big Data & Society, (January), 1–12. https://doi.org/10.1177/2053951718820549

Saha, A. (2018). Race and the cultural industries. Wiley.

Sun, H., & Hart-Davidson, W. F. (2014, April). Binding the material and the discursive with a relational approach of affordances. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 3533–3542. https://doi.org/10.1145/2556288.2557185

Sundet, V. (2017). Co-produced television drama and the cost of transnational ‘success’: The making of Lilyhammer. In E. Bakoy, R. Pujik, & A. Spicer (Eds.), Building successful and sustainable film and television businesses (pp. 67–88). University of Chicago Press.

Syvertsen, T., Enli, G., Mjøs, O. J., & Moe, H. (2014). The media welfare state: Nordic media in the digital era. University of Michigan Press. https://doi.org/10.3998/nmw.12367206.0001.001

Sørensen, J. K., & Van den Bulck, H. (2020). Public service media online, advertising and the third-party user data business: A trade versus trust dilemma? Convergence, 26(2), 421–447. https://doi.org/10.1177/1354856518790203

Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, 4(2). https://doi.org/10.1177/2053951717736335

The Danish Government [Regeringen]. (2019). National strategi for kunstig intelligens [National strategy for artificial intelligence]. Finansministeriet & Erhvervsministeriet [Ministry of Finance & Ministry of Business]. https://www.regeringen.dk/media/6537/ai-strategi_web.pdf

The Danish Parliament [Folketinget]. (2019). Forslag til Lov om en aktiv beskæftigelsesindsats [Draft law on active support for employment]. Danish Parliament records. https://www.ft.dk/ripdf/samling/20181/lovforslag/l209/20181_l209_som_fremsat.pdf

The Swedish Prison and Probation Service [Kriminalvården]. (2015, December 8). Unik satsning på frivårdsapp [Unique investment in probation app] [Press release].

The Swedish Prison and Probation Service [Kriminalvården]. (2017, July 11). Frivårdsappen får tummen upp av användare [Probation app gets thumbs up from users] [Press release].

The Swedish Prison and Probation Service [Kriminalvården]. (2018). Kriminalvården storsatsar digitalt – och söker nya medarbetare [The Swedish Prison and Probation Service is investing heavily in digital – and is looking for new employees] [Press release]. https://www.kriminalvarden.se/globalassets/kontakt_press/pressmeddelanden/krimtech-klar.pdf

Turow, J. (2011). The daily you: How the new advertising industry is defining your identity and your worth. Yale University Press.

Ustek-Spilda, F., & Alastalo, M. (2020). Software-sorted exclusion of asylum seekers in Norway and Finland. Global Perspectives, 1(1), 12978. https://doi.org/10.1525/gp.2020.12978

Van den Bulck, H., & Moe, H. (2018). Public service media, universality and personalisation through algorithms: Mapping strategies and exploring dilemmas. Media, Culture & Society, 40(6), 875–892. https://doi.org/10.1177/0163443717734407

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. https://doi.org/10.24908/ss.v12i2.4776

van Dijck, J., Poell, T., & de Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press.

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books.
