Should we be concerned about the use of private tech in the management of migration?


Digital technology now plays an important role in migration and border enforcement across Europe. Ludivine Sarah Stewart writes that while this technology comes with certain advantages, there is a risk that over-reliance on private technology companies for the management of migration could undermine public values such as accountability and participation.

As digitisation and new technologies such as artificial intelligence (AI) rise to the top of national agendas, public authorities are increasingly turning to private companies to provide digital solutions, including for core state functions. While digital transformation plays an important role in modernising the internal administration of states, reliance on private actors can significantly reshape power structures and affect the individuals exposed to new technologies.

Private tech providers and migration

Although control over migration has often been regarded “as a prime example of the exercise of state sovereignty via state power”, the presence of private actors in this sector is far from unprecedented. Examples of direct and indirect privatisation and the issues they raise for accountability for human rights violations are already well documented in academic literature. While a tendency towards privatisation of migration is already clearly established, the demand for digital solutions exacerbates this trend by paving the way for further participation of a specific category of private actors: digital tech providers.

Among the various technologies implemented in the field of migration and asylum in Europe identified in a recent report, a significant number have been developed by private entities. In the United Kingdom, non-governmental organisations (NGOs) such as Privacy International have drawn attention to the extensive role of private companies as providers of technology used in migration and border enforcement. In 2022, they revealed that the UK government awarded a £6 million contract to the company Buddi Limited for the development of “non-fitted devices”, or smartwatches, for the purposes of migration enforcement, a technology denounced as “dehumanising and invasive”.

Further examples can be found in Germany. Since 2015, Germany has sought to address internal challenges through digitisation and new technologies. This includes a contested dialect recognition mechanism designed to assist decision-makers in identifying the origin of asylum seekers. According to a response to a written question from members of the German Federal Parliament, the provider of the system is the company Nuance and the system is made available to the state through a licence.

Among the various AI systems Germany has deployed with the involvement of external actors is a profile analysis system for asylum applications. More information is needed on the functioning of this system, which had already attracted criticism prior to its implementation. According to the information released, the pilot project was developed with the help of the German company SVA and is intended to flag potentially safety-relevant information based on hearing transcripts.

Complexity and risks for individuals

Digitisation and the deployment of new technology have the potential to add another layer of complexity to the entanglement of public and private actors in migration. Lessons can be drawn from the so-called “English test scandal” in the United Kingdom, which concerned English language tests taken to meet visa requirements. Following revelations of fraud in tests processed by the private organisation Educational Testing Service (ETS), the organisation introduced a new voice recognition system, combined with human checks, to identify cheating.

The Home Office subsequently revoked the visas of many students. However, according to an assessment conducted by the National Audit Office, the technology was being used for the first time and there was no “piloting or control group”. The report highlighted a lack of expertise on the part of the public sector, which in this case “relied on assurances from ETS that the voice recognition technology was suitable for the task”. It also highlighted the challenges faced by individuals whose visas were revoked and who were often unable to access data from their tests.

Protecting public values

Reliance on digital solutions provided by private actors raises important questions regarding public values such as accountability and participation. In the field of migration, scholars have already warned about the public sector’s overreliance on the private digital sector as a strategy to avoid responsibility.

This could exacerbate relations of dependency, especially if the public sector does not have the necessary expertise to exert adequate control over the technology, as in the English test scandal – a problem also apparent in the digitisation of justice. Importantly, not all technologies present similar risks, and it is important to distinguish between systems that pose a risk to individuals and others that are intended to be used solely for internal management purposes.

Moreover, collaboration between the public and private sectors may range from formal delegations to informal partnerships, which can affect access to information and raise questions of transparency. Finally, the demand for digital solutions also creates a lucrative market for private companies, with the state taking on the role of a “market maker”.

This article is the third in a series of blog pieces drawing on research from the DigiPublicValues: Preserving Public Values in Privatised Digital Systems project – a joint CIVICA research project by the London School of Economics and Political Science (LSE), Università Bocconi, European University Institute and the Hertie School’s Centre for Digital Governance.

Written by Ludivine Sarah Stewart (EUI). 

Photo credits: Daniel Schludi (unsplash).