Prevent Watch’s Head of Research, Alim Islam, compares China’s counter-extremism policies to those in the UK and concludes they only differ in degree and scale.
In 2019, it emerged that UK counter-extremism “experts” had been working with China to demonstrate “best practice” in the use of “counter extremism” policies. The venture was funded by the Foreign and Commonwealth Office.
The counter-extremism “think tank” the Royal United Services Institute (RUSI) ran a two-day event at taxpayers’ expense on “countering the root causes of violent extremism undermining growth and stability in China’s Xinjiang Region” and “to demonstrate the effectiveness of UK best practice in CVE and identify ways this can be adopted in China”.
At the time, Senior Associate Fellow Raffaello Pantucci claimed that RUSI was in China “to use British experts to influence Chinese policies” and that this had happened before the “situation in Xinjiang worsened.” The conference happened in December 2016.
Although China’s “re-education” camps for its Muslim Uyghur people were officially documented by the UN, Amnesty International and Human Rights Watch in 2018, Human Rights Watch reported that the incarceration and “re-education” of Muslim Uyghur people began in May 2014, driven by China’s Strike Hard Campaign.
With Pantucci and his colleagues at RUSI and the Foreign and Commonwealth Office visiting China in 2016, two years after the inception of this programme of “re-education” of Muslims through incarceration, surely they must have known about it?
Since then, Beijing’s “re-education” of Muslims has included allegations of systematic rape and of physical and other sexual abuse, aimed at forcing detainees to relinquish their religion.
It is secularism at its most rabid, and its engine is a programme of surveillance, data gathering and sharing to target individuals, their families and their networks of friends and associates. Its particularly shocking missionary zeal turns on the fulcrum of education, or more precisely “re-education.”
The reports from China are shocking. But how is their programme different from what happens in the UK? Are the methods shared and their outcomes different simply in degree and scale? If so, what does this mean for the UK?
Prevent’s collection and retention of data on children
Government figures released in November 2021 for the period April 2020 to March 2021 showed that 4,915 people were referred to Prevent. This is a decrease from the 6,287 referrals the previous year, likely due to the coronavirus pandemic, and the lowest number of referrals since the year ending March 2016.
Of those referred whose ages were known, the 15-20 age category made up the highest proportion of individuals referred: 1,398 young people, or 29%. The police accounted for 36% of referrals, the highest proportion of any sector.
Despite schools being closed for much of the year, the education sector still accounted for 25% of referrals. Individuals under the age of 15 accounted for 20% of referrals.
However, these Prevent referrals do not include instances where a child has been subjected to Prevent-type questioning, where they have been asked about their religious and political beliefs, but no formal referral has been made. Based on cases Prevent Watch has worked on, these non-referrals can have the same chilling effect and trauma as a referral.
The Prevent budget since the financial year 2015/16 has been an average of £44m a year. While the budget for Prevent has remained similar since this time, there have been wholesale cuts to healthcare.
For example, in the same period of time public health grants have been cut by 24%, approximately £1 billion. One question that arises is whether spending money on a failed policy is the best way to use the public purse.
Schools as Britain’s collecting points of data on Muslim children
Under Prevent, schools are required to teach “British values.” Indeed, schools are a primary site of both British “re-education” and surveillance.
When an 8-year-old Muslim boy was separated from his classmates and – without his parents or any caring adult guardian present – asked about Islam, the mosque he attends, whether he prays and his views on other religions, as well as being asked to recite verses from the Quran, Pantucci – of RUSI, which advised China on counter-extremism policy – said that he “understood” why this would be necessary.
When a mother of three children, all under the age of 8, approached Prevent Watch for assistance, she told us that her children had been approached by Prevent and questioned at school, and that she had been so traumatised by the intrusion that she could not leave her home for two months.
“The officer gestured with his finger that I should not even try to enter the room where they were interviewing my children,” she said. “The social worker was present, but he made no attempt to inform me about the process, nor did he appear to consider the impact on my children.”
For any child – let alone three siblings – to be put in such a situation based on imprecise, unclear, and indefinite meanings of “extremism” and “radicalisation,” which have been criticised by the courts, is naturally the cause of great anxiety to parents, and deeply damaging to trust.
As the mother told us: “I was not informed about what my children were being questioned about, and the nature of the questions. I was very concerned whether my children had understood the questions. I was also really worried that they had been asked misleading questions or had been unduly pressured.”
The application of Prevent may be lawful in letter, but it is lawless in nature. It allows the police to become involved where no crime is evident, even in cases involving children. Children are criminalised and their data is collected.
Difficulty removing names from databases
The mother in the case above told Prevent Watch that she received the report outlining the “concerns” that Prevent had about her children (all under the age of eight) nearly five months later.
At that point, she found that her information and that of her children had been shared and discussed among different public authorities to assess whether intervention via the government’s flagship “deradicalisation” programme, Channel, needed to take place.
This had all happened without her knowledge. Neither she nor her husband had been given the opportunity to challenge the process, see the data collected on them or their children, understand how it was being interpreted, or know with whom it would be shared.
After three years of a protracted and frustrating complaints process to the local authority, the chief concern of both parents remains centred on how the retention of their data has impacted their children and will continue to impact their lives.
Much of their concern is centred on what data has been collected on them, how it is being interpreted by algorithms, with whom it is being shared, and how long it will take to be removed.
These concerns are justified. Our cases show repeatedly that the data collected through Prevent, often on young people who may be subject to questioning without parental consent or presence, is difficult to challenge or access except in a few isolated cases.
In one such case, had the parents not taken legal action to clear their son’s name after no concerns were raised, their son’s data would have been retained for at least six years under the police’s national retention assessment criteria (NRAC) policy, which does not differentiate between records of adults and children.
This means a Prevent interview or referral is a means of gathering data on Muslim children and placing this data on criminal record databases – in the case linked above, the parents were shocked to discover their primary school-aged son’s name was not on just one database, but on ten.
In this case, when asked, the police refused to give any reassurance to the parents that their son’s records would not be used again, and they said they could not guarantee that the data would not feature in any criminal record checks.
This, even though there had been no concerns of “extremism” or “radicalisation.”
Among adults, Prevent Watch has documented several cases where data collection and retention have had a serious impact on lives, including a young adult whose place at college was withdrawn after information about his Prevent referral was shared between his secondary school and the college, despite nothing having come of it.
Islamophobia and algorithm prediction
When data is collected and retained as aggressively as it is under Prevent, it creates an opportunity for data on individuals – often young people – to be trawled, collated and interpreted.
This creates a sort of new tyranny – that of “big data” – under which people must live with the knowledge that they are under the perpetual trawling of dubious algorithms.
The first test case for this “new normal” is Muslims: a recent article in Vox highlighted the intensely Islamophobic nature of artificial intelligence (AI), the so-called “brain” behind the algorithms that sort and “interpret” data, and – most troublingly – which are now being used to problematise behaviour and predict crime, all under the banner of “counter extremism.”
Prevent’s debunked escalator theory, more commonly known as the conveyor belt theory, according to a report by the Joint Committee on Human Rights, “rests on the assumption that there is an escalator which starts with religious conservatism and ends with support for violent jihadism.”
However, the term “religious conservatism” is problematic; it is subjectively interpreted and often through a lens of fear and misunderstanding of Islam.
Records revealed by CNN showed that Muslims in China were sent to a detention facility for simply “wearing a veil” or growing “a long beard.”
Prevent Watch has cases where women have been reported after starting to wear a head covering; indeed, one Prevent training slide deck produced by the Home Office presents a woman adopting more modest dress as a cause for “concern” and as “vulnerable to extremism.”
Data is difficult to remove from the surveillance state
Human Rights Watch reports that Beijing’s predictive policing programme, the IJOP, is fed by aggressive data collection, including “DNA samples, fingerprints, iris scans, and blood types of all residents between the age of 12 and 65”, particularly of Uyghur Muslims.
As in the UK, “it is unclear exactly how authorities are using the biometrics, but the amount of information they have on people is enough to frighten many… particularly given that they have no ability to challenge the collection, use, distribution, or retention of this data”.
According to HRW, procurement notices for IJOP show that it is supplied by the Xinjiang Lianhai Cangzhi Company, a wholly owned subsidiary of China Electronics Technology Group Corporation, a major state-owned military contractor in China, which built the big data program to “collate citizens’ everyday behaviour and flag unusual activities to predict terrorism.”
Journalists should be encouraged to investigate links between these companies and entities in the UK. Like Prevent, the IJOP is an integrated, multi-agency operation. It is sold to the public in terms of patriotism (recall: “British values”) and keeping people safe, but it is a military programme.
All of this should also point us not primarily at Islamophobia and the need to get rid of Prevent, but at the deeper, longer-term question of big data: who regulates it, and how is it used?
For now, urgent questions must be asked about what accountability mechanisms are being put in place to check programmers and others who are deciding for us who “might be” a criminal within the toxic framework of Prevent and “counter extremism,” especially when these are our children.