By Local Democracy Reporter
A RIGHTS group has warned the privacy of Thurrock residents could be at risk due to the council’s use of a computer system that aims to predict who needs social care, who may become a victim of crime and who could end up sleeping on the streets.
The computer system was developed by a private company called Xantura and Thurrock has spent the past four years using it to predict if a child is at risk of abuse so social services can intervene early.
Last year the council agreed to expand its use, bringing the budget to £1.14 million.
Freedom of information requests by the Local Democracy Reporting Service have revealed that the system, which the council calls a “predictive modelling platform”, is starting to be tested by departments that deal with housing, community safety and benefits.
In the same way that social services receive notifications when the system believes a child could be at risk of abuse, data on anti-social behaviour could soon be used to warn the community safety department which residents could become victims of crime.
The housing and benefits departments could also be warned which residents could wind up homeless or fall into significant financial trouble, using council-stored data on debt.
Other data that could be fed into the system includes council data on school attendance records, domestic abuse history and youth offending records.
Privacy advocacy organisation the Open Rights Group has warned the use of the system could be putting residents at risk.
Policy director Javier Ruiz Diaz said: “This kind of predictive analytics has a huge potential to stigmatise and consequently has a very high privacy risk.
“Thurrock County Council should explain what specific piece of legislation enables them to do this. The law says that they should also provide people affected with clear information on how they use their data and the consequences.
“They should also give individuals the opportunity to ask for human review if they have been automatically referred, and in some cases they might be able to opt out altogether.
“Beyond what the law says, this is fundamentally wrong. Using data in this form to classify people’s alleged future behaviour is precisely the kind of dystopian future we want to avoid.
“Councils claim that they need to save money, but we have not seen proper evidence that this happens consistently. In most cases we think that IT vendors like Xantura are selling councils the promise of technological fixes to complex social problems.
“Removing the human judgment of social workers and council employees is not the solution.”
A spokesman for the council said tests are still underway on how the system will be used in relation to anti-social behaviour and homelessness. He stressed that data protection assessments have been carried out and the council is “satisfied that its modelling systems are compliant with all data protection legislation and best practice”.
He continued: “At a time when demand on services such as housing and social care is rising nationally, the use of data analytics provides Thurrock Council with an opportunity to better identify those most vulnerable in need of help, make sure they get the most appropriate support before a potential problem escalates, and help reduce the need for more significant or complex interventions which can be distressing for individuals and families.
“It is not a case of judging people, rather it helps enable the council to get the right help to those who need it sooner.
“It is important to emphasise that data analytics systems are only part of the process and further verification and checks by professional staff are carried out prior to any interventions. All modelling is done using data the council or its partners already hold.”
A council report published in July last year stated that analytics had been responsible for identifying 100 per cent of families referred to the council’s troubled families programme, and the system is also being used to provide alerts to safeguarding teams with an 80 per cent success rate.