Have you ever messed up so badly at work that 1,000 experts band together to tell your publishers to kibosh the upcoming publication and exhaustively explain why? No? Well, spare a thought for Harrisburg University, who find themselves in this exact situation today.

In an upcoming volume to be published by Springer Nature, Transactions on Computational Science & Computational Intelligence, the team from Harrisburg University outlined a system they built that they claimed (in a press release that has since been taken down) could work "With 80 percent accuracy and with no racial bias. The software can predict if someone is a criminal based solely on a picture of their face. The software is intended to help law enforcement prevent crime."

Alarmed by the many and immediate problematic assumptions and repercussions of using "criminal justice statistics to predict criminality," experts from a wide range of technical and scientific fields including statistics, machine learning, artificial intelligence, law, history, and sociology responded in an open letter, categorically stating:

" Let ’s be clear : there is no way to develop a system that can predict or describe ' criminalism ' that is not racially biased — because the class of ' criminality ' itself is racially biased , " adding   " datum generate by the condemnable justice system can not be used to “ identify criminals ” or predict reprehensible demeanor . Ever . "

The authors of the letter write that research like this rests on the assumption that data on criminal arrests and convictions are "reliable, neutral indicators of underlying criminal activity," rather than a reflection of the policies and practices of the criminal justice system, and all the historical and current biases within it.

" Countless studies have shown that people of color are treat more harshly than similarly situated white people at every stage of the legal system , which results in serious distorted shape in the data , " the group calling themselves the Coalition for Critical Technology write .

" Thus , any software system built within the exist criminal legal framework will inevitably echo those same prejudices and key inaccuracy when it come to determining if a someone has the ' face of a criminal . ' "

Essentially – as with so many other forms of technology – the system will replicate the inherent racial biases of the data it's been fed. The system would identify the face of someone who the police may profile, a jury may convict, and a judge may sentence. All of which is tainted by prejudice.

The letter points out that "police science" has been used as a way to excuse racially discriminatory practices. Despite being "debunked numerous times throughout history, they continue to resurface under the guise of cutting-edge techno-reforms, such as 'artificial intelligence.'"

The letter asserts that any AI systems claiming to predict criminal behavior from physical characteristics are a continuation of the long-discredited pseudoscience of phrenology. As well as being used by Europeans as a "scientific" justification for their racist belief in their superiority over non-white people, the authors state that phrenology and physiognomy were and are "used by academics, law enforcement specialists, and politicians to advocate for oppressive policing and prosecutorial tactics in poor and racialized communities."

The Coalition for Critical Technology asks that Springer Nature condemn the use of criminal justice statistics to predict criminality and acknowledge its role in "incentivizing such harmful scholarship in the past".

A statement from Harrisburg University says that "All research conducted at the University does not necessarily reflect the views and goals of this University," and the faculty are "updating the paper to address concerns raised".