Conference: ‘‘Technological innovation and challenges in prevention of gender-based violence: the case of the VioGén system’’.
Elisa Simó Soler
Assistant Professor of Procedural Law, Universitat de València
Brief introduction to the VioGén system
The VioGén system (Sistema de Seguimiento Integral en los casos de Violencia de Género; in English, the integral monitoring system in cases of gender-based violence) was implemented in 2007 by the Secretariat of State for Security of the Ministry of the Interior, within the framework of Organic Law 1/2004, of 28 December, on Integral Protection Measures against Gender-based Violence, in order to predict the risk of reoffending in gender-based violence cases and to coordinate the protection of victims.
Even though the result of VioGén is shared among different public services (law enforcement and security forces, judicial bodies, the prison system, healthcare, social services, equality bodies), until 2020 it could only be executed by the police at the moment a woman went to the police station to report. Using statistical models, the aim is to obtain a level of reoffending risk -unrated, low, average, high or extreme- which is accompanied by specific measures that the police must adopt, so that the system performs a predictive but also a preventive task. In addition, the VioGén result is attached to the police report for the attention of the judicial body. Since 2020, it can also be used by the personnel assigned to the Institutes of Legal Medicine and Forensic Sciences at the request of the judge.
It must be said that VioGén is an actuarial system consisting of two forms: VPR 5.0 (with 35 risk indicators), whose execution yields the aggressor's risk of reoffending, and VPER 4.1 (with 37 risk indicators), used to track the evolution of that risk. Since its last update in 2019, the VPR 5.0 form includes a Scale H (designed with 13 indicators) to detect cases involving homicide risk. It is a second algorithm, which runs at the same time as the VPR and sends out an automatic alert when it detects the so-called ‘‘especially relevant cases’’. Its activation entails an automatic increase of the risk level, to ‘‘average’’ or ‘‘high’’, when the initial reoffending result was ‘‘unrated’’ or ‘‘low’’. Moreover, it incorporates specific warnings for ‘‘minors in a vulnerable situation’’ and ‘‘minors at risk’’.
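The two-stage logic described above (a VPR risk level that a parallel Scale H alert can automatically raise) can be sketched in a few lines. This is a minimal illustration only: the real indicator weights, thresholds and the exact target level of the escalation have not been made public, so everything below is an assumption about the rule's shape, not a reproduction of VioGén.

```python
# Hypothetical sketch of the Scale H escalation rule described above.
# The escalation target ("average") is an illustrative assumption; the
# real VPR 5.0 / Scale H weighting is not publicly documented.

RISK_LEVELS = ["unrated", "low", "average", "high", "extreme"]

def escalate(vpr_level: str, scale_h_alert: bool) -> str:
    """Apply the automatic escalation triggered by a Scale H alert.

    When Scale H flags an 'especially relevant case' and the initial
    VPR result is 'unrated' or 'low', the level is raised automatically;
    otherwise the VPR result stands.
    """
    if vpr_level not in RISK_LEVELS:
        raise ValueError(f"unknown risk level: {vpr_level}")
    if scale_h_alert and vpr_level in ("unrated", "low"):
        return "average"  # assumed target level of the automatic increase
    return vpr_level

print(escalate("low", scale_h_alert=True))    # level is raised
print(escalate("high", scale_h_alert=False))  # level is unchanged
```

The point of the sketch is structural: the homicide-risk alert operates as a hard floor on the output of the reoffending model, independently of how the 35 VPR indicators are weighted.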
The VioGén system is an example of how technology -and predictably Artificial Intelligence (AI) in the future- has begun to have a meaningful impact on the Administration of Justice. In 2020 the National Court took a stance in the most significant judicial case related to VioGén to date.
In its judgment of 30 September 2020, the Contentious-Administrative Chamber of the National Court declared the State liability of the Public Administration for an error in the risk evaluation that failed to prevent the murder of a woman by her partner. The VioGén system had assigned an ‘‘unrated’’ risk level, which the Guardia Civil officers maintained without carrying out additional investigations, despite the existence of sufficient evidence to extend the investigation. This evaluation was decisive in the judge's refusal of the protection order the victim had requested. A month later, the woman was murdered.
When evaluating how this model works, we can identify several issues that merit reflection.
Identified issues
Lack of transparency
The first of the weaknesses of VioGén -and one that could become a threat- is its lack of transparency. As Martínez Garay et al. (2022) note, it is crucial to have wider knowledge of how it works and to articulate external control mechanisms that provide information about its efficacy and reliability, something the Secretariat of State for Security has not made public so far. In this sense, the factors and risk indicators used in each form are known, but it is not known which of them enter the weighting, nor the differentiated weight (if any) attributed to each one. The concern about its opacity is shared by the Éticas Foundation in the external audit carried out in collaboration with the Ana Bella Foundation (2022), which notes that most of the available information on VioGén comes from studies carried out by teams that participated in its development and by staff linked to the Ministry of the Interior.
This impossibility of auditing these algorithms is not only an obstacle to evaluating the accuracy of the risk analysis, which could lead to prediction errors with terrible consequences, but can also jeopardise public trust in, and the legitimacy of, the judicial system.
Algorithmic biases
With the implementation of AI systems, one of the main concerns has been the identification of hypothetical cases of algorithmic discrimination. As previously mentioned, the limited access to information makes it hard to detect possible algorithmic biases in VioGén. However, the existing studies point to two cases that could indicate this negative effect.
In the work carried out by the team of the Universitat de València (Martínez Garay et al., 2022), evidence was found suggesting that women born outside Spain are less protected, or that their protection is less effective, against both new aggressions and homicides. The overrepresentation of migrant women in VioGén is explained by the fact that they suffer more violence and report more often than Spanish women. However, this overrepresentation is not homogeneous: it increases among inactive cases (which could be related to tourism or temporary residence in Spain for work) and in the low and unrated risk levels (which could constitute indirect discrimination, requiring an in-depth study of the especially vulnerable situation of migrant women, particularly those in an irregular administrative situation).
For its part, Éticas found that not having children entails a meaningful decrease in the perception of extreme risk. As they report, ‘‘women who were murdered by their partners and did not have children were systematically assigned a lower risk level than those who did have children, with a difference in sensitivity between the two of 44%’’ (Éticas Foundation, 2022, 33).
These two examples constitute a clear warning about possible prediction errors that require immediate attention, bearing in mind the purpose of the system, which is none other than protecting the lives of women who are victims of gender-based violence.
Personal biases
When the existence of biases and their possible influence on the final result are analysed, the human variable must be taken into account. In the case of VioGén, the human impact can appear at the moment of filling out the form, because of a lack of information sources or an incorrect evaluation by the police of the circumstances surrounding the victim.
Instruction 4/2019 of the Secretariat of State for Security includes a section on ‘‘Communication skills for collecting data to inform the risk indicators’’ that offers some interesting considerations in this regard. On the one hand, although in most cases only the victim will be interviewed, the police should corroborate the information using other sources (police officers who intervened previously, the perpetrator, neighbours, witnesses, family members, and even technical reports). On the other hand, the Secretariat itself highlights the possible interference of stereotypes about the good victim in the police estimation: ‘‘we must stress that there is no stereotypical and genuine reaction of a ‘‘true’’ victim. Thus, beyond existing myths, each victim can react in a different way during the police proceedings and also when giving their account of the aggressions’’; and it directly addresses the possible introduction of biases by the police when stressing that ‘‘it is more important to let the woman start talking freely and spontaneously, avoiding the introduction of biases’’.
In the work carried out by Éticas (2022), more than 80% of the women interviewed reported various problems with the VioGén form, which directly calls into question the quality of the input data and invites a revision of the data-generation process that feeds the algorithmic system, since it can be a source of biases. We must remember that Instruction 4/2019 of the Secretariat of State for Security rejects the idea of filling out the form as if it were a simple questionnaire. A favourable atmosphere should be ensured so that the woman can recount the events in a setting of accompaniment. This can be achieved if an intersectional gender perspective is adopted, which requires specialised training.
Automation bias
Even though VioGén does not currently incorporate AI to carry out the risk evaluations, it is possible to identify a bias that the AI Act addresses in Article 14.4(b): the well-known automation bias, the tendency to automatically rely, or over-rely, on the output of a high-risk AI system.
In the case of VioGén, 95% of police officers decided not to modify the suggested risk rating (González Álvarez et al., 2018). In other words, only 5% modified the obtained result upwards. This tendency to accept the risk that VioGén provides may not only be evidence of delegating decision-making to an algorithm, but also raises certain questions in terms of responsibility. We must remember that the judicial body is not bound by the VioGén result when adopting a precautionary measure, which is why the National Court declined to attribute this error to the algorithm, given the prerogatives of human supervision.
However, automation bias prompts other reflections concerning possible procedural safeguards (for example, the defendant requesting access to the VioGén result to challenge the imposed precautionary measures) or the reinforced justification that could be required from the judicial body when it diverges from the risk offered by VioGén (focusing on judicial reasoning, currently based on heuristics and other contextual factors that shape the human black box, but also on the fulfilment of the obligations to give reasoned decisions and to guarantee impartiality) (Simó Soler and Rosso, 2022; Montesinos García, 2021).
Precision in prediction
One last critical element to consider is predictive precision. VioGén is designed to maximise sensitivity and detect the greatest possible number of cases in which the victim could be attacked again. This has led to a great number of false positives. As Martínez Garay et al. (2022) note, less than 20% of women suffered a second aggression when the estimated risk was average, high or extreme. The remaining question is whether this result is caused by a malfunction of VioGén. The answer is not simple, among other reasons because the opacity of the system does not allow these points to be thoroughly checked. The overestimation of risk could indeed be a consequence of the system's own design, but the hypothesis that the protection measures adopted are effective and prevent revictimisation must also be considered.
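The trade-off described in this paragraph is a standard one in risk assessment and can be illustrated with elementary arithmetic. The figures below are entirely made up for illustration; only the finding that fewer than 20% of women with an average, high or extreme rating suffered a second aggression comes from the cited study.

```python
# Illustrative confusion-matrix arithmetic (hypothetical numbers) showing
# how a system tuned to maximise sensitivity can still produce many
# false positives, i.e. low precision among the flagged cases.

def rates(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Return (sensitivity, precision) for a binary confusion matrix."""
    sensitivity = tp / (tp + fn)  # share of actual reoffenders flagged
    precision = tp / (tp + fp)    # share of flagged cases that reoffend
    return sensitivity, precision

# Hypothetical cohort: 1000 cases, 150 actual reoffenders.
# The system flags 660 cases and catches 140 of the 150 reoffenders.
sens, prec = rates(tp=140, fp=520, fn=10, tn=330)
print(f"sensitivity = {sens:.2f}")  # high: almost all reoffenders caught
print(f"precision   = {prec:.2f}")  # low: about 1 in 5 flagged cases reoffends
```

The example shows why a sub-20% reoffending rate among flagged cases is not, by itself, proof of malfunction: it is the expected cost of prioritising sensitivity, and it is further confounded by the protective effect of the measures the flag itself triggers.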
It is a topic of discussion whether the increased use of VioGén by the Integral Forensic Evaluation Units, to complement the forensic evaluation, could reduce the number of false positives, given that these units have more information thanks to their specialised training, which allows them to evaluate other factors such as the presence of mental disorders, to interview the aggressor, and to interview the victim in a more relaxed atmosphere, temporally distanced from the aggression. However, the first results did not point in this direction, because the risk either stays the same or rises to a higher level. Even so, although the change is not the expected one, its very existence indicates the need for interdisciplinary teams in an integral evaluation of risk, which benefits not only the victim, ensuring her correct protection, but also the aggressor, since the effect of false positives in court could be harmful to his rights if custodial measures were imposed (Martínez Garay et al., 2022).
Conclusions
The use of advanced technologies in justice, as the VioGén system shows, has opened a new era in the protection of victims of gender-based violence. However, although these technologies offer powerful tools to prevent and manage risks, it is essential to improve the transparency of the algorithms, promote interdisciplinary supervision and ensure that technological advances do not jeopardise the rights of victims and aggressors.
Its application to criminal justice aims to eliminate, or at least reduce, certain dysfunctions and shortcomings that affect Justice, such as significant judicial subjectivity, by objectifying the decision-making process, and thereby to contribute to achieving the long hoped-for figure of zero victims of gender-based violence. However, we must not forget that the criminal justice system is the last link in the chain of action, with a very limited capacity to act and essentially based on individual punishment.

Assistant Professor of Procedural Law at the Universitat de València, she is a member of the research team of the project “Claves para una justicia digital y algorítmica con perspectiva de género” (PID2021-123170OB-I00), led by Prof. Ana Montesinos García, and a member of OdiseIA.