Transplants

Monaldi: learning from the chain of mistakes instead of just looking for a culprit

The judiciary will take its time, but safe care cannot wait years: a systemic audit of the donation network is needed now

by Riccardo Tartaglia*

February 2026. ANSA / CIRO FUSCO

3 min read

Translated by AI
Italian version


In the course of my professional activity, first as Director of the Clinical Risk Management Centre of the Region of Tuscany - WHO Collaborating Centre - and several times as a member of the Ministerial Crisis Unit on Sentinel Events, I have had to analyse numerous serious adverse events, both at regional and national level.

This included a transplant case that occurred in my region. On 7 February 2007, three organs from a donor who had tested positive for HIV were transplanted in two Tuscan hospitals, owing to a chain of errors in the donation process.


The transplantation of HIV-infected organs

During the evaluation of organ suitability, the positive serological test result was mistakenly transcribed as negative. That finding, without any cross-checking, went into the official documentation and the Regional Transplant Centre assigned liver and kidneys that were later found to be infected with HIV. Five days after the operations, during tissue evaluation by a second laboratory, the correct result emerged. The error was detected and the patients were immediately treated with antiretroviral therapy; appropriate compensation was subsequently arranged.

The event was the subject of a systemic analysis, first by a national commission and then by a regional one, in which I participated. What emerged was not only the human error in the manual transcription of the report: technological weaknesses were identified - in particular, the lack of integration between the laboratory analyser, the computer system and the donor record - as well as organisational ones, such as the division of checks between different laboratories, even by level of competence.

The recommendations were clear: automate the transmission of results, centralise fitness checks, overhaul all procedures, strengthen training and develop a proactive safety culture (even small or near-miss incidents must be reported and audited).

Individual responsibility

Despite this, public attention, then as now, focused almost exclusively on the individual responsibility of the health worker who had actually made the error: an esteemed, probably fatigued professional who was nonetheless embedded in a system that allowed - and thus made possible - the manual transcription of such critical data.

Thus the classic 'rotten apple theory' took hold: identify a culprit and convince oneself that removing them solves the problem, without questioning the fragilities of an organisation that, partly because of the very strong regionalisation of healthcare, still struggles to guarantee organisational and behavioural consistency precisely in its most delicate functions.

Transplant organisations are complex, highly reliable systems, but zero risk does not exist. Human error is inevitable; what makes the difference is the robustness of the technological and organisational infrastructure capable of intercepting it before it causes damage. The availability of suitable containers for transporting organs is also an essential requirement.

Today, faced with new cases involving different structures, I see the same pattern re-emerging: the immediate search for professional responsibility. The judiciary - rightly - will take its time, but the safety of care cannot wait years. A system check is needed immediately, which can only be the responsibility of the competent national bodies, first and foremost the National Transplant Centre, which is called upon to ascertain procedures, controls and authorisation requirements, with the full cooperation of the regional centres.

The absence of 'just culture'

The experience of that time taught us that the critical weaknesses were not concentrated in one or a few operators but were spread across different levels of the organisation. This is the essence of the so-called 'just culture', which arose in the safety sphere of high-risk organisations in the 1980s to overcome the limitations of a punitive culture: human errors are often the last link in a chain, individual punishment does not prevent future events, and the fear of sanctions reduces the reporting of problems. In this context, significant event audits - protected by Law 24/2017 on safety of care - are key tools because they allow in-depth analyses without punitive purposes.

Even the best surgeons and doctors, if placed in a network that is inadequately structured in technological and organisational terms, can find themselves exposed to avoidable risks. It may be unpopular to say so, but securing the system - rather than immediately looking for a culprit, a task that falls to the judiciary - is the real priority. It is what protects patients and strengthens confidence in the health service and, in particular, in the donation system, which risks being penalised by this recent event, with serious consequences for the many people waiting for an organ.

The media should contribute to this process by making the survival data of transplant centres and the good practices they have adopted to improve the quality of care transparent to the public. Comparison is one of the most effective tools for making complex systems safer.

*G. Marconi University 

Copyright reserved ©