Background: External quality assessment schemes (EQAS) in pathology have been established in the United Kingdom for several years with the aim of raising standards.
Objective: To determine the experiences and perceptions of pathologists undertaking EQAS.
Methods: A questionnaire was distributed to histo/cytopathologists in the south and west of England.
Results: A large proportion of responding pathologists felt that the EQAS was educational, and 69% said participation had encouraged them to undertake additional educational activities. Some reservations were expressed about marking schemes. Asked whether EQAS using digital images (CD-ROM or web based) rather than glass slides were valid alternatives, two thirds responded no, despite 75% claiming to have appropriate IT skills.
Conclusions: EQAS play a valuable role in helping to maintain standards in histopathology and cytopathology. Some reservations were expressed about the marking schemes and further work is needed to establish a robust marking method. Significant barriers need to be overcome if digital EQAS are to be successfully implemented.
- EQAS, external quality assessment scheme
- NHSBSP, NHS breast screening programme
- external quality assurance
- performance assessment
Pathologists play a central role in delivering healthcare. In the hospital setting the single most important function of a histopathologist is the interpretation and diagnosis of tissue biopsies or organs removed from patients. To ensure a high quality service, quality assurance programmes have been developed. In the United Kingdom one of the first histopathology EQAS (external quality assessment schemes) was developed in 1990 for the NHS breast screening programme (NHSBSP). This is probably the largest scheme, with slides circulated to over 400 participants (there being approximately 1200 histo/cytopathologists in the UK). The primary objective of this EQAS (and many others subsequently established) was to improve the consistency of diagnoses and the quality of prognostic information provided by pathologists. It was envisaged this would be achieved primarily through a process of self education.
Initially most EQAS were voluntary (although, uniquely, participation in the NHSBSP EQA has been mandatory from the outset for pathologists involved in dealing with breast screening specimens). However, EQAS are starting to become compulsory and in future will be required by the General Medical Council (the regulatory body for UK doctors) as part of a consultant’s portfolio to comply with revalidation requirements.1 EQAS are moving to a role in monitoring performance. Starting in 2005 the NHSBSP EQAS will be implementing action points when a pathologist fails to meet the minimum standard. There are now EQAS (using slides circulated to pathologists) in most histopathology subspecialties in the UK.
The aim of the present paper was to summarise the experience of a cohort of pathologists who have participated in EQAS (in some cases for up to 15 years). Knowledge of the experience and perceptions of pathologists who participate in these schemes will help shape the future development of EQAS in this and other countries.
METHODS
A questionnaire was sent to 90 consultant histopathologists in the south west of England region during the winter of 2004/5. Consultants were given the opportunity to complete the questionnaire anonymously if they wished. Responses, in tick boxes, were asked for on a Yes/No/Don’t Know basis or on a five-point scale of strongly agree/agree/neutral/disagree/strongly disagree with a given statement. After each question there was space to add textual comments.
In all, 61 responses were returned completed, and data were recorded onto a Microsoft Access database and Excel spreadsheet.
RESULTS AND DISCUSSION
A large proportion of questionnaires sent out were completed and returned (68%). The findings indicated that individual pathologists were registered to undertake a variable number of different EQAS slide schemes, as shown in fig 1. Some pathologists participated in only one scheme, while one pathologist participated in six. Most undertook between two and four EQAS. The vast majority of histopathologists contacted (58 of 61) participated in the regional “general histopathology slide scheme” and 52 pathologists participated in the national NHSBSP scheme. The gynaecological cytology EQAS had 29 participants, and the gynaecological histology scheme, 12 participants. Other national slide schemes—for example, renal, orthopaedic, ophthalmic, gastrointestinal, melanoma, liver, and so on—each had fewer than 10 participants from this region.
Most schemes have circulations every six to nine months. When asked if individuals have sufficient time in their job plans to participate in EQAS, 54% of respondents stated they did not (compared with 41% having sufficient time and 5% not responding to the question), with some pathologists commenting that they had to “squeeze in” the EQA or “just push it in at the last minute” in the working day.
EQAS were originally promoted as being educational and it is encouraging that 46% of pathologists agreed that the scored (assessed) slides in EQAS were of educational value (fig 2). Furthermore, asked if participation in EQAS had encouraged them to undertake additional educational activities that they would not otherwise have done, 69% replied yes (31% stated no).
Among those who thought that the assessed cases were not educational, several pathologists gave the reason that the cases seen in EQAS reflected routine practice and as such would not generally be expected to be educational. Some EQAS specifically contain a few non-assessed slides for educational purposes, and 77% of respondents found these educational.
A key issue for pathologists is whether EQAS are valid tools for the performance assessment of pathologists. Direct evidence to support the validity of EQAS in histopathology is lacking. As a result the introduction of EQAS elicited significant debate in the medical press (for a review, see Parham2). In particular there has been controversy over how schemes are marked and how the cut points for identifying satisfactory or poor performance are determined. In the late 1990s, the Department of Health, in conjunction with the Royal College of Pathologists and the Association of Clinical Pathologists, established a working group to make recommendations for histopathology and cytopathology EQAS.3 The report does not recommend a single minimum score and is not prescriptive about the type of scoring system to be employed. Different methods of scoring have been proposed and are used by different schemes owing to variations in case mix and case difficulty.4 A particular problem for histopathology EQAS is the definition of the correct diagnosis. For most schemes, when a case achieves 80% consensus or more with regard to the diagnosis among participating pathologists, this is regarded as the correct diagnosis. These cases are then used to determine the performance scores of participants, and cases with less than 80% agreement are rejected from the analysis. Despite these difficulties in determining the correct diagnosis, pathologists agree that “in general, the scores obtained reflect the performance of pathologists” (fig 3).
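The consensus-based scoring described above can be sketched in code. This is a minimal illustration, not the actual algorithm used by any named scheme: the data structures and function names are assumptions, and real schemes may weight cases or handle partial credit differently.

```python
from collections import Counter

CONSENSUS_THRESHOLD = 0.8  # 80% agreement defines the "correct" diagnosis (assumed, per text)

def consensus_diagnosis(diagnoses):
    """Return the majority diagnosis if >=80% of participants agree, else None."""
    diagnosis, n = Counter(diagnoses).most_common(1)[0]
    return diagnosis if n / len(diagnoses) >= CONSENSUS_THRESHOLD else None

def participant_scores(responses):
    """responses: {case_id: {participant_id: diagnosis}}.
    Cases failing to reach consensus are rejected from the analysis."""
    scored_cases = {}
    for case_id, by_participant in responses.items():
        consensus = consensus_diagnosis(list(by_participant.values()))
        if consensus is not None:
            scored_cases[case_id] = consensus
    participants = {p for case in responses.values() for p in case}
    scores = {}
    for p in participants:
        correct = sum(1 for case_id, consensus in scored_cases.items()
                      if responses[case_id].get(p) == consensus)
        scores[p] = correct / len(scored_cases) if scored_cases else None
    return scores
```

Note that a case on which participants split (say 60/40) contributes nothing to anyone's score under this rule, which is one reason, as discussed above, that such schemes measure conformity to consensus rather than individual expertise.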
A few pathologists felt reassured, in that their scores gave them confidence that their performance was within acceptable standards. Indeed, before EQAS there was no mechanism for pathologists to receive routine feedback on their performance. Several pathologists, however, commented that the scores really only reflect performance in the EQAS and do not necessarily reflect performance in routine work. Some pathologists may take extra care when undertaking an EQAS compared with their routine work, or may only give each case a cursory examination. No pathologist admitted to taking extra care when undertaking EQAS. However, a few commented that there is often too little time to undertake EQAS, so that they tended to rush through the cases. It was also commented that EQAS are undertaken in isolation, whereas in practice pathologists often work in a collegiate way, and that the EQAS does not assess how one handles difficult cases. One pathologist felt that the scoring system could allow a mediocre pathologist to score as well as an excellent one like him/herself! This would not be unexpected if only cases obtaining 80% consensus for a correct diagnosis are used in the performance analysis. Thus histopathology EQAS define common minimum levels of attainment (conformity to consensus) rather than measuring the expertise of an individual.
In the United Kingdom, for most EQAS the first action point for poor performance is triggered when an individual scores in the bottom 2.5%, after placing the scores in rank order, in two of three consecutive EQAS circulations.5 At this stage action is supportive, educational, and advisory. If in two of three subsequent EQAS circulations, the individual remains in the bottom 2.5%, the second action point is triggered and there is a formal investigation by the EQAS organisers.
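The action-point rule described above can be expressed as a short sketch. This is an illustrative reading of the rule only; the exact ranking, tie-breaking, and window handling used by any given scheme are assumptions here.

```python
def bottom_2_5_percent(scores):
    """Return the participants ranked in the bottom 2.5% by score.
    Assumption: at least one participant is always flagged per circulation."""
    ranked = sorted(scores, key=scores.get)  # ascending: worst scores first
    cutoff = max(1, round(len(ranked) * 0.025))
    return set(ranked[:cutoff])

def action_point_triggered(history):
    """history: one boolean per circulation (True = in the bottom 2.5%).
    The first action point is triggered by being flagged in two of
    three consecutive circulations."""
    return any(sum(history[i:i + 3]) >= 2
               for i in range(max(1, len(history) - 2)))
```

For example, with 400 NHSBSP participants this rule would flag roughly the lowest-ranked 10 in each circulation; a pathologist flagged in the first and third of three consecutive circulations would reach the first (supportive and educational) action point.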
When asked “Do you have confidence in the definition of poor performance and persistent poor performance?”, the distribution of responses was as shown in fig 4. Almost half the pathologists neither agreed nor disagreed with this statement, giving a neutral response, but 23% did not have any confidence in the definition of poor performance. Some pathologists commented that the definition was a statistical measure and not a meaningful one, that the definition had not been critically examined, and that the EQAS was based on the assumption that there would be poor performers. One commented that the whole process was suspect. Indeed, direct evidence to support the validity of the cut points used is lacking. Furthermore, 80% of respondents felt that (non-pathologist) managers did not have an understanding of the limitations of EQAS.
Despite this, the majority of pathologists felt that EQAS could identify poorly performing pathologists (fig 5), although one cynical pathologist recalled that one may drive well during a driving test yet drive differently having passed the test! Concern was also raised by a few that the system was not robust in that some poor performers could be deemed competent while some competent pathologists could be labelled poor performers.
Nevertheless a significant number of pathologists still have concerns about the ability of EQAS to identify poor performers. There clearly needs to be additional work to convince pathologists about the validity of EQAS in defining good and poor performance. Further work is currently being undertaken to develop a more robust scoring system: a new method of defining cut points based on the performance of the middle majority has been proposed, and this is currently being examined in the context of the NHS Breast Screening EQAS.6
Circulating glass slides for EQAS to participating pathologists is resource intensive, being administratively and organisationally time consuming. With the increasing availability of digital technology it is feasible to store and distribute histological images electronically on CD-ROM or over the internet. Indeed, CD-ROM and web based EQAS have been piloted in recent years by the NHSBSP. In this questionnaire pathologists were asked if they considered EQAS using digital images (CD-ROM or web based) to be a valid method for assessing a pathologist’s practice; 66% responded no (with 21% don’t know and only 13% yes). This was despite 75% claiming to have appropriate information technology skills to enable them to participate in EQAS through the internet and on CD-ROM. The principal concern expressed by pathologists was that making a diagnosis on digital images does not reflect how they work routinely (that is, with glass slides and a microscope) and therefore does not mirror real life. This is clearly a barrier to the future development of EQAS by electronic means. This may change over time as pathologists become used to examining cases on computer screens (for example, with the increasing use of educational material distributed in a digital format). Other potential barriers to the uptake of digital EQAS include 13% not having access to appropriate hardware and 11% being unable to receive images owing to a hospital firewall and so on.
This study provides insight into the perceptions and experiences of pathologists and will assist those organising EQAS. It is reassuring that EQAS were shown to have achieved their original aim, and that the majority of pathologists found histopathology EQAS of educational value. Furthermore, for some pathologists the scores obtained provided encouraging feedback on their performance. There is doubt, however, among some pathologists about the definition of poor performance and how robust schemes are at identifying persistent poor performers. This is an area that clearly requires further work. As pathologists become more IT literate and computers more powerful, there is an opportunity to undertake EQAS by digital means. However, it is likely that pathologists will be reluctant to undertake such schemes, as they do not reflect current practice and are not considered valid. These remain future barriers to be addressed.
I wish to thank the pathologists in the S&W of England who took the time to complete and return the questionnaires.
Conflict of interest: The author is QA coordinator for the NHS Breast Programme (Pathology) in the S&W of England.