Clinical digital neuropathology: experience and observations from a departmental digital pathology training programme, validation and deployment
Bethany Jill Williams1,2, Azzam Ismail1,2, Arundhati Chakrabarty1,2, Darren Treanor1,2

1 Histopathology, Leeds Teaching Hospitals NHS Trust, Leeds, UK
2 Faculty of Medicine and Health, University of Leeds, Leeds, UK

Correspondence to Dr Bethany Jill Williams, Histopathology, Leeds Teaching Hospitals NHS Trust, Leeds LS9 7TF, UK; bethany.williams2@nhs.net

Abstract

Aim To train and individually validate two specialist neuropathologists in digital primary diagnosis and frozen section reporting using a novel protocol endorsed by the Royal College of Pathologists. The protocol allows early exposure to live digital reporting in a risk-mitigated environment.

Methods Two specialist neuropathologists completed training in the use of a digital microscopy system for primary neuropathological diagnosis and frozen section assessment. Participants were exposed to training sets of 20 histology cases and 10 frozen sections designed to help them identify their personal digital diagnostic pitfalls. Following this, the pathologists viewed 340 live, complete neuropathology cases. All primary diagnoses were made on digital slides with immediate glass slide reconciliation before final case sign-out.

Results There was 100% clinical concordance between the digital slide and glass slide assessment of frozen section cases for each pathologist, and these assessments corresponded with the ground truth diagnoses obtained from examination of definitive histology. For primary diagnosis, there was complete clinical concordance between digital slide and glass slide diagnosis in 98.1% of cases. The majority of discordances were related to grading differences attributable to mitotic count differences.

Conclusion Neuropathologists can develop the ability to make primary digital diagnosis competently and confidently following a course of individual training and validation.

  • neuropathology
  • digital pathology
  • education
  • information technology


Introduction

Digital pathology

Digital pathology is a technology poised to revolutionise the way in which diagnostic histopathology services are delivered. Whole-slide imaging (WSI) systems are used to capture, transmit and store digital images of glass histology slides, which can then be viewed and assessed by a pathologist on a computer screen. The key benefit of digital pathology over conventional glass slide light microscopy is the flexibility and transferability of digital slides. Digital slides can be read by multiple viewers at multiple locations, thus facilitating remote consultations and streamlining workflows.1 Digital slides are invaluable in a number of applications, including frozen section diagnosis, multidisciplinary team meetings, quality assurance programmes and education.

Digital pathology in primary histopathological diagnosis

Digital pathology is used routinely for primary diagnosis of histopathological slides in a relatively limited number of institutions at present.

Historically, one of the key barriers to digital pathology adoption has been concern about the evidence base for digital diagnosis and implications for patient safety.2 A recent systematic review of the diagnostic concordance of WSI and conventional light microscopy (LM) analysed data from 38 quality-assessed concordance studies and demonstrated a mean diagnostic concordance of WSI and LM of 92.4%.3 This compares favourably with the 93.4% concordance rate for repeat reads of the same case using LM in those studies that quoted it (n=10). Overall, the review found evidence to support a high level of diagnostic concordance for WSI. A further, more detailed systematic analysis of diagnostic discordance between WSI and LM reads of the same cases found 335 instances of discordant diagnosis out of 8069 documented instances of WSI and LM diagnostic comparison (4%).4 Further examination of these discordances revealed that the majority were of no clinical significance, and reflected diagnostic scenarios in which interobserver and intraobserver variation are common, such as the difference between two adjacent tumour grades. A number of potential ‘pitfalls’ for digital diagnosis were observed, including the identification of mitotic figures, the detection and grading of focal dysplasia, and the location of small diagnostic objects in large tissue searches (eg, micrometastasis detection).

Successful regulatory clearance of two WSI scanners for primary diagnosis (Leica,5 Philips6) has resulted in an acceleration of clinical digital pathology deployment, particularly in the USA, and recent publications have addressed some of the more practical and pragmatic considerations of mass clinical deployment, including guidance on how to achieve ISO-15189 laboratory accreditation for a digital diagnostic service.7

Digital pathology training and validation

Until recently, there was very little guidance available for clinical pathologists who wish to train and validate their own use of digital pathology. The College of American Pathologists digital pathology validation guideline of 20138 recommends that all departments adopting WSI for diagnosis should conduct an intraobserver concordance study, comparing digital slide and glass slide diagnosis for a minimum of 60 specimens, with a washout period of at least 2 weeks between reads. This methodology provides a sound baseline validation that a departmental WSI system is able to produce diagnostic quality images, but may not be enough to convince a sceptical pathologist of their own ability to use WSI confidently and competently. Recent guidance published by the Royal College of Pathologists9 advocates training and validation in digital diagnosis at the level of the individual pathologist, in a real-world reporting environment. The guidance document includes a protocol for digital pathology diagnostic validation, based on a direct comparison technique to facilitate learning and enable the pathologist to gain confidence via risk-mitigated live reporting.

Digital neuropathology

As arguably one of the most specialised diagnostic disciplines, neuropathology stands to benefit a great deal from digitisation. Networked digital pathology systems allow more flexibility in who reports what and where, and can help ensure that complex histology slides are transferred to a suitably experienced neuropathologist for frozen section analysis, primary diagnosis or secondary opinion instantaneously, regardless of the geographical location of the specimen.

There are few data regarding diagnostic safety in digital neuropathology, as the majority of published validation studies have excluded neuropathology specimens or only included them in small numbers.3 One multispecialty study identified a single major discrepancy between glass and digital diagnoses, which related to a neuropathology case.10 A pilot digital neuropathology study11 found individual digital:glass concordance rates for two neuropathologists of 94.9% and 88%, and identified two common causes of glass:digital discrepancy: identification of mitoses and assessment of nuclear detail.

Following a successful pilot in primary digital diagnosis of breast histopathology, Leeds Teaching Hospitals NHS Trust decided to initiate training and validation in primary neuropathological diagnosis and frozen section evaluation. In this paper, we present our experience implementing the best practice validation protocol, and our opinions on the benefits and challenges of the use of digital slides in clinical neuropathology practice.

Methods

The study was performed in the histopathology department of St James’s University Hospital, Leeds, UK, a major NHS cancer centre, with a fully subspecialised diagnostic department. The on-site laboratory processes in the region of 250 000 H&E-stained slides per year. Leeds provides neuropathology and ophthalmic pathology services to the West Yorkshire region, encompassing a population of 2.2 million, and includes multidisciplinary team meetings for adult, young adult and paediatric central nervous system tumours, and adult and paediatric neurology. The department receives approximately 2750 brain, ophthalmic, nerve and muscle specimens per annum, and an additional 300 frozen sections. The department’s two specialist neuropathologists, with combined consultant experience of 34 years (20 years and 14 years), were recruited to train in digital diagnosis, and validate their frozen section and primary diagnostic practice using a digital pathology system.

Primary diagnostic validation

All neuropathology glass slides, including H&E, immunohistochemistry and special stains, were scanned prior to laboratory sign-out, before distribution to participating pathologists. All slides were scanned on one of six Aperio AT2 scanners (Leica, Vista, CA, USA). Standard H&E and special stains were scanned at ×40 equivalent magnification, while immunohistochemistry was captured at ×20 equivalent magnification.
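For illustration only, the stain-dependent scanning rule described above can be expressed as a simple lookup, as in the sketch below; the mapping and helper function are hypothetical and are not part of the Aperio scanner software.

```python
# Illustrative sketch (not vendor software): the stain-dependent scanning
# magnifications described in the text, expressed as a simple lookup.
SCAN_MAGNIFICATION = {
    "H&E": 40,       # standard H&E scanned at x40 equivalent
    "special": 40,   # special stains scanned at x40 equivalent
    "IHC": 20,       # immunohistochemistry captured at x20 equivalent
}

def magnification_for(stain_type: str) -> int:
    """Return the scanning magnification (x equivalent) for a stain type."""
    return SCAN_MAGNIFICATION[stain_type]

print(magnification_for("IHC"))  # 20
```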

Automated tissue detection and scanning point placement provided by the scanner were used and quality checked by the scanner operator (a trained biomedical scientist) as per departmental protocol. The diagnostic images were stored in a remote digital archive and retrieved with e-Slide Manager software (Leica). The scanner operator performed a final quality control check on the captured images to detect scanning artefact and major focusing issues. Images were viewed by consultant neuropathologists using the Leeds Virtual Microscope slide viewing software (University of Leeds and Leeds Teaching Hospitals NHS Trust, UK) on medical grade 8 MP screens (Eizo, Hakusan, Japan). Figure 1 depicts one of our neuropathologists at work on their digital workstation during their validation.

Figure 1

Neuropathologist at Leeds Teaching Hospitals NHS Trust assessing a case on the digital microscope during the live phase of their validation.

The validation protocol for primary diagnosis is available as an appendix to the Royal College of Pathologists’ guidance for digital pathology, where it is highlighted as an example of best practice in training and validation. Validation consists of four phases: a training phase (T), a validation training set phase (V1), a live reporting validation phase (V2) and a summary phase (S). See table 1 for an overview of the validation procedure.

Table 1

Summary of validation phases

Training phase

The training phase (T) consisted of an hour-long individual session covering basic digital pathology skills, including use of the image management software (e-Slide Manager) and the viewing software (Leeds Virtual Microscope). Participants were observed opening and navigating cases, and were given feedback regarding their use of input modalities (gaming mouse ergonomics and use of keyboard shortcuts). Participants were able to request additional training as required, and were provided with user manuals for the software and standard operating procedures for the validation protocol and for departmental digital reporting.

Validation 1—training set (V1)

In V1, each pathologist received a training pack consisting of a set of 20 challenging and educational neuropathology cases, all presented in both digital slide and glass slide formats. The training set was designed to encompass a broad range of diagnoses and tissue types, and to expose the pathologist to types of case that might be problematic to a novice digital diagnostician. The cases selected are documented in table 2. All cases/specimen types selected were relevant to departmental practice, and the challenging cases were chosen based on a review of the relevant discordance literature for neuropathology.4

Table 2

Validation training case set for primary diagnosis

Participants were allowed to take as long as they needed to complete the training set comfortably. Pathologists were asked to view the digital slides first for each case, recording both their diagnosis and their confidence in that diagnosis (on a Likert scale from 1 to 7, where 1 corresponded to not at all confident and 7 to very confident) in a workbook, which also contained the relevant clinical details for the case. Pathologists then viewed the glass slides for the case immediately after the digital read, and recorded any alteration in their assessment of the case, as well as their confidence in their glass slide diagnosis.
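As a minimal sketch, the paired digital-then-glass workbook record described above might be captured as follows; the dataclass and field names are hypothetical, chosen only to mirror the fields recorded (a diagnosis and a 1-7 Likert confidence for each modality, plus any alteration on glass review).

```python
# Hypothetical sketch of the per-case training workbook record; the class
# and field names are illustrative only, not the actual study workbook.
from dataclasses import dataclass

@dataclass
class CaseRead:
    case_id: str
    digital_diagnosis: str
    digital_confidence: int   # Likert 1 (not at all confident) to 7 (very confident)
    glass_diagnosis: str
    glass_confidence: int     # Likert 1 to 7, recorded after immediate glass review
    diagnosis_altered: bool   # any alteration in assessment on glass review

def concordance_rate(reads: list[CaseRead]) -> float:
    """Proportion of cases with no alteration between digital and glass reads."""
    return sum(not r.diagnosis_altered for r in reads) / len(reads)
```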

Following completion of the training set by both participants, the results were discussed in a group with the trainer, and all participants reviewed cases that had caused difficulty. Pathologists identified the types of case they found problematic on digital slides and progressed to the next phase, V2, armed with this information.

Validation 2—live cases (V2)

In V2, all departmental neuropathology cases were scanned prospectively. The pathologists made their live primary diagnoses on the digital slides, and recorded the diagnoses and their diagnostic confidence on an Excel spreadsheet. All cases were then reviewed on glass prior to final sign-out, and any modification to the digital diagnosis was recorded, in addition to the pathologist’s confidence in the glass slide diagnosis. A record was also kept of any technical failures—for example, out-of-focus regions on slides or the presence of digital striping artefact.

When each pathologist had viewed approximately 2 months’ whole-time-equivalent workload (estimated at 150 histology cases on the basis of departmental data), their diagnostic spreadsheet was analysed, and concordance and discordance data were summarised (summary phase—S). These data were discussed with the participant, and the scope of that pathologist’s future digital pathology practice was agreed on.
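As a hedged sketch of how a summary-phase (S) analysis of such a spreadsheet might be performed: the file name and column names below (pathologist, case_id, concordant, digital_confidence, glass_confidence) are assumptions for illustration, not the headings actually used in the study.

```python
# Hedged sketch of a summary-phase (S) concordance analysis; the file and
# column names are assumed for illustration only.
import pandas as pd

reads = pd.read_excel("v2_live_reporting.xlsx")  # one row per live-reported case

summary = reads.groupby("pathologist").agg(
    cases=("case_id", "count"),
    concordance_rate=("concordant", "mean"),               # proportion of concordant cases
    mean_digital_confidence=("digital_confidence", "mean"),
    mean_glass_confidence=("glass_confidence", "mean"),
)
print(summary.round(3))
```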

Frozen section assessment

The aim of the frozen section is to provide a differential diagnosis and feedback to the surgeon, rather than a definitive diagnosis. Given the relatively poor quality of frozen section histology compared with routinely processed and sectioned histology slides, we devised a simplified frozen section training programme. Ten frozen sections were selected from the glass slide archive and scanned using an Aperio CS2 scanner (Leica). This low-throughput scanner was chosen for frozen sections because slides can be loaded and scanned without interrupting scanning programs on the larger, high-throughput scanners. All slides were scanned at ×40 equivalent magnification, and tissue detection software was not employed, so the entire scannable area of each glass slide was scanned, ensuring that all tissue, however dispersed on the glass slide, was represented on the digital slide.

The 10 cases were selected to represent commonly encountered frozen section scenarios in our department. Each pathologist was provided with the digital slides for each case, which were presented alongside all relevant clinical information available to the original reporting pathologist. The cases selected can be viewed in table 3. The pathologist was asked to make their frozen section assessment on the digital slides, record this and then immediately compare the digital slides with the glass slides for the same case, documenting any change in their assessment or their confidence in their report. Diagnostic confidence was measured using a 7-point Likert scale for both digital and glass slide reads.

Table 3

Frozen section training set

Results

Primary diagnosis

Validation 1—primary diagnostic training set (V1)

Each participant viewed the same training set of 20 neuropathology cases on digital slides and glass slides. The diagnostic concordance between digital and glass slide reads was 85% (17/20) for both participants. The discordances encountered are described in table 4, and frequently concerned mitotic figure detection and grading. In all cases of discordance, group review of the glass slides confirmed that these held the ground truth diagnosis.

Table 4

Discordant cases from the training phase of validation for primary diagnosis

Primary diagnosis validation 2—live cases (V2)

The participants viewed 340 complete neuropathology cases between them. The cases were representative of the specimen type and diagnostic category mix found in the departmental neuropathology workload, and included diagnostic biopsies and excisions of brain, muscle, nerve and ophthalmic specimens.

The pathologists had to defer full digital assessment in 16 cases due to quality issues. These instances all related to muscle biopsies, and in all cases the H&E slides were assessable on digital, but the crucial Gomori and ATPase stains were unreadable (figure 2). When these cases are excluded from the total, there was complete clinical concordance between the glass slide and digital slide reads in 98.1% of cases (318/324). Only 1.8% of cases showed a clinically significant difference between digital and glass slide reads, with the potential to affect diagnosis or prognosis. See table 5 for a breakdown of concordance statistics for the two pathologists.

Table 5

Live reporting validation statistics

Figure 2

Example of a Gomori-stained digital slide which pathologists found difficult to interpret.

All discordant cases were reviewed on glass and digital by the participant and the trainer. In all cases, the glass slides were judged to hold the ground truth. Clinically significant discordances concerned identification of mitotic figures and confident identification of malignant lymphoid proliferations (see table 6 for a summary of all discordances, and figure 3 for an example).

Table 6

Discordant cases from the live reporting phase of validation (V2)

Figure 3

Example of a digital slide where pathologists reported difficulty in assessing the mitotic count.

Diagnostic confidence and diagnostic modality preference

The mean diagnostic confidence, on a Likert scale from 1 to 7, was similar for each pathologist for digital slides and for glass slides (table 7), although the range of diagnostic confidence varied between digital and glass. Both pathologists identified a proportion of cases (4% for pathologist 1, 3% for pathologist 2) in which they clearly preferred the glass slide presentation of the case. These cases all involved borderline mitotic counts.

Table 7

Pathologists’ diagnostic preferences

Frozen section diagnosis

There was 100% clinical concordance between the digital slide and glass slide assessment of frozen section cases for each pathologist, and these assessments corresponded with the ground truth diagnoses obtained from examination of definitive histology. Pathologists demonstrated equal confidence in their digital and glass slide assessments of frozen sections.

Discussion

Digital pathology is a transformative technology, with the potential to revolutionise the way in which neuropathology services are delivered. Digitisation of slides allows for rapid transferability, enabling the establishment of robust, efficient diagnostic networks for intraoperative diagnosis and consultations. In addition to streamlining diagnosis and referral, remote reporting of scanned slides could allow more equitable access to specialised neuropathological opinion. It is also likely that digitisation of the specialty could aid recruitment and retention of neuropathologists, by supporting flexible and remote working.

This study documents the first use of a Royal College of Pathologists’ approved validation and training protocol9 for the diagnosis of neuropathological specimens. This approach is focused on the training needs of the individual pathologist and is competence driven rather than target driven. Two specialist neuropathologists viewed 340 complete neuropathological cases, including H&E, immunohistochemistry and special stains. Complete clinical concordance was observed in 98.1% of cases, indicating excellent agreement between digital primary diagnosis and glass slide assessment. This statistic is similar to the published validation findings using the same protocol for breast histopathology.12 Our findings suggest that suitably trained and validated pathologists can competently and confidently use digital pathology for standard primary neuropathology reporting practice.

Our pathologists reported a number of key benefits to digital reporting, including

  • Instantaneous access to previous biopsies in the digital archive for comparison with new metastases and ability to compare these directly on screen.

  • Greater efficiency assessing multislide cases, especially cases with large immunohistochemistry panels.

  • Easier navigation between small pieces of tissue on a slide for fragmented specimens.

  • More efficient preparation and selection of cases for multidisciplinary team meetings (MDTMs) and tumour boards.

  • More secure, convenient MDTM and tumour boards (negating the need to physically transport glass slides from the histopathology department to the MDT suite in a separate institution).

  • Enhanced training experience for junior pathologists and trainees. Pathology trainees can be directed towards neuropathology cases with optimum educational value in the digital slide archive, facilitating more equitable distribution of training cases between a training group and allowing the trainer to personalise cases to the needs of the trainee.

  • Use of digital slides by the consultant histopathologist frees the glass slides for students, who can study them, or the digital slides, at leisure, without compromising turnaround times for the patient by delaying definitive diagnosis.

  • Review of digital slides allows for a more engaging teaching experience and allows a single pathologist to interact with a group of trainees, gathered around a screen, without the need for a multiheaded microscope.

In the course of their validation procedure, our pathologists identified key areas of reporting they found more difficult on digital slides, particularly in the early stages of the validation, and both noted a ‘learning curve’ for mitotic figure detection in particular. We propose two causes for this: first, less contrast between chromatin and the nuclear background on digital slides, rendering the nuclei dark and difficult to interpret; and second, the inability to adjust the fine focus over potential mitotic figures. In our experience, our validation procedure of direct comparison of digital and glass slide images allowed our pathologists to reconcile the appearance of scanned mitoses with the glass slide image, and they soon gained confidence in digital mitotic scoring. Given the initial difficulty, and the importance of mitotic scoring in accurate tumour grading, the group decided that, post-validation, any cases with ‘borderline’ mitotic counts should be reviewed on glass before sign-out, to ensure maintenance of diagnostic quality. In the future, the use of image analysis software could support the work of the pathologist by providing rapid, reproducible mitotic scoring for scanned digital pathology slides.

Since completion of the validation period in 2018, our neuropathologists have reported all cases on digital slides as standard, deferring to glass slides only when they wish to confirm the mitotic count in borderline lesions or where special stains are too dark for easy digital assessment.

Take home messages

  • Digital pathology can provide clinical neuropathologists with a number of quality and efficiency benefits.

  • Neuropathologists who have been trained in the use of a digital pathology system and have completed a validation procedure can perform the majority of their primary diagnostic work using the digital microscope.

  • Some special stains can be harder to interpret on digital slides, particularly for the novice user. It is important that digital pathologists have access to conventional light microscopes for cases or slides where they lack confidence in their digital interpretation.

Data availability statement

No data are available.

Ethics statements

References

Footnotes

  • Handling editor Runjan Chetty.

  • Contributors BJW and DT designed the study. BJW drafted the manuscript and collected and analysed the data. AI and AC provided feedback on study design, participated in validation, provided images and gave feedback on drafts of the manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests DT and BJW are part of the Northern Pathology Imaging Co-Operative, which is supported by Leica Biosystems and Roche Ventana. Leeds Teaching Hospitals NHS Trust has a collaborative research partnership with Leica Biosystems for digital pathology deployment.

  • Provenance and peer review Not commissioned; externally peer reviewed.