
Maintaining quality diagnosis with digital pathology: a practical guide to ISO 15189 accreditation
  1. Bethany Jill Williams1,2
  2. Chloe Knowles1
  3. Darren Treanor1,2

  1 Leeds Teaching Hospitals NHS Trust, Leeds, UK
  2 Department of Pathology, University of Leeds, Leeds, UK

  Correspondence to Dr Bethany Jill Williams, Leeds Teaching Hospitals NHS Trust, Leeds LS1 3EX, UK; bethany.williams2{at}nhs.net

Abstract

An ever-increasing number of clinical pathology departments are deploying, or planning to deploy, digital pathology systems for all or part of their diagnostic output. Digital pathology is an evolving technology, and it is important that departments uphold or improve on current standards. Leeds Teaching Hospitals NHS Trust has been scanning 100% of histology slides since September 2018, and has developed training and validation protocols to train 38 histopathology consultants in primary digital diagnosis. In this practical paper, we will share our approach to ISO inspection of our digital pathology service, which resulted in successful ISO accreditation for primary digital diagnosis. We will offer practical advice on the types of procedure and documentation that are necessary, both from the point of view of the laboratory and of your reporting pathologists. We will explore topics including risk assessment, standard operating procedures, validation and training, calibration and quality assurance, and provide a checklist of the key digital pathology components you need to consider in your inspection preparations. The continuous quest for quality and safety improvements in our practice should underpin everything we do in pathology, including our digital pathology operations. We hope this publication will make it easier for subsequent departments to successfully achieve ISO 15189 accreditation and feel confident in their digital pathology services.

  • digital pathology
  • histopathology
  • laboratory management
  • quality assurance
  • quality control

Introduction

Interest in the deployment of clinical digital pathology systems for primary diagnosis has increased dramatically in recent years, fuelled by the evolution of hardware and software solutions on the market, and by the need for pathology services to tackle ever-increasing workloads with a dwindling workforce while maintaining quality and timeliness of diagnosis.1 Many departments have either deployed scanning technology or have planned or initiated a deployment, to harness the flexibility of digital images and potentially improve service capabilities.

‘ISO 15189 Medical laboratories—Requirements for quality and competence’ is an international standard that specifies the quality management system requirements pertinent to medical laboratories.2 Successful laboratory accreditation with national bodies (including UKAS in the UK, CLIA in the USA and SWEDAC in Sweden) should reassure patients and clinicians that the staff who carry out diagnostic and prognostic tests are competent, and that the equipment and processes they use are safe and fit for purpose. The deployment and integration of digital pathology diagnostic systems in a clinical histopathology department represents a departure from standard laboratory procedures, and the scope of accreditation will have to include examination of hardware and software, calibration of tools and devices, and the training and competence of laboratory staff and diagnosticians.

Leeds Teaching Hospitals NHS Trust has a single-site histology laboratory with full diagnostic subspecialisation, with 40 pathologists reporting 80 000 cases per year (approximately 290 000 slides). An initial digital breast histopathology pilot in 20173 led to a full deployment for 100% histology slide scanning, which was achieved in September 2018. The laboratory currently scans around 1000 glass slides a day (approximately 1 terabyte of image data), which are available for primary diagnosis by trained and validated consultant pathologists. In this practical guide, we will explain our approach to ISO accreditation, which resulted in successful UKAS accreditation of primary digital diagnosis in our department. We hope sharing key information from this process will aid other departments in their clinical deployment plans and preparations for ISO inspection.

General principles

The first formal assessment for accreditation is an ‘initial assessment’, conducted by a Lead Assessor supported by technical assessors able to cover the scope of your application (including digital pathology). The assessment will involve detailed review of relevant departmental records, interviews with staff and managers, and the witnessing of key activities, which may include digital diagnosis and slide scanning.

Preparation for accreditation inspection always requires effort and exertion on the part of the laboratory. The novelty of digital pathology, and laboratories’ relative inexperience using it, can make the process even more daunting. Stress can be minimised by careful planning in the early stages of your deployment, so you can lay the groundwork for safe, responsible practice from day one. It is important to identify key individuals, both in the laboratory, and among your diagnostic staff who will take responsibility for the delivery of core aspects of the accreditation procedure, and keep regular track of progress. In this paper, the accreditation considerations for the laboratory and diagnostic department will be considered in turn.

Laboratory considerations

ISO 15189 requires validation (assurance that a system meets the needs of stakeholders) and verification (evaluation of whether a system complies with regulation, requirement and specification) for any new process or technique that has been implemented in a laboratory. For digital pathology deployment, assessors will need to view a written document, supplemented with evidence, which addresses a number of key aspects of the implementation:

  1. Change control

  2. Risk assessment

  3. Verification and acceptance

  4. Comparability and reproducibility

  5. Training and competency

  6. Uncertainty of measurement

Change control

Change control, the systematic management of all changes to a system or process, is a vital part of a digital pathology deployment, and ensures that all changes are documented, no unnecessary changes are made, resources are utilised efficiently and existing services are not unnecessarily disrupted.

A full change control procedure, complete with documentation, must be developed and adhered to if digital pathology is being implemented into the laboratory as a new process. It is essential for ISO 15189 and ensures all aspects of the implementation are assessed and managed appropriately. It allows key people to be identified to ensure appropriate stakeholder engagement and that all evidence is submitted correctly and in a timely manner. If your initial accreditation inspection raises findings that need to be addressed or resolved prior to the next assessment, it will also simplify the process of resubmitting evidence.

Key personnel to engage during your change control process might include your clinical lead, representatives of departmental management (business, operations and service), your health and safety officer, quality control manager, members of laboratory staff of all grades, and a change lead.

Table 1 summarises the key aspects to consider for digital pathology accreditation change control.

Table 1

Key aspects to consider for digital pathology accreditation change control

Risk assessment

The health and safety risks of the proposed digital pathology process need to be scoped and assessed before it can be implemented. Your department's health and safety representative will be an invaluable resource to advise and assist in carrying out risk assessments. Types of assessment include:

  • Equipment (scanners, computer workstations, etc).

  • Proposed process/workflow.

  • A general risk assessment to include the environment in which the equipment will be sited, and how laboratory staff will work safely with the equipment.

  • A display screen assessment should be carried out by all newly trained staff members to ensure they are seated correctly when using the computer workstations.

Verification and acceptance

Verification for ISO 15189 in the laboratory requires evidence to show that the scanners and software have been adequately tested for their intended use, are working as required and perform as the manufacturer states (see table 2). This includes the scanners, any software provided by the company and any databases used. A written document detailing the evaluation methods and results must be submitted as evidence. A good way of providing this is to run internal tests against the manufacturer's initial installation and verification checklist from when the scanners were first installed.

Table 2

Examples of verification and acceptance criteria for digital pathology hardware and software

Comparability and reproducibility

If multiple scanners are being utilised as part of the digital pathology system, ISO 15189 requires evidence to demonstrate that all scanners used produce images that are of equal diagnostic quality. This can be done by scanning a test set of slides on each of the scanners and asking a suitably experienced and validated digital pathologist to assess them. Suitable cases might include a malignant breast core biopsy for tumour grading, a bowel cancer screening specimen and a sentinel lymph node. Factors to consider when assessing the images would be:

  • Is the background clear?

  • Is the image in focus?

  • Is the staining crisp and clear?

  • Are the images comparable across all scanners?

  • Is there a significant difference in the interpretation of key diagnostic features in images obtained from different scanners?

Inter-laboratory assessment schemes are common for standard glass slide histology, and are likely to be adopted for digital pathology whole slide images too. This would involve departments exchanging whole slide images, and asking pathologists to assess images produced in different laboratories. As digital pathology is a new technique, it may be difficult to share images from one department to another. An alternative to an inter-laboratory scheme is to rescan a case previously scanned and ask the reporting pathologist to reassess the case and compare their assessment with the original report.
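The comparability checklist above can be recorded in a simple structured form, so the same assessment is captured for every scanner in the fleet. A minimal sketch, with invented scanner names and a reduced criteria list; this is an illustration of record-keeping, not a prescribed format:

```python
# Sketch: recording the scanner comparability assessment described above.
# Criteria and scanner names are illustrative placeholders.
CRITERIA = ["background_clear", "in_focus", "staining_crisp"]

def all_scanners_comparable(assessments):
    """True only if every scanner passed every quality criterion."""
    return all(result[c] for result in assessments.values() for c in CRITERIA)

assessments = {
    "scanner_A": {"background_clear": True, "in_focus": True, "staining_crisp": True},
    "scanner_B": {"background_clear": True, "in_focus": False, "staining_crisp": True},
}
# scanner_B fails the focus check, so the fleet is not yet comparable
```

A record like this, completed by the validating pathologist for each test slide, doubles as the written evidence the assessor will want to see.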

Training and competency

All parts of the digital pathology process need formal documentation in the form of quality managed standard operating procedures (SOPs). Laboratory staff should be familiar with these documents, and able to access them easily for reference. Examples of SOPs include the following:

  • How to operate the scanners.

  • How to operate the image software and database.

  • Troubleshooting—both for the scanners and workflow.

  • Maintenance of the scanners.

To complement the content of the SOPs, relevant training booklets and competency assessments need to be documented and regularly reviewed. A suitable training programme needs to be delivered to all new users of digital pathology in the laboratory to ensure they feel safe to work in that area without supervision. These SOPs and training materials will require regular updates to reflect changes in practice and the acquisition of new and updated hardware and software.

Uncertainty of measurement

If you are using digital measurement software, you will need to tackle the question of uncertainty of measurement. A calibration slide with predefined values can be used to assess whether scanned objects are captured to scale, and this should be audited in your department.

An example of a calibration slide (Applied Image, NY, USA) is shown in figure 1. It contains a marked area of predetermined height and width, with the expected measurements given by the supplier's calibration data (see figure 2). It is important that suppliers provide a calibration certificate and data for the slide, to demonstrate that the slide is fit for purpose before it is used for clinical calibration.

Figure 1

Example of an uncertainty of measurement tool (image courtesy of Applied Image, NY, USA).

Figure 2

Example of a calibration certificate (image courtesy of Applied Image, NY, USA).

A width and height measurement should be recorded using the proposed clinical measurement tool and monitored for any changes that fall outside the reference range. Scanner suppliers will differ in their approach, but it is important to check the scanner's documentation to determine the reference range within which the measurements should fall, for example, ±0.15 mm. This process should be repeated for all scanners used to scan slides for primary digital diagnosis, and any measurements falling outside the manufacturer's acceptable reference range should be reported.
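The tolerance check just described is simple arithmetic and can be scripted so the audit record is consistent across scanners. A minimal sketch, assuming a hypothetical 2.0 mm nominal target size and the ±0.15 mm tolerance quoted above as an example; substitute the values from your own calibration certificate and scanner documentation:

```python
# Sketch: flag calibration measurements outside the manufacturer's tolerance.
# Nominal size and tolerance are illustrative; use your certified values.

NOMINAL_MM = 2.0      # certified width/height of the calibration target (hypothetical)
TOLERANCE_MM = 0.15   # acceptable deviation (example value from the text)

def check_calibration(measurements_mm, nominal=NOMINAL_MM, tolerance=TOLERANCE_MM):
    """Return (scanner_id, measured value) pairs that fall outside tolerance."""
    return [(scanner, value) for scanner, value in measurements_mm
            if abs(value - nominal) > tolerance]

# One width measurement per scanner, taken with the clinical measurement tool
readings = [("scanner_A", 2.04), ("scanner_B", 1.97), ("scanner_C", 2.21)]
out_of_range = check_calibration(readings)
# scanner_C deviates by more than 0.15 mm and should be reported
```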

As with any other equipment used for measurement in a medical laboratory, the calibration slide itself needs to be calibrated. The process and frequency of this will differ between suppliers, so it is important to check how often this should be done.

To satisfy your department and your assessor that your digital measurements are safe for clinical use, you will need to demonstrate that measurements taken on diagnostic images are accurate and reproducible. A relatively simple approach is to carry out and document an audit of clinically relevant measurements appropriate to the scope of your digital diagnostic practice. A small set of glass slides encompassing tumour measurements/margin assessment/Breslow thickness of melanoma, etc, can be assembled, and pathologists can be asked to make repeat measurements on glass and digital once a day for a week or two. In this way, variability in measurement on both glass and digital slides can be documented, using both inter-observer and intra-observer comparisons.
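The repeat-measurement audit above can be summarised with basic descriptive statistics per modality. A sketch with invented Breslow thickness readings; the figures and field names are illustrative only:

```python
# Sketch of the measurement audit described above: repeated glass and digital
# measurements of the same lesion, summarised per modality. Data are invented.
from statistics import mean, stdev

def summarise(values_mm):
    """Mean, standard deviation and range of repeated measurements (mm)."""
    return {"mean": round(mean(values_mm), 3),
            "sd": round(stdev(values_mm), 3),
            "range": round(max(values_mm) - min(values_mm), 3)}

# One pathologist's repeated Breslow thickness measurements over two weeks
glass = [1.10, 1.12, 1.08, 1.11, 1.09]
digital = [1.11, 1.13, 1.10, 1.12, 1.10]

glass_summary = summarise(glass)
digital_summary = summarise(digital)
```

Comparing the two summaries (and the same summaries across pathologists) documents intra-observer and inter-observer variability in a form that is easy to file as audit evidence.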

Post-accreditation monitoring

Based on our experience of digital pathology implementation, and on existing national benchmark frequencies (eg, annual External Quality Assurance schemes, annual appraisal), we would currently suggest yearly audit/quality assurance benchmarks for digital pathology systems. As experience with digital pathology accumulates and it becomes standard practice, the need for audit should reduce.

Clinical and diagnostic considerations

All the SOPs and workflows you establish in the laboratory should ensure that your diagnosticians are presented with quality whole slide images in a safe and timely manner. This rigour needs to be matched with appropriate training and validation in the diagnostic office.

Validation and training

Meaningful digital diagnosis training and validation should result in:

  • Pathologists who are confident in their abilities and their limitations with digital diagnosis.

  • Pathologists who are familiar with their hardware and software, and can recognise and report performance issues.

  • A department with a shared understanding of, and investment in, the digital pathology system.

  • A department that can develop bespoke ways of using digital pathology to improve its outputs, workflows and working environment.

The Royal College of Pathologists’ guidance on digital pathology validation and training4 highlights that validation should be an individual process, occur in a real-world context, and be sufficiently rigorous to satisfy a reasonable internal or external observer that safety and clinical effectiveness are maintained. The Leeds digital pathology validation protocol3 is included as an example of best practice in digital validation; it is pathologist-centric and utilises the existing evidence base (see table 3 for a summary of the protocol). It allows pathologists to self-identify their own digital pathology ‘pitfalls’ and training targets, while allowing them early experience of live digital reporting in a risk-mitigated environment.

Table 3

Summary of Royal College of Pathologist’s endorsed validation protocol for digital primary diagnosis

The first stage in this validation protocol is a period of training. Pathologists learn to use their digital pathology hardware and software, and are observed opening and assessing digital cases, so the trainer can give feedback. This is followed by a validation training phase, where the pathologist views a set of up to 20 cases relevant to their own practice (eg, breast only/breast and cardiothoracic/general pathology). The cases should reflect a range of tissue types, and locally relevant special stains and immunohistochemistry, and should include cases that may be problematic for the novice digital pathologist, such as dysplasia assessment and sentinel node metastasis detection. The pathologist views the slides on digital, records their diagnosis and their confidence in it, then immediately reviews the glass slides for the case, and documents any alteration in their impression of the case. This allows the pathologist to identify potential training ‘targets’, and types of case to pay particular attention to in the ‘live phase’ of their validation.

In the ‘live’ validation phase, the pathologist views all their workload on digital slides in the first instance. They document their digital diagnosis, but have a glass slide reconciliation, and alter the diagnosis if necessary before final sign out. Any discordances are recorded and reviewed, as well as the pathologist's confidence in each digital diagnosis. After a suitable period (eg, 2 months whole time equivalent might be appropriate to ensure the pathologist has viewed a suitable breadth and depth of diagnoses), the pathologist can review their diagnostic concordance throughout the validation procedure, and decide on the safe scope of their digital practice. If there is a particular diagnostic scenario where they still lack confidence, for example, mitotic counting or Helicobacter pylori identification, they may want to incorporate specific safety nets into their digital work, such as glass slide checks for critical/grade boundary mitotic counts, or additional immunohistochemistry for gastric biopsies.
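A pathologist's live-phase log of digital diagnoses and glass reconciliations can be summarised per specimen type to expose low-concordance areas that warrant a safety net. A sketch with hypothetical record fields and invented data:

```python
# Sketch: summarising a live-phase validation log. Each record holds the
# specimen type and whether the digital diagnosis matched glass on
# reconciliation. Field names and data are hypothetical.
from collections import defaultdict

def concordance_by_specimen(records):
    """Per-specimen-type concordance rate (digital vs glass)."""
    totals, matches = defaultdict(int), defaultdict(int)
    for specimen, concordant in records:
        totals[specimen] += 1
        matches[specimen] += concordant
    return {s: matches[s] / totals[s] for s in totals}

log = [("breast core", True), ("breast core", True),
       ("gastric biopsy", True), ("gastric biopsy", False),
       ("sentinel node", True)]
rates = concordance_by_specimen(log)
# a gastric biopsy concordance of 0.5 flags a training target / safety-net area
```

In practice the denominators would be far larger, but the same per-specimen breakdown is what lets the pathologist define the safe scope of their digital practice.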

Whatever validation methodology is used, your pathologists should be able to explain it, and describe the scope of their digital pathology practice, and any ‘safety nets’ they use for difficult cases. For instance, a mandatory check on glass slides if a pathologist believes a granuloma is suspicious for mycobacteria, but no organisms are detected on special stains on digital, or a mandatory check on glass in ‘high-risk’ scenarios such as Barrett's oesophagus biopsies with possible dysplasia.

Preparing your pathologists for accreditation

Consultants should have ready access to their own data, documenting their individual training and validation for digital reporting, copies of relevant SOPs and protocols, and user guides/manuals for the software they use. A specific personal folder containing this information for each consultant could be stored on a shared departmental drive, and accessed via a desktop shortcut (see table 4 for a list of documentation your pathologists may need to be aware of/have access to).

Table 4

Suggested pathologist documentation

Your pathologists should be able to demonstrate to the assessor how they report a case, and how they would recognise and report issues with digital slides, such as out-of-focus regions or digital artefact. Depending on your departmental SOPs, you may protocolise reflex rescanning of inadequate slides (digital slides on which the pathologist is not prepared to make a diagnosis for quality reasons), or deferral to glass slides in this situation. All quality issues should be reported and fed back to the laboratory, regardless of whether the pathologist can make a diagnosis on the suboptimal slide.

Pathologists who are reporting digitally should be familiar with, and able to access, departmental SOPs for digital slide reporting, training and validation in digital reporting, and the relevant user guides for the software/slide viewer they use for primary diagnosis. They should be able to access their individual validation documentation and talk through its implications, describing any situations in which they would defer to glass slide reporting. It can be helpful to circulate spreadsheets/templates for pathologists to record data on cases where they need to defer to glass, or where digital slides are suboptimal for assessment, which can be fed back to the laboratory on a regular basis.

Post-accreditation monitoring

Accreditation is an ongoing process, and departments must continuously monitor, and strive to improve, the quality and safety of their digital pathology service. In light of this, it is important to continue to audit and evaluate digital diagnosis after successful ISO accreditation. This should include documentation and investigation of scanning issues (eg, out-of-focus slides/slide regions, incidence of digital artefact) and diagnostic issues (eg, frequency of, and reason for, deferral to glass). Digital diagnosis can be audited on an annual or 6-monthly basis, by retrieving a random sample of archived cases and reviewing the diagnoses. This could incorporate comparison with glass slides, providing the pathologists participating in the audit with an opportunity for continuing professional development.
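One way to draw the random sample of archived cases for this audit reproducibly is to fix the random seed, so the selection itself can be re-derived and audited later. A sketch with invented case identifiers; the numbering scheme is purely illustrative:

```python
# Sketch: a reproducible random sample of archived digital cases for the
# periodic audit described above. Case IDs are invented; a fixed seed makes
# the selection re-derivable for the audit record.
import random

def audit_sample(case_ids, n, seed=2023):
    """Return n case IDs drawn without replacement, reproducibly."""
    rng = random.Random(seed)           # local generator, fixed seed
    return rng.sample(sorted(case_ids), n)

archive = [f"H19-{i:05d}" for i in range(1, 501)]   # hypothetical year's cases
selected = audit_sample(archive, 10)
```

Recording the seed and sample size alongside the audit findings demonstrates to an assessor that the sample was unbiased and the process repeatable.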

Conclusion

We have presented a practical guide to advise clinical laboratories on how to prepare for ISO accreditation encompassing a digital pathology diagnostic service, and summarised some of the key steps and considerations in both the laboratory and the diagnostic office. We hope this will prove a useful guide to the many departments that are preparing for digitisation. Digital pathology technology, and our understanding of the scope and limitations of digital practice, continue to evolve, and pathology services, and the patients who rely on them, stand to benefit enormously from digital pathology adoption. It is, however, important that the pathology community continues to prioritise the quality and safety of our processes and outputs with the introduction of new technologies and techniques.

Take home messages

  • Achieving accreditation of primary digital diagnosis as part of ISO 15189 requires novel procedures, protocols and documentation.

  • Early identification of key personnel to take responsibility for clearly defined areas of accreditation is vital in preparation for an ISO inspection.

  • Accreditation is an ongoing process, and a digital pathology service must be continuously monitored and improved.

References


Footnotes

  • Handling editor Runjan Chetty.

  • Contributors BJW drafted and revised the manuscript. BJW provided all material pertaining to clinical validation. CK provided themes and reviewed manuscript for laboratory medicine section. DT reviewed and refined manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests Leeds Teaching Hospitals NHS Trust has a collaborative partnership with Leica Biosystems for a research digital pathology deployment.

  • Patient consent for publication Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement There are no data in this work. Data are available upon reasonable request.
