
Enabling digital pathology in the diagnostic setting: navigating through the implementation journey in an academic medical centre
  1. Chee Leong Cheng1,
  2. Rafay Azhar1,
  3. Shi Hui Adeline Sng1,
  4. Yong Quan Chua1,
  5. Jacqueline Siok Gek Hwang1,
  6. Jennifer Poi Fun Chin1,
  7. Waih Khuen Seah1,
  8. Janel Chui Ling Loke2,
  9. Roy Hang Leng Ang2,
  10. Puay Hoon Tan1
  1. 1Department of Pathology, Singapore General Hospital, Singapore, Republic of Singapore
  2. 2Integrated Health Information Systems, Singapore, Republic of Singapore
  1. Correspondence to Dr Chee Leong Cheng, Department of Pathology, Singapore General Hospital, 20 College Road, Academia, Level 10, Diagnostics Tower, Singapore 169856, Republic of Singapore; cheng.chee.leong{at}


Aims As digital pathology (DP) and whole slide imaging (WSI) technology advance and mature, there is an increasing drive to incorporate DP into the diagnostic environment. However, integration of DP into the diagnostic laboratory is a non-trivial task, filled with unexpected challenges unlike those of standalone implementations. We share our journey of implementing DP in the diagnostic laboratory setting, highlighting seven key guiding principles that drove the progression through implementation into deployment and beyond.

Methods The DP implementation with laboratory information system integration was completed in 8 months, including validation of the solution for diagnostic use in accordance with College of American Pathologists guidelines. We also conducted prospective validation via paired delivery of glass slides and WSI to our pathologists postdeployment.

Results Common themes in our guiding principles included an emphasis on workflow and a comprehensive approach, looking beyond pathologist user champions to an extended project team involving laboratory technicians, clerical/data room staff and archival staff. On validation, concordance between glass slides and WSI ranged from 93% to 100% across the various applications. Through prospective validation, with an overall concordance of 96% compared with glass slides, we also gave every pathologist in the department equal opportunity to become competent and confident with DP, allowing appreciation of the advantages and limitations of WSI and hence enabling the use of DP as a useful diagnostic modality.

Conclusions Smooth integration of DP into the diagnostic laboratory is possible with careful planning, discipline and a systematic approach adhering to our guiding principles.




Digital pathology (DP) describes the utilisation of whole slide imaging (WSI) technology to digitise glass slides used in examination of tissues and cells. WSI technology has seen rapid advances in its performance and ability to produce quality representation of glass slides, with several studies attesting to its good diagnostic concordance with glass slides.1–4 Further impetus to adopt DP includes the potential to use WSI to remodel pathology services with cost savings5 and for applications beyond those of current glass slides, including image analysis.6

Since 2009, the Department of Pathology at Singapore General Hospital (SGH) has piloted and used WSI for education and research purposes, and has previously published our experiences with DP.7, 8 In 2014, we extended the DP programme to cover the diagnostic setting, and in 2015 the laboratory incorporated the WSI and telepathology checklists into the College of American Pathologists (CAP) accreditation, with no deficiencies cited. Our DP system (DPS) also currently carries the in vitro diagnostic label in compliance with our country's regulatory requirements. In addition, we successfully migrated more than 40 000 WSI across DP platforms, allowing previously scanned WSI to be centralised into a single location.

In this article, we provide highlights of our implementation journey in introducing an enterprise DPS closely integrated with the laboratory information system (LIS), incorporating these into the diagnostic laboratory workflow. We summarise the journey by featuring seven key principles below:

  • Principle 1: not just about the interested few but about the entire laboratory

  • Principle 2: ensuring DPS works together with LIS

  • Principle 3: no longer about the technology, but about the workflow

  • Principle 4: test, test, test—test the workflow, test generously, test comprehensively

  • Principle 5: train, train, train—train early in the project, train the workflow, train succinctly yet comprehensively

  • Principle 6: validating DPS—predeployment to kick-start, postdeployment to gain experience

  • Principle 7: stability and sustainability—define the processes, governance and feedback mechanisms.

Overview of SGH DPS and its implementation

SGH is the largest hospital in Singapore, with approximately 1600 beds, close to 80 000 patient discharges and more than 80 000 elective operations per year.9 SGH Department of Pathology is the largest public healthcare pathology service and training department in Singapore receiving more than 45 000 surgical pathology and 30 000 cytology accessions per year.10 It supports SGH as well as specialist centres in the SGH campus. SGH is one of two designated Academic Medical Centres (AMCs) in Singapore, and SGH Department of Pathology, together with the Kandang Kerbau Women's and Children's Hospital, constitutes an Academic Clinical Programme in Pathology.

Our pilot implementation of DP in 2009 used Aperio ScanScope XT System (Leica Biosystems, Nussloch, Germany). In 2014, we deployed Philips IntelliSite Pathology Solution (Koninklijke Philips N.V., The Netherlands) to support expansion into the diagnostic laboratory. Figure 1 is a high-level overview of the DPS implemented in 2014 to support the expanded use.

Figure 1

Overview of SGH DPS, with campus-wide access to the main central WSI repository of 1.6 petabytes (a.k.a. ‘DPS Server & Storage’) and remote access via DMZ beyond the campus. WSI scanning can be accomplished both in the main laboratory and the operating theatre frozen section laboratory. DMZ, demilitarised zone; DPS, digital pathology system; FS, frozen section; LIS, laboratory information system; OT, operating theatre; SGH, Singapore General Hospital; WSI, whole slide imaging.

The key aims of the project were to:

  1. Integrate WSI into the diagnostic workflow, including integration with the LIS.

  2. Support remote consultation between the frozen section (FS) laboratory in the operating theatre (OT) and the main laboratory sited in a separate building (which is approximately 15 min walking distance away from OT).

  3. Enable collaboration and remote access to WSI beyond the SGH campus.

The core solution was implemented within 8 months, from July 2013 to February 2014. However, prior to commencement of the implementation, an internal workflow review and product-agnostic pre-emptive workflow changes were introduced. Figure 2 provides an overview of the implementation timeline, including the key milestones. There was strong emphasis on change management, including departmental awareness and workflow, which translated into a system configuration that fitted the requirements of our diagnostic environment.

Figure 2

Overview of the implementation and key milestones. The solution was implemented over a period of 8 months, although prior to project kick-off, pre-emptive workflow interventions were introduced, which facilitated a smooth transition into a digital pathology (DP)-enabled environment. CAP, College of American Pathologists; DP, digital pathology; DPS, digital pathology system; LIS, laboratory information system.

Principle 1: not just about the interested few but about the entire laboratory

While our pilot implementation focused more on pathologists’ use of WSI for education and research, resonating with the interested few who became forerunners and frequent users, mobilising solely these users for implementation in the diagnostic setting was clearly insufficient. This was because the touchpoints for ensuring successful incorporation and integration of DPS into the diagnostic workflow went beyond the pathologists alone, involving a bigger group of users, including laboratory technologists and clerical (ie, data room) staff, whose inputs would eventually feed into the DPS.

This effort to extend DP beyond the interested few began at the selection stage for a suitable solution, and eventually translated into strong ownership and support of the project throughout the department. It is also reflected in our organisation of the project team, as illustrated in figure 3. The project steering committee consisted of key leadership in the department and representatives from our technical partners and funding support agency, and set the overall direction and guidance for the project. The core project team consisted of pathologist user champions, laboratory technologists, a laboratory administrator and a project manager, who were the main drivers of the project. Most importantly, the core team garnered key user personnel across the various roles in the department as part of an ‘extended project team’, tasked with looking deeply into specific workflows and ensuring a comprehensive study of the optimisations required in the laboratory to support DP.

Figure 3

Project governance and team structure, and key users involved. While the core project team is central to integrating the inputs, the extended project team involves a larger range of roles that hold the key to a comprehensive study of the impact of DP across the various touchpoints in the diagnostic laboratory. DP, digital pathology; LIS, laboratory information system; WSI, whole slide imaging.

Principle 2: ensuring DPS works together with LIS

Aims of integration of DPS with LIS

One of the key objectives of this implementation project was to integrate the DPS with our LIS. The imperative to integrate with LIS included:

  1. Reducing separate double entries of information into DPS.

  2. Ensuring that changes of patient and case information in LIS were automatically updated into DPS to prevent discrepancies. This addressed the continuous updating of case information through the diagnostic workflow, which is the norm.

  3. Allowing ready scaling up of slide scanning and sustainable information management. We intended that our LIS would continue to be the central hub of information management in the diagnostic setting, and duplication of information management efforts in multiple systems might not be sustainable.

  4. Building synergy between DPS and LIS. If properly executed, the integration would open opportunities for novel approaches to functionality and for overcoming system limitations.

It is well known that LIS integration has a multitude of challenges, which may deter laboratories from embarking on such a model of DPS deployment. There were many users of LIS and multiple existing system flows, which needed to be taken into consideration even if the effort for DPS implementation only focused on specific modules. There was a need to ensure that other LIS modules were not affected in the course of DPS implementation. In addition, the approach needed to be comprehensive, with the ability to address exceptions even if these were relatively less common. Furthermore, integration with LIS could be potentially cumbersome if not well designed, and it would not be easy to reverse a bad design decision once deployed. There was a need to ensure that the LIS vendor was willing to go through the DP journey with users, including providing solutions and creative ideas.

LIS integration exercise

Despite the many potential challenges, we undertook the LIS integration with emphasis on four main accountabilities, which were in line with the levels of metadata addressed in Digital Imaging and Communications in Medicine Supplement 122:11

  1. Patient accountability: Ensuring that patient information remained consistent and updated in DPS, especially where merging of patient records was concerned, and that the LIS remained the source of truth that reconciled patient information from multiple sources.

  2. Case accountability: Ensuring that a case existed within the LIS at scanning, including in high-acuity environments like FS.

  3. Specimen accountability: Ensuring that distinct parts of a case, including different types of specimen assets received (eg, wet tissues, blocks and slides received for external consultation) or partitioned for special processing (eg, electron microscopy), were distinctly identified in the LIS so that any derivatives (eg, slides from a specific consultation block) can be readily and uniquely identified.

  4. Slide accountability: Ensuring that each slide was uniquely identified and all its properties (eg, levels and stains) recorded.
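The four accountability levels above imply a strict identifier hierarchy, with the LIS as the source of truth at every level. The sketch below models that hierarchy; all class and field names, and the ID formats, are illustrative assumptions and are not taken from the actual LIS or DPS.

```python
# Hypothetical sketch of the patient → case → specimen → slide hierarchy
# implied by the four accountability principles. Names and ID formats
# are illustrative, not those of the actual LIS or DPS.
from dataclasses import dataclass, field

@dataclass
class Slide:
    slide_id: str            # unique ID encoded in the 2D barcode
    stain: str               # e.g. "H&E", "ER"
    level: int = 1

@dataclass
class Specimen:
    specimen_id: str         # distinct part, e.g. a consultation block
    slides: list = field(default_factory=list)

@dataclass
class Case:
    case_id: str             # must exist in the LIS before scanning
    specimens: list = field(default_factory=list)

@dataclass
class Patient:
    patient_id: str          # the LIS remains the source of truth
    cases: list = field(default_factory=list)

def all_slide_ids(patient):
    """Every slide derivative must be uniquely identifiable from the top."""
    return [s.slide_id
            for c in patient.cases
            for sp in c.specimens
            for s in sp.slides]
```

A model like this makes the accountability checks mechanical: any slide whose ID cannot be traced back through specimen, case and patient is flagged before scanning.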

Figure 4 summarises the integration pattern and messages exchanged between LIS and DPS, using the unique slide ID captured in the two-dimensional barcode of the slide label, which links and drives the interface exchanges via HL7 v2.3 messages. We chose a practical approach, clearly identifying the roles of DPS and LIS and providing flexibility in how the two systems could work together.

Figure 4

Summary of integration pattern with emphasis on the principles of patient accountability (eg, ensuring integrity of identifiers and that updates are reflected via LIS into DPS), case accountability (eg, a case always exists in the LIS at the point of scanning, even in FS), specimen accountability (eg, all parts received in consultation cases are distinctly and clearly identified in the LIS) and slide accountability (eg, each and every slide derivative is accounted for and clearly labelled in the LIS). The messages exchanged between LIS and DPS are outlined in the figure, using the unique slide ID captured in the two-dimensional barcode of the slide label, which links and drives the interface exchanges via HL7 v2.3 messages. LIS, laboratory information system; DPS, digital pathology system; FS, frozen section.
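The slide-ID-driven exchange can be illustrated with a minimal pipe-delimited message in the spirit of HL7 v2.x. The segment layout and field positions below are deliberately simplified for illustration; they are not the actual interface specification of our LIS/DPS integration.

```python
# Simplified illustration of a slide-level HL7 v2.x-style message in which
# the unique slide ID (from the 2D barcode) links LIS and DPS records.
# Segment and field layout are illustrative assumptions, not the real spec.

FIELD_SEP = "|"
SEG_SEP = "\r"   # HL7 v2 separates segments with carriage returns

def build_slide_update(patient_id, case_id, slide_id, stain):
    """LIS side: assemble a minimal update message keyed on the slide ID."""
    msh = f"MSH{FIELD_SEP}^~\\&{FIELD_SEP}LIS{FIELD_SEP}{FIELD_SEP}DPS"
    pid = f"PID{FIELD_SEP}{patient_id}"      # patient accountability
    obr = f"OBR{FIELD_SEP}{case_id}"         # case accountability
    obx = f"OBX{FIELD_SEP}{slide_id}{FIELD_SEP}{stain}"  # slide accountability
    return SEG_SEP.join([msh, pid, obr, obx])

def parse_slide_id(message):
    """DPS side: recover the slide ID that ties the WSI back to the LIS."""
    for segment in message.split(SEG_SEP):
        fields = segment.split(FIELD_SEP)
        if fields[0] == "OBX":
            return fields[1]
    return None

msg = build_slide_update("P0001", "S14-01234", "S14-01234-001", "H&E")
assert parse_slide_id(msg) == "S14-01234-001"
```

Because every message carries the barcode slide ID, the DPS can attach an incoming scan to the correct LIS record and apply any later patient or case updates to the same slide.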

Principle 3: no longer about the technology, but about the workflow

Collating a workflow checklist

Since we were no longer dealing with a standalone DPS, we needed to move beyond thinking only about the technology and critically examine the workflow to ensure success with LIS integration. We collated a list of 12 workflows as outlined in table 1, systematically examining each of them, their interactions with LIS and their potential role in DPS, and identified user champions for each workflow as part of our extended project team. Apart from addressing immediate concerns at deployment, we also considered, in a holistic manner, what we would like the system to be able to support. For example, while co-reporting with residents was not one of our short-term goals, the DPS workflows should take this into consideration in order to future-proof the solution. These 12 workflows became our checklist, constituting the scope that guided each stage of implementation, particularly during design and testing, as well as the postdeployment benefits study. We identified a number of enhancements to laboratory work processes to ensure alignment of LIS information flow with DPS, which are summarised in online supplementary table S1. These helped to enable sound integration, and also indirectly improved information governance throughout the histopathology laboratory.

Table 1

List of 12 workflows identified and the user champions involved

Consolidation workshop

We subsequently conducted an intensive 1-week workflow consolidation workshop at the end of month 4 of implementation, involving members of the core and extended project teams. The final output of this workshop was presented to and endorsed by the steering committee. The purpose of the workshop was to consolidate and link the various workflows, to ensure that dependencies were clearly addressed, and to define the future-state workflow that would feed into our functional design, business rules and solution configuration of DPS. The workshop was planned such that expert DPS and LIS solution specialists from the vendors were also deeply involved at the table for quick assessment of technical feasibility and agile solution proposals, so design decisions could be quickly finalised and put into the system configuration.

Principle 4: test, test, test—test the workflow, test generously, test comprehensively

Testing was executed in three phases, namely system test (DPS only, leading to technical readiness for integration), system integration test (testing communication and business rules between DPS and LIS) and user acceptance test (UAT) (testing the execution of workflows). Unlike a standalone implementation, where the focus is simply on testing the DPS configuration and its ability to scan the slides, the bulk of our test effort went into demonstrating the solution's ability to function appropriately and adequately in the laboratory workflow. Intensive testing was conducted throughout months 6 and 7 to validate the solution flow against laboratory work processes by simulation of real-life situations, involving key roles in actual usage scenarios. A generous testing time frame, constituting close to a third of the entire implementation timeline, was provided to ensure sufficient opportunity for root cause analysis, resolution and retesting of defects. A sample of the UAT script is captured in online supplementary figure S1. The scope of the testing followed the 12 main processes listed in table 1, providing the baseline for comprehensive coverage. Extreme scenarios (eg, cases with more than a hundred slides and extremely long reports, which were not rare in an AMC setting) were included, and this allowed us to detect anomalies (eg, truncated messages) that would otherwise have been overlooked had we performed only cursory testing with simple scenarios. After system go-live, we also ran through representative test scripts (so-called ‘business verification testing’) to ensure that the final system configurations locked down after completion of UAT were effected in the production environment. We must emphasise that testing (especially UAT) cannot be relegated entirely to the vendor; users have to take leadership and ownership of executing the workflows.
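The kind of extreme-scenario check described above, a case with over a hundred slides and a very long report, verifying that no interface message is silently truncated, can be sketched as follows. The message format, function names and the transport size limit are hypothetical, chosen only to illustrate the testing principle.

```python
# Hypothetical sketch of an "extreme scenario" UAT check: a case with
# over a hundred slides and a very long report, verifying that an
# over-limit interface message is flagged rather than silently truncated.
# The size limit and message layout are illustrative assumptions.

MAX_MESSAGE_BYTES = 32768  # assumed transport limit, for illustration only

def build_slide_message(case_id, slide_ids, report_text):
    """Assemble a pipe-delimited message carrying every slide ID and the report."""
    segments = [f"CASE|{case_id}"]
    segments += [f"SLIDE|{sid}" for sid in slide_ids]
    segments.append(f"REPORT|{report_text}")
    return "\r".join(segments)

def check_roundtrip(message):
    """Detect truncation: everything sent must survive transport intact."""
    sent = message.encode("utf-8")
    received = sent[:MAX_MESSAGE_BYTES]   # simulate a capped receive buffer
    return len(received) == len(sent)

slide_ids = [f"S14-01234-{i:03d}" for i in range(1, 121)]   # 120 slides
long_report = "Microscopic description. " * 2000            # very long report
msg = build_slide_message("S14-01234", slide_ids, long_report)
assert not check_roundtrip(msg)   # over-limit message is caught, not passed silently
```

A cursory test with a short report would pass the round-trip check and never expose the truncation defect, which is precisely why the extreme scenarios were written into the UAT scripts.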

Principle 5: train, train, train—train early in the project, train the workflow, train succinctly yet comprehensively

We trained user champions as early as month 3 (see figure 2), allowing them to understand the ‘out-of-the-box’ stock system, experience it and give feedback on areas requiring further adjustment and configuration. This provided a strong foundation for fruitful discussions during the consolidation workshop in month 4. Following completion of testing, the SGH project team, together with the vendor, conducted a week of hands-on intensive training sessions for over 60 users (including just over 20 practising pathologists) throughout the surgical pathology section, covering the majority of end users and ensuring an educated user base who could use DPS at clinical go-live. One key focus of the training was the integrated workflow, covering both the DPS solution and the LIS enhancements supporting it. Given the large user base, we needed to ensure that the sessions were succinctly delivered in under 3 h, covering the key essentials to kick-start each user, while providing ready online access to comprehensive workflow-centric training materials (see online supplementary figure S2 for a sample of the training material).

Principle 6: validating DPS—predeployment to kick-start, postdeployment to gain experience

Predeployment validation

When we began our validation of the DPS in month 6 (December 2013), CAP had already published online validation guidelines for WSI.12 We addressed time constraints of an actual DPS implementation, while closely following CAP recommendations, including sufficient coverage for the intended use (ie, H&E, immunohistochemistry, FS, etc), closely emulating the actual clinical environment, sample size of at least 60 cases for each major application (at least 20 cases for each additional application), and studying intraobserver variability.

The timeline of validation (figure 2), while occurring after installation of the final production DPS, overlapped with testing and implementation of LIS integration. In doing so, we maximised the use of time and resources, given that both activities had little dependency on each other and involved different groups of participants, since LIS integration testing focused on information integrity between systems, while validation focused mainly on quality of scans and ability to use DPS to read WSI.

To plan, regulate and manage the validation activity, we derived a validation action plan (figure 5), incorporating the entire set of validation, review and discrepancy management approaches. A total of nine fully qualified practising pathologists were involved, with the following used for validation: 60 routine cases, 60 FS cases, 30 cases with immunohistochemistry (including both red and brown chromogens) and 20 cases with special stains (a mixture of common special stains used).

Figure 5

Validation action plan for predeployment validation of digital pathology system (DPS), outlining the decisions and actions at key steps of the validation process.

All pathologists involved were adequately trained in using DPS prior to the validation. We used a wash-out period of at least 2 weeks between the two reads. For practical considerations, glass slides were reviewed first, followed by digital slides. The case constitution reflected cases routinely reported by our laboratory. Regardless of the complexity and category (ie, ‘routine H&E’, ‘immunohistochemistry’, etc) of the case, the entire case was scanned and available for review, focusing on the overall diagnostic interpretation and any reporting of biomarkers (eg, ER, HER2) that a pathologist would include as part of our diagnostic report. For FS cases specifically, we used a sequential list of 60 cases and scanned these prospectively in our FS laboratory using the scanner deployed in the OT, within the day that the FS cases were received and without any specific ‘drying’ of the slides, so as to closely mimic the real-life scenario. For FS, we also asked the pathologists to time their reads and record whether they completed reading the slides (both glass and digital) within a stipulated time frame, since turnaround time is a key performance indicator. An example of the template we used to capture FS validation is presented in online supplementary figure S3, while our validation results are summarised in table 2. The overall concordance ranged from 93% to 100% for the different applications. We classified a case as non-concordant if a major discrepancy existed that might result in different clinical management, or where benign cases were interpreted as malignant and vice versa. For immunohistochemistry, we applied more stringent criteria for clinically significant cut-offs or positive versus negative interpretation.
After review of non-concordant and controversial cases by the review panel (comprising the head of department, the head of section and two practising pathologists), the non-concordant cases were deemed to result from inherent intraobserver variations that might occur even on second reads of glass slides, and not from substantial differences in quality between glass slides and WSI.
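The concordance tally under this scheme reduces to a simple count of cases not flagged as major (management-changing) discrepancies. The sketch below illustrates the calculation; the case counts used in the example are hypothetical, not our actual validation data.

```python
# Illustrative tally of a validation run following the design above:
# two reads per case (glass first, WSI after a ≥2-week wash-out), each
# pair classified for concordance. Counts here are hypothetical.

def concordance_rate(classifications):
    """Percent of cases not flagged as a major (management-changing) discrepancy."""
    concordant = sum(1 for c in classifications if c != "major_discrepancy")
    return 100 * concordant / len(classifications)

# e.g. 60 routine H&E cases with 2 major discrepancies (hypothetical counts)
reads = ["concordant"] * 58 + ["major_discrepancy"] * 2
rate = concordance_rate(reads)
assert abs(rate - 96.7) < 0.1   # 58/60 ≈ 96.7%, within the reported 93–100% range
```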

Table 2

Validation results of glass slides compared with WSI scanned using the DPS to be implemented

Postdeployment prospective validation

Our principle of deployment was to ensure that all pathologists in the department had equal opportunities to access DP, allowing them to gain familiarity and experience through constant exposure to digital imaging platforms. With this in mind, we conducted a voluntary prospective validation exercise over the 6 months following deployment, delivering paired glass slides and WSI simultaneously to the pathologists and residents as part of our daily routine work, and asking them to feed back, via a simple questionnaire, whether the diagnostic impression on the WSI matched that on the corresponding glass slides. While the predeployment validation following CAP recommendations focused on assuring technological validity, this postdeployment validation aimed to monitor and encourage the use of DP among pathologists and residents, allowing them to understand and discover the advantages and caveats of WSI, while providing feedback to identify areas for improvement. This validation also provided a more direct comparison of the glass slides and WSI, allowing technical issues to be more readily detected (eg, whether all the fragments on the glass slides were represented in the WSI). The cases were a sample of our daily workload, chosen randomly from biopsy cases so that they reflected a representative range of specimens received by our laboratory. During this period, we also scanned all our FS slides and randomly selected some for review. The questionnaire was attached to the case paperwork and consisted of the following feedback/questions:

  1. Initial diagnostic impression on WSI

  2. Does the impression of the digital slides match that of the glass slides? (Choose from ‘Yes’, ‘No’, ‘Somewhat’ and provide reasons for the latter two)

  3. Please rate the quality of the whole slide images (choose from ‘1’ (worst) to ‘5’ (excellent)).

The questionnaires were collected when the case paperwork was returned to the clerical staff for completion and sign out. Out of the 6606 cases scanned, we received 2136 responses (32.3% response rate). About 10 users (pathologists and residents) returned responses for at least 60 cases (table 3A). As the exercise was entirely voluntary, it allowed us to establish a baseline on the adoption of DP in the department.

Table 3

Prospective validation findings. (A) Number of users (pathologists and residents) against number of paired cases with response. (B) Summary of concordance between WSI and glass slides for prospectively paired cases. ‘Frozen section’ were cases where only frozen section slides were involved. ‘Routine’ were cases selected as part of routine paraffin processing. (C) Category of reasons for 83 cases where response for concordance between glass slides and WSI was ‘somewhat’ or ‘no’

Tables 3B and C provide a summary of the key findings of the prospective validation exercise. Of the 2136 responses, WSI matched glass slide findings in more than 96% of cases, while 4% of cases were not deemed fully concordant, although the majority of these were classified as ‘somewhat’ concordant. The main reasons for classifying the concordance as ‘somewhat’ or ‘no’ were difficulty in confidently identifying or excluding organisms, almost always Helicobacter pylori, and diagnostic features that were not in the scan focus. Image quality was deemed satisfactory (score ≥3) for the vast majority of cases (more than 99% of routine cases and 96% of FS cases; see online supplementary figure S4). More importantly, the feedback allowed us to adjust and improve our processes in delivering diagnostic-quality WSI.
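The reported figures are internally consistent, as a quick arithmetic cross-check shows (6606 cases scanned, 2136 responses, 83 responses rated ‘somewhat’ or ‘no’):

```python
# Cross-check of the prospective validation figures reported above.
response_rate = 2136 / 6606 * 100
assert round(response_rate, 1) == 32.3      # matches the reported 32.3% response rate

# 83 of 2136 responses were rated 'somewhat' or 'no'; the rest matched.
concordant = 2136 - 83
assert concordant / 2136 * 100 > 96         # matches the reported >96% concordance
```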

Principle 7: stability and sustainability—define the postdeployment processes, governance and feedback mechanisms

We assigned dedicated medical technologists, who had also been deeply involved in the implementation, to maintain and support the operations of DPS. We aimed to provide feedback on all DPS issues within the day; these were maintained in an issue log, with corresponding screenshots captured, regardless of whether the issue occurred within or outside the DPS (eg, slide preparation problems, LIS glitches), so that we had comprehensive coverage of the factors that could affect WSI quality and the DP user experience. In addition, we arranged regular sessions with our vendors to ensure that issues and incidents were accurately reflected and resolved with root causes identified. The laboratory manual was duly updated to reflect the modified workflows incorporating DPS, and all validation and training records were maintained. We have since undergone a few version upgrades to both scanner and image management platforms, each time in thorough discussion with the vendor and with full assessment of the release notes to decide whether revalidation and retesting were required.

To maintain proper governance in the use of DPS, we issued the following guidelines to pathologists as a reminder of their responsibility in the use of DPS:

  • Guideline for pathologist use of SGH-DPS on diagnostic histopathology materials

  • Guideline for referring cases for formal diagnostic consult via SGH-DPS

  • Guideline for use of WSI in SGH-DPS for education purposes.

It has been over one and a half years since we fully deployed the current DPS, and we have seen more than encouraging adoption of DP across the department. Furthermore, we have seen objective improvements in work processes for multidisciplinary meeting preparation and review of historical cases, since we prospectively scan biopsy cases as they pass through the routine laboratory workflow, reducing the burden on laboratory staff of tracing glass slides from the deep archive or from individual pathologists who have yet to file them.


While integration of DPS into the diagnostic laboratory can be challenging, we believe that smooth adoption is possible with careful planning and discipline, following the principles and approaches outlined above (see summary in table 4). Success of LIS integration is underpinned by an emphasis on clearly defined workflows with goal states in mind, involvement of the right people, and a comprehensive checklist serving as a constant reminder at every stage of implementation. Furthermore, workflow review and enhancements before and during implementation provide a good opportunity for process improvement, tightening of governance and better workload tracking.

Table 4

Summary of principles used to guide our DPS implementation journey

Take home messages

  • Integration of digital pathology into the diagnostic laboratory requires a systematic approach and strong guiding principles, as outlined in this article.

  • Strong emphasis on workflow and garnering robust support from the department with involvement of a wide variety of roles beyond the pathologists are key to the success of implementation.


We would like to acknowledge the funding support by the Ministry of Health, Singapore, for SGH digital pathology expansion project, project implementation support provided by Integrated Health Information Systems (IHIS) and MOH Holdings Pte Ltd, as well as involvement of SGH department of pathology staff.


Supplementary materials

  • Supplementary Data



  • Handling editor Cheok Soon Lee

  • Contributors CLC wrote the manuscript. PHT provided guidance and significant inputs for refinement of the work. RA, SHAS and YQC are project team members, and reviewed and collated the data for the validation exercise. The rest of the authors are management and project team members who contributed significantly to the digital pathology implementation.

  • Competing interests SGH department of pathology is a digital pathology reference site for Philips. Philips took no part in the manuscript or its review.

  • Ethics approval Singhealth Centralised Institutional Review Board (CIRB) deemed this as a quality improvement project, which does not require formal CIRB review.

  • Provenance and peer review Not commissioned; externally peer reviewed.