Abstract: FR-OR097
Evaluation of a Computer-Aided Quality Assessment of Whole Slide Images for Computational Pathology
Session Information
- New Techniques and Breakthroughs in Renal Pathology
November 08, 2019 | Location: Salon A/B, Walter E. Washington Convention Center
Abstract Time: 05:42 PM - 05:54 PM
Category: Pathology and Lab Medicine
- 1601 Pathology and Lab Medicine: Basic
Authors
- Chen, Yijiang, Case Western Reserve University, Cleveland, Ohio, United States
- Zee, Jarcy, Arbor Research Collaborative for Health, Ann Arbor, Michigan, United States
- Smith, Abigail R., Arbor Research Collaborative for Health, Ann Arbor, Michigan, United States
- Jayapandian, Catherine P., Case Western Reserve University, Cleveland, Ohio, United States
- Hodgin, Jeffrey B., The University of Michigan, Ann Arbor, Michigan, United States
- Howell, David, Duke University and Durham VA Hospitals, Durham, North Carolina, United States
- Palmer, Matthew, University of Pennsylvania, Philadelphia, Pennsylvania, United States
- Thomas, David B., IYM Health Financial Services, LLC, Durham, North Carolina, United States
- Cassol, Clarissa Araujo, The Ohio State University, Columbus, Ohio, United States
- Farris, Alton Brad, Emory University, Atlanta, Georgia, United States
- Perkinson, Kathryn Roberson, Duke University Medical Center, Durham, North Carolina, United States
- Hewitt, Stephen M., National Cancer Institute, Bethesda, Maryland, United States
- Madabhushi, Anant, Case Western Reserve University, Cleveland, Ohio, United States
- Barisoni, Laura, Duke University, Durham, North Carolina, United States
- Janowczyk, Andrew, Case Western Reserve University, Cleveland, Ohio, United States
Background
The establishment of computational image analysis has uncovered inconsistencies in the quality of whole slide images (WSIs) across pathology laboratories, due to pre-analytic (fixation and tissue processing) and analytic (cutting, staining, and scanning) variation. While pathologists train themselves to read through artifacts, computational pathology systems are not trained to adjust to such variability. Our group developed a pipeline aided by an open-source computer-based quality control tool (HistoQC) to identify heterogeneity, qualify WSIs for computational image analysis (see Figure 1-A), and output tissue masks that exclude artifacts (see Figure 1-B). The aim of this study was to test whether HistoQC can efficiently and reliably qualify WSIs based on artifact detection.
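The abstract does not describe the mask format, but conceptually a HistoQC-style tissue mask is a binary image aligned to a WSI thumbnail, with artifact and background regions flagged for exclusion. A minimal sketch of how such a mask could gate downstream analysis (the function name, array shapes, and toy data are illustrative assumptions, not part of the HistoQC API) might look like:

```python
import numpy as np

def apply_tissue_mask(wsi_thumbnail: np.ndarray, tissue_mask: np.ndarray) -> np.ndarray:
    """Zero out pixels flagged as artifact/background by a binary tissue mask.

    wsi_thumbnail: (H, W, 3) RGB array of a downsampled WSI.
    tissue_mask:   (H, W) boolean array, True where usable tissue remains.
    """
    if wsi_thumbnail.shape[:2] != tissue_mask.shape:
        raise ValueError("mask must match thumbnail dimensions")
    masked = wsi_thumbnail.copy()
    masked[~tissue_mask] = 0  # excluded regions contribute nothing downstream
    return masked

# Toy example: a 4x4 "thumbnail" whose right half is flagged as artifact
thumb = np.full((4, 4, 3), 200, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[:, :2] = True  # keep only the left half
clean = apply_tissue_mask(thumb, mask)
```

In practice the mask would be read from the HistoQC output directory and resized to match whatever resolution the downstream image-analysis pipeline operates at.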
Methods
A total of 1814 WSIs (458 H&E, 470 PAS, 438 silver, 448 trichrome) from 512 NEPTUNE digital renal biopsies were analyzed by HistoQC and reviewed for disqualification. Disqualified WSIs (extreme outliers) and a randomly selected 10% of the qualified WSIs were manually scored by 2 reviewers for the presence of glass slide, tissue, and scanning artifacts. Principal component analysis (PCA) of the HistoQC metrics and logistic regression were used to evaluate the association between HistoQC output and human assessment.
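The statistical step above (PCA of per-slide quality metrics feeding a logistic regression, scored by C-index, which equals the ROC AUC for a binary outcome) can be sketched on synthetic data. Everything here is an illustrative assumption: the metric count, component count, and labels are invented, and scikit-learn stands in for whatever software the authors actually used.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for per-slide HistoQC metrics (rows = WSIs, cols = metrics)
n_slides, n_metrics = 300, 20
X = rng.normal(size=(n_slides, n_metrics))
# Synthetic human artifact labels, loosely driven by the first metric
y = (X[:, 0] + 0.5 * rng.normal(size=n_slides) > 0).astype(int)

model = make_pipeline(
    StandardScaler(),      # put heterogeneous metrics on a common scale
    PCA(n_components=5),   # summarize correlated metrics into components
    LogisticRegression(),  # predict presence of a human-identified artifact
)
model.fit(X, y)

# C-index for a binary outcome is the ROC AUC of the predicted probabilities
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
```

Standardizing before PCA matters here because quality metrics (e.g. blur scores vs. pixel counts) live on very different scales, and unscaled PCA would be dominated by the largest-variance metric.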
Results
151 WSIs were considered extreme outliers by HistoQC; consequently, only 318 of the 1814 WSIs (151 disqualified + 167 randomly selected qualified) required human review. PCA components derived from the HistoQC metrics showed moderate to strong discrimination of human-identified artifacts (C-index range 0.64-0.83, see Figure 1-C).
Conclusion
HistoQC can aid in the identification of pre-analytic and analytic artifacts and of variations in WSI presentation. Furthermore, this pipeline may enable efficient curation of digital pathology repositories and reduce computational image analysis variability.
Funding
- NIDDK Support