Slide 1: Image Quality in Digital Pathology
(from a pathologist's perspective)
Jonhan Ho, MD, MS
Slide 2: Disclosure
Slides 3–5: (images)
Slide 6: Image Quality: define/measure
Slide 7: Image quality is good enough if:
It has a resolution of 0.12345 μm/pixel
It is captured in XYZ color space/pixel depth
It has an MTF curve that looks perfect
It has a focus quality score of 123
It has a high/wide dynamic range
Slide 8: What is “resolution”?
Spatial resolution
Sampling period
Optical resolution
Sensor resolution
Monitor resolution
New Year’s resolution???????
Slide 9: Optical resolution
Theoretical maximum resolution of a 0.75 NA lens is 0.41 μm; for a 1.30 NA lens, 0.23 μm.
This has NOTHING to do with magnification! (We will get to that later.)
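The slide's numbers match the Rayleigh criterion, d = 0.61·λ/NA, at an assumed wavelength of 500 nm (the wavelength choice is my assumption; it is not stated on the slide):

```python
# Rayleigh resolution limit: d = 0.61 * wavelength / NA.
# Assumes green light at 500 nm, which reproduces the slide's figures
# (0.41 um at 0.75 NA, 0.23 um at 1.30 NA).

def rayleigh_limit_um(na, wavelength_nm=500):
    """Smallest resolvable separation, in micrometres, for a lens of given NA."""
    return 0.61 * (wavelength_nm / 1000) / na

print(round(rayleigh_limit_um(0.75), 2))  # 0.41
print(round(rayleigh_limit_um(1.30), 2))  # 0.23
```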
Slide 10: Depth of Field
As aperture widens
Resolution improves
Depth of field narrows
Less tissue will be in focus
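The trade-off above can be sketched with the wave-optical approximation DOF ≈ n·λ/NA²: as NA rises, resolution improves (previous slide) while depth of field collapses. The formula, wavelength, and refractive index here are my assumptions for illustration; real systems also add a geometric term.

```python
# Wave-optical depth of field approximation: DOF ~ n * wavelength / NA^2.
# Illustrative only: shows the inverse-square trend, not an exact system model.

def depth_of_field_um(na, wavelength_nm=500, n=1.0):
    """Approximate depth of field, in micrometres, for a dry objective."""
    return n * (wavelength_nm / 1000) / na**2

for na in (0.25, 0.75):  # low-power vs high-power dry objectives
    print(na, round(depth_of_field_um(na), 2))
```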
Slide 11: Image quality is good enough if:
It has a resolution of 0.12345 μm/pixel
It is captured in XYZ color space/pixel depth
It has an MTF curve that looks perfect
It has a focus quality score of 123
It has a high/wide dynamic range
Slide 12: Image quality is good enough if it is:
“Sharp”
“Clear”
“Crisp”
“True”
“Easy on the eyes”
Slide 13: Image quality is good enough if it is:
“Sharp”
“Clear”
“True”
Slide 14: Image quality is good enough if:
You can see everything you can see on a glass slide
Slides 15–16: (images)
Slide 17: Image quality is good enough if:
I can make a diagnosis from it
Slides 18–19: (images)
Slide 20: Image quality is good enough if:
I can make as good a diagnosis from it as I can from glass slides.
This is a concordance study
OK, but how do you measure this?!?!?!?!?!
Slide 21: Gold standard = another diagnosis
Slide 22: Concordance validation
Some intra-observer variability
Even more inter-observer variability
Order effect
“Great case” effect
Slide 23: Concordance validation
Case selection
Random, from all benches?
Enriched, with difficult cases?
Presented with only initial H&E?
Allow ordering of levels, IHC, special stains?
If so, how can you compare with the original diagnosis?
Presented with all previously ordered stains?
If so, what about diagnosis bias?
How old of a case to allow?
Slide 24: Concordance validation
Subject selection
Subspecialists? Generalists?
Do all observers read all cases, even if they are not accustomed to reading those types of cases?
Multi-institutional study
Do observers read cases from other institutions?
Staining/cutting protocol bias
Slide 25: Concordance validation
Measuring concordance
Force pathologists to report in discrete data elements?
This is not natural! (especially in inflammatory processes!)
What happens if 1 data element is minimally discordant?
Allow pathologists to report as they normally do?
Free text – who decides if they are concordant? How much discordance to allow? What are the criteria?
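One standard way to quantify the intra-/inter-observer agreement discussed above is Cohen's kappa, which corrects raw agreement for chance. The statistic is my suggestion (it is not named on the slide), and the diagnoses below are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters match.
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's marginals.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical diagnoses from two pathologists on five cases.
a = ["benign", "benign", "malignant", "benign", "malignant"]
b = ["benign", "malignant", "malignant", "benign", "malignant"]
print(round(cohens_kappa(a, b), 2))  # 0.62
```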
Slide 26: Concordance study bottom line
Very difficult to do, with lots of noise
Will probably conclude that we can make equivalent diagnoses
At the end, we will have identified cases that are discordant, but what does that mean?
What caused the discordances?
Bad images? If so, what made them bad?
Familiarity with digital?
Lack of coffee?!?!?!
Still doesn’t feel like we’ve done our due diligence – what exactly are the differences between glass and digital?
Slide 27: PERCEPTION = REALITY
Slide 28: PERCEPTION = QUALITY
“Sharp, clear, true”
Slide 29: Psychophysics
The study of the relationship between the physical attributes of the stimulus and the psychological response of the observer
Slide 30: What we need is –
Slide31Images, image quality and observer performance: new horizons in radiology lecture.
Kundel
HL. Radiology. 1979 Aug;132(2):265-71
Slide 32: Kundel on image quality
“The highest quality image is one that enables the observer to most accurately report diagnostically relevant structures and features.”
Slide 33: (image)
Slide 34: Receiver Operating Characteristic (ROC) curve
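An ROC curve summarizes observer performance by sweeping a confidence threshold and plotting true-positive rate against false-positive rate. A minimal sketch with made-up scores and labels, using the rank (Mann–Whitney) form for the area under the curve:

```python
def roc_points(scores, labels):
    """(FPR, TPR) points as the threshold sweeps through the observed scores."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = [(0.0, 0.0)]
    for thresh in sorted(set(scores), reverse=True):
        tp = sum(s >= thresh and l for s, l in zip(scores, labels))
        fp = sum(s >= thresh and not l for s, l in zip(scores, labels))
        pts.append((fp / neg, tp / pos))
    return pts

def auc(scores, labels):
    """Area under the ROC curve: P(random positive outscores random negative)."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]  # hypothetical observer confidence ratings
labels = [1, 1, 0, 1, 0, 0]              # 1 = lesion actually present
print(auc(scores, labels))
```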
Slide 35: Conspicuity index formula
K = f(size, contrast, edge gradient / surround complexity)
Probability of detection = f(K)
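The slide defines the conspicuity index only abstractly. A toy parameterization to make the two-stage idea concrete; the product form and the logistic link are my assumptions for demonstration, not Kundel's published model:

```python
import math

# Illustrative only: K = f(size, contrast, edge gradient / surround complexity)
# is stated abstractly on the slide; this particular f is hypothetical.

def conspicuity(size, contrast, edge_gradient, surround_complexity):
    """Toy conspicuity index: grows with size, contrast, and edge sharpness,
    shrinks as the surround gets busier."""
    return size * contrast * (edge_gradient / surround_complexity)

def detection_probability(k, k50=1.0, slope=2.0):
    """Logistic link: P = 0.5 when K == k50, rising toward 1 with K."""
    return 1 / (1 + math.exp(-slope * (k - k50)))

k = conspicuity(size=2.0, contrast=0.6, edge_gradient=3.0, surround_complexity=1.5)
print(round(detection_probability(k), 3))
```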
Slide 36: Kundel, 1979
“Just as a limited alphabet generates an astonishing variety of words, an equally limited number of features may generate an equally astonishing number of pictures.”
Slide 37: Can this apply to pathology?
What is our alphabet? MORPHOLOGY!
Red blood cells
Identify inflammation by features
Eosinophils
Plasma cells
Hyperchromasia, pleomorphism, N:C ratio
Build features into microstructures and macrostructures
Put features and structures into clinical context and compare to normal context
Formulate an opinion
Slides 38–41: (images)
Slide 42: Advantages of feature-based evaluation
Better alleviates experience bias and context bias
Can better perform inter-observer concordance
Connects pathologist-based tasks with measurable output understandable by engineers
Precedent in image interpretability (NIIRS)
Slide 43: NIIRS 1
“Distinguish between major land use classes (agricultural, commercial, residential)”
Slide 44: NIIRS 5
“Identify Christmas tree plantations”
Slide 45: Disadvantages of feature-based evaluation
Doesn’t eliminate the “representative ROI” problem
Still a difficult study to do
How to select features? How many?
How to determine the gold standard?
What about features that are difficult to discretely characterize? (“hyperchromasia”, “pleomorphism”)
Slide 46: Bottom line for validation
All of these methods must be explored, as each has its own advantages and disadvantages
Technical
Diagnostic concordance
Feature vocabulary comparison
Slide 47: Image perception – magnification ratio
Microscope: lens, oculars
Scanner: lens
Sensor resolution
Monitor resolution
Monitor distance
Slide 48: Magnification at the monitor
40X magnification from object to sensor:
1 pixel = 10 µm at the sensor
1 pixel = 0.25 µm at the sample
10 / 0.25 = 40X
~27X magnification from sensor to monitor (270 µm pixel pitch of monitor):
1 pixel = 270 µm at the monitor
1 pixel = 10 µm at the sensor
270 / 10 = ~27X
= 1080X TOTAL magnification from object to monitor
This is the equivalent of a 108X objective on a microscope!!??
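The arithmetic on this slide can be reproduced directly (the 94-ppi-class monitor pitch and 0.25 µm/pixel sampling are the slide's own figures):

```python
# Total magnification from glass slide to screen, per the slide's example.
sensor_pitch_um = 10.0      # size of one sensor pixel
sample_um_per_pixel = 0.25  # tissue covered by one pixel at the sample ("40X" scan)
monitor_pitch_um = 270.0    # size of one monitor pixel

object_to_sensor = sensor_pitch_um / sample_um_per_pixel  # 40x
sensor_to_monitor = monitor_pitch_um / sensor_pitch_um    # 27x
total = object_to_sensor * sensor_to_monitor              # 1080x

print(object_to_sensor, sensor_to_monitor, total)  # 40.0 27.0 1080.0
```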
Slide 49:

                                Magnification            Effective Viewing          Manual Scope Equivalent
                                                         Magnification (at 1:1)     Objective Magnification
Scan Type   Object to Sensor   Sensor to Monitor  TOTAL   10"    24"    48"          10"     24"      48"
20X         20                 27                 540     540    225    112.5        ~54x    ~22.5x   ~11.3x
40X         40                 27                 1080    1080   450    225          ~108x   ~45x     ~22.5x

Near point = 10"
What if the sensor was obscenely high resolution?
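The table's entries follow from three rules: total = scan magnification × 27 (sensor to monitor), effective viewing magnification scales by the near point (10") over the viewing distance, and the equivalent objective divides out a 10× ocular (that last convention is implied by the 1080X → 108X example on the previous slide):

```python
NEAR_POINT_IN = 10      # near point of the eye, per the slide
SENSOR_TO_MONITOR = 27  # fixed sensor-to-monitor magnification from slide 48

rows = {}
for scan_mag in (20, 40):
    total = scan_mag * SENSOR_TO_MONITOR
    for distance_in in (10, 24, 48):
        # Apparent magnification shrinks as the viewer backs away from the monitor.
        effective = total * NEAR_POINT_IN / distance_in
        objective = effective / 10  # assuming 10x oculars on the manual scope
        rows[(scan_mag, distance_in)] = (effective, objective)
        print(f'{scan_mag}X @ {distance_in}": effective {effective:g}x, '
              f'~{objective:g}x objective')
```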
Slide 50: Other things that cause bad images
Tissue detection
Focus
Slide 51: Tissue detection
Slide 52: What about phantoms?
Slide 53: One final exercise in image perception
Slide 54: (image)
Slide 55: ?
hoj@upmc.edu