I would rather have Dr. Watson analyze this big data; s/he has already been doing this for other medical applications. If your stroke hospital can't see that this could be much better than all of their stroke doctors combined, they need to stop being a stroke hospital.
Other uses by Dr. Watson:
VA partners with IBM to use supercomputer Watson to treat cancer
MD Anderson Cancer Center to Use IBM Watson
IBM Watson, Boston Children’s team on rare pediatric diseases
Multiple Sclerosis @Point of Care app with Dr. Watson
Fighting Diabetes with Watson: Medtronic & IBM Watson Health
IBM Watson Makes a Treatment Plan for Brain-Cancer Patient in 10 Minutes; Doctors Take 160 Hours
The latest here:
Application of Machine Learning to Automated Analysis of Cerebral Edema in Large Cohorts of Ischemic Stroke Patients
- Division of Neurocritical Care, Department of Neurology, Washington University in St. Louis, St. Louis, MO, United States
- Division of Cerebrovascular Diseases, Department of Neurology, Washington University in St. Louis, St. Louis, MO, United States
- Department of Radiology, Washington University in St. Louis, St. Louis, MO, United States
Introduction
Over 10 million persons suffer a stroke each year worldwide (1).
Most of these patients have at least one brain imaging study performed
during their acute hospitalization, primarily for diagnostic purposes on
presentation (2).
Follow-up scans are often obtained to evaluate the size of infarction
and the degree of cerebral edema, as well as to exclude the development of
hemorrhagic transformation (3).
Computed tomography (CT) is the most frequently employed modality for
acute stroke imaging due to its widespread availability, lower cost, and
greater speed of scanning, especially important in acutely unstable
patients where “time is brain” (4).
Although conventional CT does not have the ability of magnetic
resonance imaging (MRI) to detect hyper-acute stroke, its ability to
track progression of infarction and edema after stroke is comparable
while affording greater temporal resolution with serial imaging (5).
This practice means that there is a massive global imaging dataset of
stroke patients with information on stroke location, infarct size,
development of edema, and hemorrhagic transformation. While these
parameters can be assessed by human raters, such evaluation is not
scalable when leveraging imaging data from thousands of patients.
Cerebral edema develops around regions of brain
infarction within the first week after stroke. This pathologic increase
in brain water and hemispheric volume can lead to mass effect and is the
major cause of death and neurological worsening after stroke (6).
Development of edema is usually heralded by abrupt mental status
worsening 2 days or more after admission, when herniation and midline
shift have already developed (7).
However, this process actually begins in the first hours after stroke
and evolves continually and progressively over the first few days. At
first, decreases in the blood and cerebrospinal fluid (CSF) compartments
within the cranial vault compensate for this increase in brain
volume. However, once this compensatory reserve has been exhausted,
decompensation and rapid worsening follow. Current measures of edema such as midline
shift (MLS) or neurological deterioration capture only this
decompensated state and not the critical early stages of edema before
worsening. Further, assessing edema using only MLS neglects the full
spectrum of edema, including patients with increased brain volume who
never develop MLS. Lesion volume can either be measured on MRI (not
feasible in all stroke patients) or estimated from CT; however,
hypodensity on CT may be subtle early on and represents a variable
combination of infarct plus edema. It is only the latter component that
contributes to swelling and risk of herniation, and so lesion volume
(even on MRI) only partially predicts risk of herniation (8).
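To see why compensation matters, here is a toy calculation of the argument above. The numbers and the helper function are invented for illustration and are not taken from the study: added edema volume is first offset by displacing CSF and venous blood from the fixed cranial vault, and only the excess beyond that reserve produces mass effect such as midline shift.

```python
# Toy illustration only (values are assumptions, not study data): swelling is
# absorbed by the displaceable CSF/blood buffer until that buffer is exhausted.

def uncompensated_volume(edema_ml: float, csf_reserve_ml: float,
                         blood_reserve_ml: float) -> float:
    """Volume (ml) of swelling left over once the displaceable buffer is used up."""
    return max(0.0, edema_ml - (csf_reserve_ml + blood_reserve_ml))

print(uncompensated_volume(60, 60, 15))   # 0.0  -> still compensated, no shift yet
print(uncompensated_volume(120, 60, 15))  # 45.0 -> reserve exhausted; mass effect
```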
We have proposed a sensitive quantitative metric of
edema severity that can be extracted from CT imaging at variable time
points after stroke (9).
This leverages the reciprocal biologic relationship between increase in
brain volume due to swelling and proportional decrease in CSF volume as
compensation. CSF is pushed out of hemispheric sulci, cerebral
ventricles, and the basal cisterns as edema develops in the hours and
days after stroke. The reduction in CSF volume precedes the development
of midline shift and clinical worsening due to edema. We demonstrated
that the volume of CSF displaced up to the time of maximal edema closely
correlated with extent of midline shift.
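In code, the displaced-CSF metric amounts to a percentage reduction between two measured volumes. A minimal sketch follows; the function name and the example volumes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the displaced-CSF calculation described above; volumes are
# assumed to be in milliliters, and the example numbers are illustrative only.

def csf_displacement_pct(baseline_ml: float, followup_ml: float) -> float:
    """Percent of baseline CSF volume displaced by the time of the follow-up scan."""
    if baseline_ml <= 0:
        raise ValueError("baseline CSF volume must be positive")
    return 100.0 * (baseline_ml - followup_ml) / baseline_ml

# e.g., 150 ml of CSF at presentation falling to 90 ml at peak edema -> 40% displaced
print(csf_displacement_pct(150.0, 90.0))  # 40.0
```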
We have also developed an automated algorithm to segment CSF from CT scans of stroke patients (10).
This critical step employed random forest-based machine learning (ML)
trained on manually delineated scans. Features integrated into the ML
platform include Haar-like patterns of pixels. This supervised learning
approach was able to rapidly and reliably measure CSF volume on serial
CT scans from two sites in our preliminary testing, performing
significantly better than simple threshold-based models for CSF
segmentation which were confounded by density of infarction mimicking
CSF. Correlations of automated CSF volumes to ground-truth values
exceeded 0.95, with volumes that closely approximated actual CSF values
after active contour refinement. This automated approach facilitates the
translation of this metric to studies evaluating edema in large numbers
of stroke patients. Exploring the variability in quantifiable edema
severity between patients will not only unlock opportunities for precise
prediction of malignant edema at earlier time points but also provide
a foundation for understanding the genetic basis of cerebral edema. Such
studies require thousands of stroke patients with serial imaging to
undergo CSF-based edema measurement. We now present a proof-of-principle
application of a processing algorithm capable of handling large
datasets of CT scans and extracting CSF volumes for such analyses.
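To make the pipeline concrete, here is a heavily simplified sketch of the general approach described above: Haar-like box features feeding a random-forest voxel classifier, with predicted CSF summed per scan. This is not the authors' published pipeline; the feature set, hyperparameters, and the assumption that slices arrive as NumPy arrays in Hounsfield units are mine, and the active-contour refinement step is omitted.

```python
# Simplified sketch (not the published pipeline): random-forest classification
# of CSF voxels from Haar-like box features, then CSF volume per scan.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestClassifier

def haar_like_features(slice_hu: np.ndarray, sizes=(3, 7, 15)) -> np.ndarray:
    """Per-pixel features: raw HU plus multi-scale box means and
    horizontal/vertical box differences (a simple Haar-like pattern)."""
    img = slice_hu.astype(float)
    feats = [img]
    for s in sizes:
        mean = uniform_filter(img, size=s)
        feats.append(mean)
        feats.append(np.roll(mean, s, axis=1) - np.roll(mean, -s, axis=1))
        feats.append(np.roll(mean, s, axis=0) - np.roll(mean, -s, axis=0))
    return np.stack(feats, axis=-1)  # shape (H, W, n_features)

def train_csf_classifier(slices, masks) -> RandomForestClassifier:
    """slices: 2D CT slices in HU; masks: matching manually delineated CSF masks."""
    feats = [haar_like_features(s) for s in slices]
    X = np.vstack([f.reshape(-1, f.shape[-1]) for f in feats])
    y = np.concatenate([m.ravel().astype(int) for m in masks])
    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
    clf.fit(X, y)
    return clf

def csf_volume_ml(clf, scan_slices, voxel_volume_ml: float) -> float:
    """Sum predicted CSF voxels across one scan and convert to milliliters."""
    n_csf = 0
    for s in scan_slices:
        f = haar_like_features(s)
        n_csf += int(clf.predict(f.reshape(-1, f.shape[-1])).sum())
    return n_csf * voxel_volume_ml
```

Running something like `csf_volume_ml` over the baseline and follow-up scans of each patient would then feed directly into the displaced-CSF metric sketched earlier.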