Changing stroke rehab and research worldwide now. Time is Brain! Trillions and trillions of neurons DIE each day because there are NO effective hyperacute therapies besides tPA (only 12% effective). I have 523 posts on hyperacute therapy, enough for researchers to spend decades proving them out. These are my personal ideas and blog on stroke rehabilitation and stroke research. Do not attempt any of these without checking with your medical provider. Unless you join me in agitating, when you need these therapies they won't be there.

What this blog is for:

My blog is not to help survivors recover; it is to have the 10 million yearly stroke survivors light fires underneath their doctors, stroke hospitals, and stroke researchers to get stroke solved. 100% recovery. The stroke medical world is completely failing at that goal; they don't even have it as a goal. Shortly after getting out of the hospital and getting NO information on the process or protocols of stroke rehabilitation and recovery, I started searching on the internet and found that no other survivor received useful information. This is an attempt to cover all the stroke rehabilitation information that should be readily available to survivors so they can talk with informed knowledge to their medical staff. It lays out what needs to be done to get stroke survivors closer to 100% recovery. It's quite disgusting that this information is not available from every stroke association and doctors group.

Wednesday, August 29, 2018

Application of Machine Learning to Automated Analysis of Cerebral Edema in Large Cohorts of Ischemic Stroke Patients

I would rather have Dr. Watson analyze this big data; s/he has already been doing this for other medical applications. If your stroke hospital can't see that this could be much better than all of their stroke doctors combined, they need to stop being a stroke hospital.

Other uses by Dr. Watson:

VA partners with IBM to use supercomputer Watson to treat cancer

MD Anderson Cancer Center to Use IBM Watson

IBM Watson, Boston Children’s team on rare pediatric diseases

Multiple Sclerosis @Point of Care app with Dr. Watson

Fighting Diabetes with Watson: Medtronic & IBM Watson Health

IBM Watson Makes a Treatment Plan for Brain-Cancer Patient in 10 Minutes; Doctors Take 160 Hours

The latest here:

Application of Machine Learning to Automated Analysis of Cerebral Edema in Large Cohorts of Ischemic Stroke Patients

Rajat Dhar 1*, Yasheng Chen 2†, Hongyu An 3 and Jin-Moo Lee 2
  • 1 Division of Neurocritical Care, Department of Neurology, Washington University in St. Louis, St. Louis, MO, United States
  • 2 Division of Cerebrovascular Diseases, Department of Neurology, Washington University in St. Louis, St. Louis, MO, United States
  • 3 Department of Radiology, Washington University in St. Louis, St. Louis, MO, United States
Cerebral edema contributes to neurological deterioration and death after hemispheric stroke but there remains no effective means of preventing or accurately predicting its occurrence. Big data approaches may provide insights into the biologic variability and genetic contributions to severity and time course of cerebral edema. These methods require quantitative analyses of edema severity across large cohorts of stroke patients. We have proposed that changes in cerebrospinal fluid (CSF) volume over time may represent a sensitive and dynamic marker of edema progression that can be measured from routinely available CT scans. To facilitate and scale up such approaches we have created a machine learning algorithm capable of segmenting and measuring CSF volume from serial CT scans of stroke patients. We now present results of our preliminary processing pipeline that was able to efficiently extract CSF volumetrics from an initial cohort of 155 subjects enrolled in a prospective longitudinal stroke study. We demonstrate a high degree of reproducibility in total cranial volume registration between scans (R = 0.982) as well as a strong correlation of baseline CSF volume and patient age (as a surrogate of brain atrophy, R = 0.725). Reduction in CSF volume from baseline to final CT was correlated with infarct volume (R = 0.715) and degree of midline shift (quadratic model, p < 2.2 × 10−16). We utilized generalized estimating equations (GEE) to model CSF volumes over time (using linear and quadratic terms), adjusting for age. This model demonstrated that CSF volume decreases over time (p < 2.2 × 10−13) and is lower in those with cerebral edema (p = 0.0004). We are now fully automating this pipeline to allow rapid analysis of even larger cohorts of stroke patients from multiple sites using an XNAT (eXtensible Neuroimaging Archive Toolkit) platform. Data on kinetics of edema across thousands of patients will facilitate precision approaches to prediction of malignant edema as well as modeling of variability and further understanding of genetic variants that influence edema severity.
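
For readers who want a concrete sense of the statistics in the abstract, here is a minimal sketch of a GEE model of CSF volume over time (linear plus quadratic time terms, adjusted for age) using Python's statsmodels. The variable names and the synthetic data-generating step are my own illustrative placeholders, not the authors' actual dataset or code.

```python
# Sketch: generalized estimating equations (GEE) for serial CSF volumes,
# with linear + quadratic time terms and adjustment for age, as the abstract describes.
# All column names and the toy data below are illustrative, not the authors' variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for sid in range(50):                             # 50 hypothetical subjects
    age = rng.uniform(45, 90)
    edema = int(rng.integers(0, 2))               # 1 = developed cerebral edema
    baseline_csf = 120 + 1.5 * (age - 65) + rng.normal(0, 10)
    for hours in np.linspace(0, 96, 4):           # 4 serial CT scans over ~4 days
        # Toy model: CSF shrinks over time, more steeply in edema patients.
        csf = baseline_csf - (0.2 + 0.3 * edema) * hours + 0.001 * hours**2 + rng.normal(0, 5)
        rows.append(dict(subject_id=sid, hours=hours, age=age, edema=edema, csf_volume=csf))
df = pd.DataFrame(rows)

model = smf.gee(
    "csf_volume ~ hours + I(hours ** 2) + age + edema",
    groups="subject_id",                          # repeated scans clustered by patient
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),      # within-subject correlation structure
    family=sm.families.Gaussian(),
)
print(model.fit().summary())
```

With real data, the coefficients on the time terms and the edema indicator would correspond to the decline in CSF volume over time and the lower CSF volumes in edema patients reported above.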

Introduction

Over 10 million persons suffer a stroke each year worldwide (1). Most of these patients have at least one brain imaging study performed during their acute hospitalization, primarily for diagnostic purposes on presentation (2). Follow-up scans are often obtained to evaluate the size of infarction and degree of cerebral edema, as well as to exclude the development of hemorrhagic transformation (3). Computed tomography (CT) is the most frequently employed modality for acute stroke imaging due to its widespread availability, lower cost, and greater speed of scanning, especially important in acutely unstable patients where “time is brain” (4). Although conventional CT does not have the ability of magnetic resonance imaging (MRI) to detect hyper-acute stroke, its ability to track progression of infarction and edema after stroke is comparable while affording greater temporal resolution with serial imaging (5). This practice means that there is a massive global imaging dataset of stroke patients with information on stroke location, infarct size, development of edema, and hemorrhagic transformation. While these parameters can be assessed by human raters, such evaluation is not scalable when leveraging imaging data from thousands of patients.
Cerebral edema develops around regions of brain infarction within the first week after stroke. This pathologic increase in brain water and hemispheric volume can lead to mass effect and is the major cause of death and neurological worsening after stroke (6). Development of edema is usually heralded by abrupt mental status worsening 2 days or more after admission, when herniation and midline shift have already developed (7). However, this process actually begins in the first hours after stroke and evolves continually and progressively over the first few days. At first, decreases in the blood and cerebrospinal fluid (CSF) compartments within the cranium compensate for this increase in brain volume. However, once this reserve has been exhausted, decompensation with worsening rapidly follows. Current measures of edema such as midline shift (MLS) or neurological deterioration capture only this decompensated state and not the critical early stages of edema before worsening. Further, assessing edema using only MLS neglects the full spectrum of edema, including patients with increased brain volume who never develop MLS. Lesion volume either requires MRI to measure (not feasible in all stroke patients) or can be estimated using CT; however, hypodensity on CT may be subtle early on and represents a variable combination of infarct plus edema. It is only the latter component that contributes to swelling and risk of herniation, so lesion volume (even on MRI) only partially predicts risk of herniation (8).
We have proposed a sensitive quantitative metric of edema severity that can be extracted from CT imaging at variable time points after stroke (9). This leverages the reciprocal biologic relationship between the increase in brain volume due to swelling and the proportional, compensatory decrease in CSF volume. CSF is pushed out of the hemispheric sulci, cerebral ventricles, and basal cisterns as edema develops in the hours and days after stroke. The reduction in CSF volume precedes the development of midline shift and clinical worsening due to edema. We demonstrated that the volume of CSF displaced up to the time of maximal edema closely correlated with the extent of midline shift.
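
As a rough illustration of the displaced-CSF idea, the sketch below computes CSF volume from a binary segmentation mask and the percent reduction between a baseline and a follow-up CT. The masks, voxel spacing, and synthetic example data are assumed inputs of my own; this is just the arithmetic behind the metric, not the authors' pipeline.

```python
# Sketch: the CSF-displacement metric in code form.
# Given binary CSF masks from a baseline and a follow-up CT (assumed to already be
# segmented and registered), compute each scan's CSF volume and the percent reduction.
import numpy as np

def csf_volume_ml(mask: np.ndarray, voxel_dims_mm: tuple) -> float:
    """Volume of a binary CSF mask in milliliters (1 mL = 1000 mm^3)."""
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    return float(mask.sum()) * voxel_mm3 / 1000.0

def percent_csf_reduction(baseline_mask, followup_mask, voxel_dims_mm) -> float:
    """Percent of baseline CSF volume displaced by the time of the follow-up scan."""
    v0 = csf_volume_ml(baseline_mask, voxel_dims_mm)
    v1 = csf_volume_ml(followup_mask, voxel_dims_mm)
    return 100.0 * (v0 - v1) / v0

# Example with synthetic masks (real masks would come from the segmentation step).
rng = np.random.default_rng(0)
baseline = rng.random((64, 64, 32)) < 0.10   # ~10% of voxels labeled CSF
followup = rng.random((64, 64, 32)) < 0.07   # CSF partially effaced by edema
print(f"CSF reduction: {percent_csf_reduction(baseline, followup, (0.5, 0.5, 5.0)):.1f}%")
```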
We have also developed an automated algorithm to segment CSF from CT scans of stroke patients (10). This critical step employed random forest-based machine learning (ML) trained on manually delineated scans. Features integrated into the ML platform include Haar-like patterns of pixels. This supervised learning approach was able to rapidly and reliably measure CSF volume on serial CT scans from two sites in our preliminary testing, performing significantly better than simple threshold-based models for CSF segmentation, which were confounded by the density of infarction mimicking CSF. Correlations of automated CSF volumes with ground-truth values exceeded 0.95, with volumes that closely approximated actual CSF values after active contour refinement. This automated approach facilitates the translation of this metric to studies evaluating edema in large numbers of stroke patients. Exploring the variability in quantifiable edema severity between patients will not only unlock opportunities for precise prediction of malignant edema at earlier time points but also provide a foundation for understanding the genetic basis of cerebral edema. Such studies require thousands of stroke patients with serial imaging to undergo CSF-based edema measurement. We now present a proof-of-principle application of a processing algorithm capable of handling large datasets of CT scans and extracting CSF volumes for such analyses.
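
To make the segmentation step more concrete, here is a minimal sketch of per-voxel CSF classification with a random forest in scikit-learn. The "Haar-like" features are simplified here to box-filter contrasts at two scales, and the toy volumes stand in for manually delineated CT scans; the authors' actual feature set and active contour refinement are not reproduced.

```python
# Sketch: per-voxel CSF segmentation with a random forest, in the spirit of the
# supervised approach described above. Features and training data are simplified
# stand-ins, not the authors' implementation.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestClassifier

def voxel_features(ct_hu: np.ndarray) -> np.ndarray:
    """Per-voxel features: raw intensity plus coarse box-filter contrasts."""
    small = uniform_filter(ct_hu, size=3)
    large = uniform_filter(ct_hu, size=9)
    feats = np.stack([ct_hu, small, large, small - large], axis=-1)
    return feats.reshape(-1, feats.shape[-1])

def make_toy_scan(rng):
    """Toy CT volume (Hounsfield-like units) with a known low-density 'CSF' mask."""
    ct = rng.normal(30.0, 8.0, size=(48, 48, 12))                  # parenchyma ~30 HU
    csf_mask = rng.random(ct.shape) < 0.10                         # ~10% of voxels are CSF
    ct[csf_mask] = rng.normal(5.0, 3.0, size=int(csf_mask.sum()))  # CSF ~0-10 HU
    return ct, csf_mask

rng = np.random.default_rng(7)
train_ct, train_mask = make_toy_scan(rng)   # stands in for a manually delineated scan
test_ct, test_mask = make_toy_scan(rng)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
clf.fit(voxel_features(train_ct), train_mask.reshape(-1).astype(int))

predicted = clf.predict(voxel_features(test_ct)).reshape(test_ct.shape).astype(bool)
dice = 2 * (predicted & test_mask).sum() / (predicted.sum() + test_mask.sum())
print(f"Toy Dice overlap: {dice:.2f}")
```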
