Changing stroke rehab and research worldwide now. Time is Brain! Trillions and trillions of neurons DIE each day because there are NO effective hyperacute therapies besides tPA (only 12% effective). I have 523 posts on hyperacute therapy, enough for researchers to spend decades proving them out. These are my personal ideas and blog on stroke rehabilitation and stroke research. Do not attempt any of these without checking with your medical provider. Unless you join me in agitating, when you need these therapies they won't be there.

What this blog is for:

My blog is not to help survivors recover; it is to have the 10 million yearly stroke survivors light fires underneath their doctors, stroke hospitals and stroke researchers to get stroke solved. 100% recovery. The stroke medical world is completely failing at that goal; they don't even have it as a goal. Shortly after getting out of the hospital with NO information on the process or protocols of stroke rehabilitation and recovery, I started searching the internet and found that no other survivor received useful information either. This is an attempt to cover all the stroke rehabilitation information that should be readily available to survivors so they can talk to their medical staff with informed knowledge. It lays out what needs to be done to get stroke survivors closer to 100% recovery. It's quite disgusting that this information is not available from every stroke association and doctors' group.

Wednesday, February 11, 2026

A Smartphone App Could Change How Stroke Recovery is Measured

Are you that fucking stupid? Ask the survivor: 'Are you fully recovered?' Then you work on survivor needs: NOT WHAT YOU THINK THE SURVIVOR CAN DO, WHAT THEY WANT!

'Measurements' don't get you recovered, only EXACT REHAB PROTOCOLS DO!

A Smartphone App Could Change How Stroke Recovery is Measured

In assessing stroke patient rehabilitation, clinicians have largely relied on their eyes to measure how well a stroke patient can move, grasp and let go. But what if a pocket-sized tool could capture every motion, quantify progress and provide new insight — all in the time it takes to open an app?

That’s the idea behind the Clinical Motor Recovery assessment tool, or C-MoRe, a new smartphone app developed by two University of California, Davis, master’s students in computer science, Ziqiang “Joe” Zhu and Jun Min Kim. By harnessing the technology of a smartphone’s camera and the power of machine learning, C-MoRe makes it possible to measure rehabilitation progress with unprecedented accuracy and speed, potentially transforming stroke care for patients and the clinicians who provide it.

A Pocket-Sized Tool with Big Potential

In collaboration with UC Irvine’s Department of Mechanical and Aerospace Engineering in the Samueli School of Engineering, C-MoRe has been applied to video footage of seven stroke patients performing the Box and Blocks Test, a standardized dexterity test in which patients must move blocks from one side of a partitioned box to the other.

In a recent paper available on IEEE Xplore, the researchers detail how C-MoRe successfully detected block transfers with 100% accuracy compared to human raters. It also quantified various limb functions, including grasp and transfer duration and movement amplitude and velocity, collecting data that can help a physician more accurately assess a stroke patient's recovery and personalize their rehabilitation strategy, something that is near-impossible with a strictly human assessment.
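As a rough illustration of how metrics like grasp/transfer duration, amplitude and velocity fall out of tracked hand positions, here is a minimal sketch. This is not the C-MoRe code: the `Transfer` structure, the field names, the 2D centroid representation and the units are all assumptions for illustration.

```python
# Illustrative sketch (not C-MoRe's implementation): derive per-transfer
# timing and movement metrics from timestamped 2D hand positions.
from dataclasses import dataclass

@dataclass
class Transfer:
    grasp_start: float   # seconds: hand closes on the block
    release: float       # seconds: block dropped on the far side
    start_xy: tuple      # hand position at grasp (hypothetical units)
    end_xy: tuple        # hand position at release

def transfer_metrics(t: Transfer) -> dict:
    duration = t.release - t.grasp_start
    dx = t.end_xy[0] - t.start_xy[0]
    dy = t.end_xy[1] - t.start_xy[1]
    amplitude = (dx * dx + dy * dy) ** 0.5   # straight-line distance moved
    return {
        "duration_s": duration,
        "amplitude": amplitude,
        "mean_velocity": amplitude / duration if duration > 0 else 0.0,
    }

demo = Transfer(grasp_start=1.2, release=2.0,
                start_xy=(0.0, 0.0), end_xy=(0.3, 0.1))
print(transfer_metrics(demo))
```

Aggregating these per-transfer numbers over a full Box and Blocks session is the kind of data a human rater, who only records the final block count, cannot produce.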

“C-MoRe is two things. One is, let's make it easier for the clinicians who are actually administering this test by automating parts of it that they can then review,” Zhu said. 

“Additionally, because we're recording everything with a camera and all the movements are precisely annotated, we're able to get data that human eyes can't.”

With its availability (anyone with a smartphone can access it) and ability to assess more details than the human eye, a tool like C-MoRe could provide new, important insights into stroke patient recovery and create a pathway to more personalized care.  

An Idea Takes Shape

C-MoRe’s origins begin with Andria Farrens, who graduated from UC Davis with a Bachelor of Science degree in mechanical engineering and a minor in biomedical engineering. 

While Farrens was conducting postdoctoral research at the UC Irvine Samueli School of Engineering’s Department of Mechanical and Aerospace Engineering, she led a randomized controlled trial on robotic hand rehabilitation for stroke survivors. One day, a UC Irvine Health physical therapist on the research team began using a smartphone camera to record patients performing the Box and Blocks Test to show them their progress. 

At the same time, Farrens was working on a side project using Google’s MediaPipe Hand Landmarker open-source motion capture software, which lets users identify key parts of the hand and overlay visual effects. It was a lightbulb moment.

“We started to talk about, ‘Well, we can quantify that, and maybe not only have better information for the patient, but also a better understanding of what we actually improved in terms of hand function,’” said Farrens, who is now the manager of research programs at the Children’s Hospital of Orange County, a UC Irvine partner, and the C-MoRe project lead.

The idea for an app was forming, but Farrens didn’t feel she had the expertise to develop it herself. So, she turned to a familial UC Davis connection: her dad, Matthew Farrens, a professor of computer science. He connected her with Zhu and Kim, who were undergraduate computer science researchers at the time.

Teaching a Computer to See

In the past two years, Zhu and Kim have helped turn Andria Farrens’ idea into reality. They started by customizing MediaPipe Hand Landmarker settings to detect the hands of stroke patients in videos of them performing the Box and Blocks Test. They then developed a machine learning algorithm to detect the different states of the moving hand during the test.
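The per-frame state detection might be sketched as a simple classifier over two hypothetical features: whether the hand is closed and whether it is past the barrier. In the actual pipeline these features would be derived from MediaPipe landmark geometry, and both the features and the state labels here are illustrative assumptions, not the algorithm the team trained.

```python
# Hypothetical sketch of classifying hand state per video frame.
# In a real pipeline, (hand_closed, past_barrier) would be computed
# from MediaPipe hand landmarks; here they are supplied directly.

def hand_states(frames):
    """frames: list of (hand_closed: bool, past_barrier: bool) per frame.
    Returns one label per frame: 'reach', 'grasp/carry', or 'release'."""
    labels = []
    for closed, past in frames:
        if closed:
            labels.append("grasp/carry")   # hand is holding a block
        elif past:
            labels.append("release")       # open hand on the destination side
        else:
            labels.append("reach")         # open hand on the origin side
    return labels

print(hand_states([(False, False), (True, False), (True, True), (False, True)]))
# → ['reach', 'grasp/carry', 'grasp/carry', 'release']
```

Runs of identical labels then segment the video into phases, from which durations like those above can be read off.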

One of the more challenging aspects of training a computer vision model on the Box and Blocks Test that Zhu and Kim encountered was detecting when a person released a block on the correct side. If a human sees someone release a block from their hand, the brain registers that the block was dropped. Computer vision models aren’t that intuitive.

Kim, left, and Zhu work on teaching a computer vision model to recognize and analyze certain hand motions. (Elena Troncoso/UC Davis)
C-MoRe tracks when a block is moved over the barrier and dropped on the other side. (Elena Troncoso/UC Davis)

“In our lab meetings, we would get together and really decompose what it means for us to understand that a block has dropped,” Kim said. “We had to really dumb it down to basic core principles and then train vision models to comprehend those principles.”

Zhu and Kim got around this by programming two proxies: one that examines the distance between the block and the hand, and another that detects changes on the destination side of the box, which can typically be attributed to a block falling from the hand.
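The two proxies can be sketched in a few lines. The thresholds, the flat pixel-list frame representation, and combining the proxies with a logical AND are all assumptions for illustration; the article does not state C-MoRe's actual values or combination rule.

```python
# Illustrative sketch of the two release proxies (all thresholds assumed).

def released_by_distance(hand_xy, block_xy, threshold=0.05):
    """Proxy 1: the block is likely out of the hand once the
    hand-to-block distance exceeds a small threshold."""
    dx = hand_xy[0] - block_xy[0]
    dy = hand_xy[1] - block_xy[1]
    return (dx * dx + dy * dy) ** 0.5 > threshold

def destination_changed(prev_pixels, cur_pixels, pixel_delta=10, min_pixels=5):
    """Proxy 2: a release is likely when enough pixels on the destination
    side of the box change between consecutive frames (a block landing)."""
    changed = sum(
        1 for p, c in zip(prev_pixels, cur_pixels) if abs(p - c) > pixel_delta
    )
    return changed >= min_pixels

def block_released(hand_xy, block_xy, prev_pixels, cur_pixels):
    # Assumed combination rule: count a transfer only when both proxies agree.
    return released_by_distance(hand_xy, block_xy) and destination_changed(
        prev_pixels, cur_pixels
    )
```

Requiring both signals makes the sketch robust to either proxy firing spuriously, e.g. a hand briefly opening without a block, or lighting flicker on the destination side.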

Toward Precision Rehabilitation

One area Farrens hopes to make an impact with the data the app gathers is in assessing proprioception in stroke patients. Proprioception is the body’s ability to sense where it is in space without looking. It allows people to touch their nose with their eyes closed or walk without watching their feet.

The app will allow physicians to measure rehabilitation progress with unprecedented accuracy and speed. (Elena Troncoso/UC Davis)

In using this app, Farrens noticed that, when assessing the grasping phase, she can better identify individuals with proprioception deficits who may be candidates for more specialized training focused on that deficit, rather than just repetitive movement training.

“The app fits into this idea of precision rehabilitation and being able to identify factors that mediate recovery and then potentially evaluate better and different types of training that then cause recovery,” she said.

In addition to C-MoRe’s rapid clinical assessment (Zhu and Kim are working on implementing immediate analysis), Farrens, Zhu and Kim also aim for the app to help build a larger data set on stroke patient recovery and motor function. The dataset will be made open-source so other clinicians and researchers can use it for modeling and recovery prediction in similar clinical settings.

From Pilot to Practice

The app is currently in the pilot stage. Farrens will soon begin working with it more in a clinical setting, and as they gather more data, the team plans to publish their findings and make the app widely available. They hope C-MoRe proves to be a valuable tool in stroke patient rehabilitation and can offer new information that can help with recovery.  

“We’re not looking to replace physicians; we’re looking to make their lives better,” Zhu said. “We’re quantifying the movements and looking at all of the hand movements and kinematics, which are just not possible with human eyes. By doing it this way, we might open up a new way to look at how stroke affects people.”
