Changing stroke rehab and research worldwide now. Time is Brain! Trillions and trillions of neurons DIE each day because there are NO effective hyperacute therapies besides tPA (only 12% effective). I have 523 posts on hyperacute therapy, enough for researchers to spend decades proving them out. These are my personal ideas and blog on stroke rehabilitation and stroke research. Do not attempt any of these without checking with your medical provider. Unless you join me in agitating, when you need these therapies they won't be there.

What this blog is for:

My blog is not meant to help survivors recover; it is meant to have the 10 million yearly stroke survivors light fires underneath their doctors, stroke hospitals and stroke researchers to get stroke solved: 100% recovery. The stroke medical world is completely failing at that goal; they don't even have it as a goal. Shortly after getting out of the hospital, having received NO information on the process or protocols of stroke rehabilitation and recovery, I started searching on the internet and found that no other survivor received useful information either. This blog is an attempt to cover all the stroke rehabilitation information that should be readily available to survivors, so they can talk to their medical staff with informed knowledge. It lays out what needs to be done to get stroke survivors closer to 100% recovery. It's quite disgusting that this information is not available from every stroke association and doctors' group.

Saturday, May 30, 2020

Geometric algorithms for predicting resilience and recovering damage in neural networks

No clue.

Geometric algorithms for predicting resilience and recovering damage in neural networks

Guruprasad Raghavan, Caltech, Pasadena, CA 91106, graghava@caltech.edu
Jiayi Li, UCLA, Los Angeles, CA 90095, jiayi.li@g.ucla.edu
Matt Thomson, Caltech, Pasadena, CA 91106, mthomson@caltech.edu

Abstract

Biological neural networks have evolved to maintain performance despite significant circuit damage. To survive damage, biological network architectures both have intrinsic resilience to component loss and activate recovery programs that adjust network weights through plasticity to stabilize performance. Despite the importance of resilience in technology applications, the resilience of artificial neural networks is poorly understood, and autonomous recovery algorithms have yet to be developed. In this paper, we establish a mathematical framework to analyze the resilience of artificial neural networks through the lens of differential geometry. Our geometric language provides natural algorithms that identify local vulnerabilities in trained networks as well as recovery algorithms that dynamically adjust networks to compensate for damage. We reveal striking vulnerabilities in commonly used image analysis networks, like MLPs and CNNs trained on MNIST and CIFAR10, respectively. We also uncover high-performance recovery paths that enable the same networks to dynamically re-adjust their parameters to compensate for damage. Broadly, our work provides procedures that endow artificial systems with resilience and rapid recovery routines to enhance their integration with IoT devices as well as enable their deployment for critical applications.
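For a concrete picture of what "identifying local vulnerabilities" can mean in practice, here is a minimal, hypothetical sketch (PyTorch assumed; not the authors' code). It ranks the weights of a toy MLP by how much zeroing each one increases the loss on a held-out batch. Brute-force ablation like this is only a crude stand-in for the geometric analysis the abstract describes, and the model, data, and parameter name below are placeholders.

```python
# Hypothetical sketch: brute-force "vulnerability" scores for one weight matrix.
import torch
import torch.nn as nn

def ablation_vulnerability(model, loss_fn, x, y, param_name):
    """Loss increase caused by zeroing each entry of the named parameter, one at a time."""
    weight = dict(model.named_parameters())[param_name]
    scores = torch.zeros_like(weight)
    with torch.no_grad():
        base_loss = loss_fn(model(x), y).item()
        flat = weight.view(-1)
        for i in range(flat.numel()):
            saved = flat[i].item()
            flat[i] = 0.0                                   # damage one connection
            scores.view(-1)[i] = loss_fn(model(x), y).item() - base_loss
            flat[i] = saved                                 # restore it
    return scores

# Usage on an untrained toy MLP (a trained MNIST model would be used in practice).
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
x, y = torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))
scores = ablation_vulnerability(model, nn.CrossEntropyLoss(), x, y, "3.weight")
print(scores.abs().view(-1).topk(5).indices)                # most vulnerable weights
```

Weights whose removal barely moves the loss are the "resilient" ones; the few that cause a large jump are the local vulnerabilities this kind of analysis is meant to surface.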
1 Introduction
Brains are remarkable machines whose computational capabilities have inspired many breakthroughs in machine learning [1, 2, 3, 4]. However, the resilience of the brain, its ability to maintain computational capabilities in harsh conditions and following circuit damage, remains poorly developed in current artificial intelligence paradigms [5]. Biological neural networks are known to implement redundancy and other architectural features that allow circuits to maintain performance following loss of neurons or lesions to sub-circuits [6, 7, 8, 9, 10]. In addition to architectural resilience, biological neural networks across species execute recovery programs that allow circuits to repair themselves through the activation of network plasticity following damage [11, 12, 13]. For example, recovery algorithms re-establish olfactory and visual behaviors in mammals following sensory-specific cortical circuit lesions [14, 15]. Through resilience and recovery mechanisms, biological neural networks can function over the life of an animal, in difficult environments, and maintain performance following seemingly catastrophic injuries like the loss of the entire visual cortex or hippocampus [16, 17, 18, 19]. Like brains, artificial neural networks also face difficult operating conditions that can induce component damage at different scales. Hardware failures in modern compute clusters, due to the accumulation of errors in dynamic random-access memory (DRAM) devices that occur at surprising rates, could be a disaster [20] for networks being used for critical applications, such as (i) decision-making in the healthcare industry, (ii) self-driving cars, and (iii) robots deployed in extreme environments.
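The DRAM-error scenario can be made tangible with a simple fault-injection sketch (hypothetical, PyTorch plus NumPy; not from the paper): flip a few random bits in a toy model's stored float32 weights and see how much its outputs drift. The model and inputs are placeholders, and flips that hit an exponent bit can blow a weight up to huge or NaN values, which is exactly the kind of silent corruption that worries people deploying networks on unreliable hardware.

```python
# Hedged illustration only: simulate memory corruption by flipping random weight bits.
import random
import numpy as np
import torch
import torch.nn as nn

def flip_random_weight_bits(model, n_flips):
    """Flip n_flips random bits across the model's float32 parameters, in place."""
    params = list(model.parameters())
    for _ in range(n_flips):
        p = random.choice(params)
        bits = p.detach().numpy().reshape(-1).view(np.uint32)   # reinterpret raw float32 bits
        bits[random.randrange(bits.size)] ^= np.uint32(1 << random.randrange(32))
    return model

# Toy usage: an untrained MLP and random inputs stand in for a deployed, trained model.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
x = torch.randn(16, 1, 28, 28)
before = model(x).detach()
flip_random_weight_bits(model, n_flips=50)
after = model(x).detach()
print("max output change after 50 bit flips:", (after - before).abs().max().item())
```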

Further, the rising implementation of neural networks on physical hardware (like neuromorphic and edge devices) [21, 22], where networks can be disconnected from the internet and are under the control of an end user, necessitates damage-resilient and dynamically recovering artificial neural networks. Yet the resilience and recovery properties of biological neural networks are currently absent from the design of artificial neural networks. The resilience of living neural networks motivates theoretical and practical efforts to understand the resilience of artificial neural networks and to design new algorithms that reverse engineer resilience and recovery into artificial systems [23]. Recent studies [24, 25] have demonstrated that MLP and CNN architectures can be surprisingly robust to large-scale node deletion. However, little is known about what induces network robustness, how networks ultimately fail, or how to define recovery procedures that maintain network performance during damage. We propose a mathematical framework grounded in differential geometry to study the resilience and recovery of artificial neural networks. Globally, we formalize damage/response behavior as dynamic movement on a curved pseudo-Riemannian manifold. Our geometric language provides new procedures for identifying network vulnerabilities by predicting local perturbations that adversely impact the functional performance of the network, and for uncovering high-performance recovery paths that the network can traverse to maintain performance while it is being damaged. Our algorithms allow networks to maintain high performance during rounds of damage and repair through computationally efficient update algorithms that do not require conventional retraining. Broadly, our work provides procedures that will endow artificial systems with resilience and autonomous recovery policies to emulate the properties of biological neural networks and to enhance their deployment in critical technology applications.
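To make the damage-and-repair idea concrete, here is a minimal, hypothetical sketch (PyTorch; names and numbers are placeholders). After a batch of weights is zeroed and frozen, the surviving weights take a few masked gradient steps so the network compensates. This is plain masked fine-tuning, not the geodesic recovery-path construction the paper proposes; it only illustrates the general notion of adjusting the remaining parameters instead of conventional retraining from scratch.

```python
# Hypothetical recovery loop: freeze damaged (zeroed) entries, nudge the rest.
import torch
import torch.nn as nn

def recover(model, loss_fn, x, y, masks, steps=20, lr=1e-2):
    """Fine-tune only the undamaged entries of each parameter (mask == 1)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        with torch.no_grad():
            for p, m in zip(model.parameters(), masks):
                p.grad *= m                      # no updates flow into damaged entries
        opt.step()
    return model

model = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
x, y = torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,))

# Damage: zero 30% of the first layer's weights and freeze them via a mask.
masks = [torch.ones_like(p) for p in model.parameters()]
with torch.no_grad():
    w = model[1].weight
    dead = torch.rand_like(w) < 0.3
    w[dead] = 0.0
    masks[0][dead] = 0.0                         # masks follow parameters() order

recover(model, nn.CrossEntropyLoss(), x, y, masks)
```

In the paper's framing, the interesting part is choosing those compensating updates along low-loss directions of the parameter manifold rather than by generic gradient descent, which is what keeps performance high during repeated rounds of damage and repair.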
