Structural plasticity on an accelerated analog neuromorphic hardware system.
2020
In computational neuroscience, as well as in machine learning, neuromorphic devices promise an accelerated and scalable alternative to neural network simulations. Their neural connectivity and synaptic capacity depend on their specific design choices, but are always intrinsically limited. Here, we present a strategy to achieve structural plasticity that optimizes resource allocation under these constraints by constantly rewiring the pre- and postsynaptic partners while keeping the neuronal fan-in constant and the connectome sparse. In particular, we implemented this algorithm on the analog neuromorphic system BrainScaleS-2. It was executed on a custom embedded digital processor located on chip, accompanying the mixed-signal substrate of spiking neurons and synapse circuits. We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology with respect to the nature of its training data, as well as the overall computational efficiency of the approach.
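To make the rewiring idea from the abstract concrete (replace weak synapses with new random presynaptic partners while keeping each neuron's fan-in fixed and the connectome sparse), the following Python sketch is a simplified, hypothetical illustration. All names, thresholds, and the re-initialisation scheme are assumptions; it does not reflect the authors' on-chip implementation or the BrainScaleS-2 software interface.

```python
import numpy as np

# Hypothetical sketch of a fan-in-preserving structural plasticity step.
# Not the paper's BrainScaleS-2 implementation; all constants are assumed.

rng = np.random.default_rng(seed=42)

N_PRE = 256      # number of presynaptic sources
N_POST = 32      # number of postsynaptic neurons
FAN_IN = 16      # fixed number of synapses per postsynaptic neuron
W_PRUNE = 0.05   # synapses with weights below this become rewiring candidates

# Each postsynaptic neuron holds FAN_IN synapses: a presynaptic index and a weight.
pre_idx = np.stack([rng.choice(N_PRE, size=FAN_IN, replace=False) for _ in range(N_POST)])
weights = rng.uniform(0.0, 1.0, size=(N_POST, FAN_IN))

def rewire_step(pre_idx, weights):
    """Replace weak synapses with new random presynaptic partners, keeping fan-in constant."""
    for post in range(N_POST):
        weak = np.flatnonzero(weights[post] < W_PRUNE)
        for s in weak:
            # Draw a new partner that is not already connected to this neuron.
            candidates = np.setdiff1d(np.arange(N_PRE), pre_idx[post])
            pre_idx[post, s] = rng.choice(candidates)
            # Re-initialise the fresh synapse with a small weight (assumed scheme).
            weights[post, s] = rng.uniform(0.0, 0.2)
    return pre_idx, weights

# One plasticity update; in a learning loop this would alternate with weight updates.
pre_idx, weights = rewire_step(pre_idx, weights)
```

In such a scheme the pruning and regrowth steps are balanced by construction, so the number of synapses per neuron, and hence the hardware resource budget, stays constant throughout learning.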
Reference Key | billaudelle2020structuralneural |
---|---|
Authors | Billaudelle, Sebastian; Cramer, Benjamin; Petrovici, Mihai A.; Schreiber, Korbinian; Kappel, David; Schemmel, Johannes; Meier, Karlheinz |
Journal | Neural Networks: The Official Journal of the International Neural Network Society |
Year | 2020 |
DOI | S0893-6080(20)30355-5 |
URL | |
Keywords | |

Use this key to autocite in the manuscript when using SciMatic Manuscript Manager or Thesis Manager.