Abstract (EN):
We investigate the effect on synchrony of adding feedback loops and adaptation to a large class of feedforward networks. We obtain relatively complete results on synchrony for identical-cell networks with additive input structure and feedback from the final to the initial layer of the network. These results extend previous work on synchrony in feedforward networks by Aguiar et al. (Chaos 27:013103, 2017). We also describe additive and multiplicative adaptation schemes that are synchrony preserving, and briefly comment on dynamical protocols for running the feedforward network that relate to unsupervised learning in neural nets and neuroscience. © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
Language:
English
Type (Professor's evaluation):
Scientific