HOMEOSTATIC LEARNING RULE FOR ARTIFICIAL NEURAL NETWORKS

Martin Růžek

Abstract


This article presents an improvement of the learning algorithm for artificial
neural networks that makes the learning more similar to that of a biological neuron, while still
remaining simple enough to be easily programmed. The idea is based on autonomous artificial
neurons that work together and at the same time compete for resources; every
neuron tries to be better than the others, but also needs feedback from the
other neurons. The proposed artificial neuron processes forward signals in a similar way
to the standard artificial perceptron; the main difference is the learning phase. The
learning is based on observing the weights of other neurons, but only in a biologically
plausible way; no backpropagation of error or 'teacher' is allowed. The neuron
sends its signal forward into the higher layer, while information about
its function propagates in the opposite direction. This information does not
have the form of energy; it is the observation of how the neuron's output is accepted
by others. The neurons try to find a setting of their internal parameters
that is optimal for the whole network. For this algorithm, it is necessary that the
neurons are organized in layers. The tests proved the viability of this concept; the
learning is slower, but it has other advantages, such as resistance against catastrophic
interference and higher generalization.
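
The abstract describes the rule only at a high level, so the following minimal Python sketch shows one possible reading of it, not the author's published formulation: hidden neurons keep a perceptron-style forward pass, and each neuron adjusts its own weights using only locally observable information, here the summed magnitude of the outgoing weights that the next layer assigns to its output, taken as a rough proxy for how well that output is "accepted" by others. The names homeostatic_update and target_acceptance, and the choice of this particular acceptance signal, are illustrative assumptions.

    # Illustrative sketch only: one possible reading of the homeostatic idea
    # described in the abstract, not the rule defined in the article itself.
    import numpy as np

    rng = np.random.default_rng(0)

    n_in, n_hidden, n_out = 4, 6, 2
    W_hidden = rng.normal(scale=0.5, size=(n_hidden, n_in))  # hidden-layer weights
    W_out = rng.normal(scale=0.5, size=(n_out, n_hidden))    # output-layer weights

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x):
        h = sigmoid(W_hidden @ x)   # standard perceptron-like forward pass
        y = sigmoid(W_out @ h)
        return h, y

    def homeostatic_update(x, lr=0.01, target_acceptance=1.0):
        """Hypothetical local rule: each hidden neuron compares how strongly the
        next layer weights its output (sum of |outgoing weights|, something it
        could in principle 'observe') against a homeostatic set point, and nudges
        its own incoming weights in a Hebbian way gated by that signal."""
        global W_hidden
        h, _ = forward(x)
        acceptance = np.abs(W_out).sum(axis=0)   # one value per hidden neuron
        drive = target_acceptance - acceptance   # >0: underused, <0: overused
        W_hidden += lr * (drive * h)[:, None] * x[None, :]

    # Toy run on random inputs; no error signal or teacher is propagated backwards.
    for _ in range(100):
        x = rng.normal(size=n_in)
        homeostatic_update(x)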


Keywords


artificial neural network, learning rule, biological neuron

DOI: http://dx.doi.org/10.14311/NNW.2018.%25x


