Artificial neural networks with dynamic synapses: a review

Abstract

Artificial neural networks (ANNs) are widely applied to solve real-world problems. Most of the actions we take and the processes around us vary over time. ANNs with dynamic properties make it possible to process time-dependent data and to solve tasks such as speech and text processing, predictive modelling, face and emotion recognition, and game strategy development. Dynamics can appear in a neural network's input data, in its architecture, and in its individual elements, the synapses and neurons. Unlike static synapses, dynamic synapses can change their connection strength based on incoming information. This property allows them to learn and adapt to changing situations, and it is the fundamental principle that allows neural networks to perform complex tasks, such as text processing or face recognition, more efficiently. Dynamic synapses play a key role in the ability of ANNs to learn from experience and change over time, which is one of the key aspects of artificial intelligence. The scientific works examined in this article show that there are no literature sources that review and compare dynamic ANNs according to their synapses. To fill this gap, the article reviews and groups ANNs with dynamic synapses. Dynamic neural networks are defined by providing a general mathematical expression. The dynamic synapse is described by specifying its main properties and presenting a general mathematical expression. It is also shown how such a synapse can be modelled and integrated into 11 different dynamic ANN structures. Finally, the examined dynamic ANN structures are compared according to the properties of their dynamic synapses.
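
To make the notion of a dynamic synapse concrete, consider the finite impulse response (FIR) synapse of Back and Tsoi (1991) from the reference list: the single static weight of an ordinary connection is replaced by a tapped delay line, y(n) = w_0·x(n) + w_1·x(n−1) + … + w_K·x(n−K), so the synapse output depends on past as well as current inputs. The short Python sketch below is illustrative only; the function name and tap values are assumptions made for this page, not code or numbers from the article.

    import numpy as np

    def fir_synapse(x, w):
        # Dynamic (FIR) synapse: a tapped delay line in place of a single
        # static weight, y(n) = sum_k w[k] * x(n - k) (Back & Tsoi, 1991).
        y = np.zeros(len(x))
        for n in range(len(x)):
            for k, wk in enumerate(w):
                if n - k >= 0:  # only current and past inputs contribute
                    y[n] += wk * x[n - k]
        return y

    x = np.array([1.0, 0.5, -0.2, 0.8])        # a short time-varying input
    print(fir_synapse(x, w=[0.7]))             # [ 0.7   0.35 -0.14  0.56]
    print(fir_synapse(x, w=[0.5, 0.3, 0.2]))   # [ 0.5   0.55  0.25  0.44]

A static synapse is the single-tap special case (first call above), which is why architectures built from FIR or IIR synapses generalize the ordinary multilayer perceptron (Back & Tsoi, 1991; Tsoi & Back, 1997).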


Article in Lithuanian.


Dirbtinių neuronų tinklų su dinaminėmis sinapsėmis apžvalga


Summary


Artificial neural networks (ANNs) are widely applied to solve real-world problems. Most of the actions we take and the processes around us vary over time. Neural networks with dynamic properties make it possible to process time-varying data and to solve tasks such as speech and text processing, predictive modelling, face and emotion recognition, and the development of game strategies. The dynamics of an ANN arise in the processing of the input data, in the structure of the network, or in individual ANN elements, the synapses or neurons. Unlike static synapses, dynamic synapses are able to change their connection strength according to the incoming information. This property allows them to learn and adapt to changing situations. It is the fundamental principle that enables ANNs to perform complex tasks, such as text processing or face recognition, more efficiently. Dynamic synapses play an important role in shaping the ability of ANNs to learn from experience and to change over time, which is one of the main aspects of artificial intelligence (AI). The scientific works examined in this article show that there are no literature sources that review and compare dynamic ANNs according to their synapses. To fill this gap, the article reviews and groups ANNs with dynamic synapses. Dynamic neural networks are defined by presenting a general mathematical expression. The dynamic synapse is characterised by stating its main properties and presenting a general mathematical expression. It is examined how this synapse can be modelled and integrated into 11 different dynamic ANN structures. The examined dynamic ANN structures are compared according to the properties of their dynamic synapses.
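
Both the abstract and this summary refer to general mathematical expressions for a dynamic neural network and a dynamic synapse without reproducing them on this page. As a hedged illustration only (the article's exact notation may differ), a dynamic neural network is commonly written as a discrete-time state-space system whose output depends on an internal state that evolves over time:

    s(n + 1) = f(s(n), x(n); θ)
    y(n) = g(s(n); θ)

where x(n) is the input at time step n, s(n) is the network's internal (memory) state, y(n) is the output, and θ collects the adjustable weights. A static feedforward network is the degenerate case without state, y(n) = g(x(n); θ).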


Keywords: artificial neural networks, dynamic synapses, dynamic connections, time-varying signals

How to Cite
Dumpis, M. (2023). Artificial neural networks with dynamic synapses: a review. Mokslas – Lietuvos Ateitis / Science – Future of Lithuania, 15. https://doi.org/10.3846/mla.2023.20144
Published in Issue
Nov 8, 2023

This work is licensed under a Creative Commons Attribution 4.0 International License.

References

Back, A. D., & Tsoi, A. C. (1991). FIR and IIR synapses, a new neural network architecture for time series modeling. Neural Computation, 3(3), 375–385. https://doi.org/10.1162/neco.1991.3.3.375

Back, A. D., & Tsoi, A. C. (1992). An adaptive lattice architecture for dynamic multilayer perceptrons. Neural Computation, 4(6), 922–931. https://doi.org/10.1162/neco.1992.4.6.922

Biswas, T., & Fitzgerald, J. E. (2022). Geometric framework to predict structure from function in neural networks. Physical Review Research, 4(2), 023255. https://doi.org/10.1103/PhysRevResearch.4.023255

Cho, K., van Merriënboer, B., Bahdanau, D., & Bengio, Y. (2014). On the properties of neural machine translation: Encoder-decoder approaches. In Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation (pp. 103–111), Doha, Qatar. Association for Computational Linguistics. https://doi.org/10.3115/v1/W14-4012

De Vries, B., Principe, J. C., & Guedes de Oliveira, P. (1991). Adaline with adaptive recursive memory. In Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop (pp. 101–110), Princeton, NJ, USA. IEEE. https://doi.org/10.1109/NNSP.1991.239531

Diepenhorst, M., Nijhuis, J. A. G., Venema, R. S., & Spaanenburg, L. (1995). Growing filters for finite impulse response networks. In Proceedings of ICNN’95 – International Conference on Neural Networks (Vol. 2, pp. 854–859), Perth, WA, Australia. IEEE. https://doi.org/10.1109/ICNN.1995.487530

Han, Y., Huang, G., Song, S., Yang, L., Wang, H., & Wang, Y. (2022). Dynamic neural networks: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44, 7436–7456. https://doi.org/10.1109/TPAMI.2021.3117837

Hashemi, A., Orzechowski, G., Mikkola, A., & McPhee, J. (2023). Multibody dynamics and control using machine learning. Multibody System Dynamics, 58, 397–431. https://doi.org/10.1007/s11044-023-09884-x

Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9, 1735–1780. https://doi.org/10.1162/neco.1997.9.8.1735

Huang, H. M., Wang, Z., Wang, T., Xiao, Y., & Guo, X. (2020). Artificial neural networks based on memristive devices: From device to system. Advanced Intelligent Systems, 2(12), 2000149. https://doi.org/10.1002/aisy.202000149

Lawrence, S., Tsoi, A., & Back, A. (1995). The Gamma MLP for speech phoneme recognition. In Advances in neural information processing systems. MIT Press.

Lin, H., Wang, C., Deng, Q., Xu, C., Deng, Z., & Zhou, C. (2021). Review on chaotic dynamics of memristive neuron and neural network. Nonlinear Dynamics, 106(1), 959–973. https://doi.org/10.1007/s11071-021-06853-x

Lu, Q., Zhao, Y., Huang, L., An, J., Zheng, Y., & Yap, E. H. (2023). Low-dimensional-materials-based flexible artificial synapse: Materials, devices, and systems. Nanomaterials, 13(3), 373. https://doi.org/10.3390/nano13030373

McClelland, J. L., & Rumelhart, D. E. (1987). Schemata and sequential thought processes in PDP models. In Parallel distributed processing: Explorations in the microstructure of cognition: Psychological and biological models (pp. 7–57). MIT Press. https://doi.org/10.7551/mitpress/5236.003.0004

Mei, J., Muller, E., & Ramaswamy, S. (2022). Informing deep neural networks by multiscale principles of neuromodulatory systems. Trends in Neurosciences, 45, 237–250. https://doi.org/10.1016/j.tins.2021.12.008

Navakauskas, D., Serackis, A., Matuzevičius, D., & Laptik, R. (2014). Specializuotos elektroninės intelektualiosios sistemos garsams ir vaizdams apdoroti. Teorija ir taikymai [Specialized electronic intelligent systems for sound and image processing: Theory and applications]. Technika. https://doi.org/10.3846/2310-M

Navakauskas, D. (1999). Artificial neural networks for the restoration of noise distorted songs audio records [Doctoral dissertation, Sciences of Technology, Electrical and Electronic Engineering (01T)]. Technika.

Paliwal, K. K. (1991). A time-derivative neural net architecture – an alternative to the time-delay neural net architecture. In Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop (pp. 367–375), Princeton, NJ, USA. IEEE. https://doi.org/10.1109/NNSP.1991.239505

Prisciandaro, E., Sedda, G., Cara, A., Diotti, C., Spaggiari, L., & Bertolaccini, L. (2023). Artificial neural networks in lung cancer research: A narrative review. Journal of Clinical Medicine, 12(3), 880. https://doi.org/10.3390/jcm12030880

Schuster, M., & Paliwal, K. K. (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45, 2673–2681. https://doi.org/10.1109/78.650093

Tsoi, A. C., & Back, A. (1997). Discrete time recurrent neural network architectures: A unifying review. Neurocomputing, 15, 183–223. https://doi.org/10.1016/S0925-2312(97)00161-6

Waibel, A. (1989). Modular construction of time-delay neural networks for speech recognition. Neural Computation, 1, 39–46. https://doi.org/10.1162/neco.1989.1.1.39

Wan, E. A. (1990). Temporal backpropagation for FIR neural networks. In 1990 IJCNN International Joint Conference on Neural Networks (Vol. 1, pp. 575–580), San Diego, CA, USA. IEEE. https://doi.org/10.1109/IJCNN.1990.137629

Wan, E. A. (1991). Temporal backpropagation: An efficient algorithm for finite impulse response neural networks. In Connectionist models (pp. 131–137). Morgan Kaufmann. https://doi.org/10.1016/B978-1-4832-1448-1.50019-6

Zhang, T., Yang, K., Xu, X., Cai, Y., Yang, Y., & Huang, R. (2019). Memristive devices and networks for brain‐inspired computing. Physica Status Solidi (RRL)–Rapid Research Letters, 13(8), 1900029. https://doi.org/10.1002/pssr.201900029

Zhao, S., Ran, W., Lou, Z., Li, L., Poddar, S., Wang, L., Fan, Z., & Shen, G. (2022). Neuromorphic-computing-based adaptive learning using ion dynamics in flexible energy storage devices. National Science Review, 9(11). https://doi.org/10.1093/nsr/nwac158