Volume 17, No 1, 2020

Method of Artificial Neural Networks Teaching


Khudhair Abed Thamer

Abstract

A technique for training artificial neural networks has been developed. Its distinctive feature is that it trains not only the synaptic weights of the artificial neural network but also the type and parameters of the membership function. If the specified quality of operation cannot be achieved by learning the network's parameters alone, the architecture of the artificial neural network is trained as well. The architecture, the type of membership function, and its parameters are chosen with regard to the available computing resources and to the type and amount of information arriving at the network's input. In addition, when the proposed method is used, no learning error accumulates as the information supplied to the network's input is processed. The methodology was developed out of the need to train artificial neural networks to process larger volumes of information while keeping the decisions made unambiguous. The research results show that the proposed training method is on average 16-23 percent more efficient at training artificial neural networks and does not accumulate errors during training. The technique makes it possible to train artificial neural networks and to identify effective measures for improving their performance. The developed technique also increases the operating efficiency of artificial neural networks by learning both their parameters and their architecture, and it reduces the computing resources required by decision support systems.
Using the developed methodology, measures can be devised to improve the efficiency of artificial neural network training and to increase the speed of information processing.
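The core idea of learning membership-function parameters alongside synaptic weights can be illustrated with a minimal sketch. This is an assumption-laden toy example, not the article's actual algorithm: it jointly trains one synaptic weight and the center and width of a Gaussian membership function (both the Gaussian form and all function names are illustrative choices) for a single fuzzy neuron, using plain gradient descent.

```python
import numpy as np

# Illustrative sketch only (not the article's algorithm): jointly train a
# synaptic weight w and the parameters (center c, width s) of a Gaussian
# membership function for a single fuzzy neuron.

def gaussian_mf(x, c, s):
    """Gaussian membership function mu(x) = exp(-(x - c)^2 / (2 s^2))."""
    return np.exp(-((x - c) ** 2) / (2.0 * s ** 2))

def predict(x, w, c, s):
    """Neuron output: the synaptic weight applied to the membership degree."""
    return w * gaussian_mf(x, c, s)

def fit(xs, ys, lr=0.02, epochs=300):
    """Stochastic gradient descent on 0.5 * (prediction - target)^2."""
    w, c, s = 1.0, 0.0, 1.0  # initial weight and membership-function parameters
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            mu = gaussian_mf(x, c, s)
            err = w * mu - y
            grad_w = err * mu                           # d(loss)/dw
            grad_c = err * w * mu * (x - c) / s**2      # d(loss)/dc
            grad_s = err * w * mu * (x - c)**2 / s**3   # d(loss)/ds
            w -= lr * grad_w
            c -= lr * grad_c
            s -= lr * grad_s
    return w, c, s

# Synthetic target generated by a known membership function and weight.
xs = np.linspace(-0.5, 1.5, 40)
ys = 2.0 * gaussian_mf(xs, 0.5, 0.3)

mse_before = float(np.mean((predict(xs, 1.0, 0.0, 1.0) - ys) ** 2))
w, c, s = fit(xs, ys)
mse_after = float(np.mean((predict(xs, w, c, s) - ys) ** 2))
```

Because the membership-function parameters are part of the same loss as the weight, a single descent loop adapts both, which is the spirit (though not the letter) of the approach the abstract describes.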


Pages: 43-64

DOI: 10.14704/WEB/V17I1/a207

Keywords: Artificial neural networks; Synaptic weights; Membership function; Information processing; Intelligent decision support systems
