Features of deep learning neural networks

Computer Science & Engineering

Authors

First and Last Name | Academic degree | E-mail | Affiliation
Yurij Olenych | (none) | incomlviv [at] gmail.com | Ivan Franko National University of Lviv, 107 Tarnavsky St., Lviv, Ukraine
Sergiy Sveleba | Sc.D. | incomlviv [at] gmail.com | Ivan Franko National University of Lviv, 107 Tarnavsky St., Lviv, Ukraine


Abstract

In this paper, multilayer neural networks are tested on two sets (for example, sets of disease symptoms) whose parameters are of similar magnitude and whose elements overlap to a significant degree (over 50%). It is established that to separate two such sets, the optimal number of elements (parameters) in each set must exceed 20. It is also established that for multilayer neural networks the dependence of the learning error on the number of neurons in a layer and on the number of hidden layers has extreme points, which makes it possible to classify, with a certain accuracy, arrays whose elements overlap by a significant percentage.
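The experiment described above can be sketched in a few lines of NumPy. The script below is an illustrative reconstruction, not the authors' code: the data distributions, network size, learning rate, and epoch count are all assumptions chosen only to demonstrate the setup. It generates two classes with strongly overlapping 20-dimensional feature sets, trains a one-hidden-layer network by backpropagation, and sweeps the hidden-layer width so that the dependence of the training error on the number of neurons can be inspected for extreme points.

```python
# Illustrative sketch of the experiment (not the authors' code).
# All sizes, rates, and distributions are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)

n_features = 20          # abstract: separation needs more than ~20 parameters
n_samples = 200          # samples per class, illustrative

# Two classes whose feature distributions overlap strongly (>50%).
X0 = rng.normal(loc=0.0, scale=1.0, size=(n_samples, n_features))
X1 = rng.normal(loc=0.5, scale=1.0, size=(n_samples, n_features))
X = np.vstack([X0, X1])
y = np.hstack([np.zeros(n_samples), np.ones(n_samples)]).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_error(n_hidden, epochs=500, lr=0.1):
    """Train a one-hidden-layer network and return its final training error."""
    W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backpropagation for a mean-squared-error loss.
        d2 = (p - y) * p * (1 - p)
        d1 = (d2 @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d2 / len(X)
        b2 -= lr * d2.mean(axis=0)
        W1 -= lr * X.T @ d1 / len(X)
        b1 -= lr * d1.mean(axis=0)
    return np.mean((p > 0.5) != y)   # fraction of misclassified samples

# Sweep the hidden-layer width and look for an extremum in the error curve.
for n_hidden in (2, 5, 10, 20, 40):
    print(n_hidden, train_error(n_hidden))
```

Repeating the same sweep over the number of hidden layers, rather than the layer width, would probe the second dependence mentioned in the abstract.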

