Another Look at the Perceptron
Abstract
The article explores a method for classifying elements of linearly separable sets using the perceptron algorithm, implemented in Python. Experiments are carried out in two-dimensional space with the following sets: linearly separable sets, sets that are linearly separable but with elements lying close to the separating line, and sets whose elements merge into one another. The optimal weights of the linear prediction function are found using stochastic gradient descent. The goal is to find the weights that minimize the error between the value of the prediction function and the true class value, updating the weights by moving along the gradient.
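A minimal sketch of such a training loop in Python might look as follows (illustrative only; the function names, learning rate, epoch count, and toy data are assumptions, not the article's actual implementation):

```python
import numpy as np

def predict(weights, x):
    """Linear prediction function: bias plus dot product, thresholded at zero."""
    activation = weights[0] + np.dot(weights[1:], x)
    return 1 if activation >= 0.0 else 0

def train_sgd(X, y, learning_rate=0.1, epochs=50):
    """Fit perceptron weights with stochastic gradient descent.

    For each sample, the error (true class minus predicted class)
    drives the weight update in the direction that reduces it.
    """
    weights = np.zeros(X.shape[1] + 1)  # bias + one weight per feature
    for _ in range(epochs):
        for xi, target in zip(X, y):
            error = target - predict(weights, xi)
            weights[0] += learning_rate * error          # bias update
            weights[1:] += learning_rate * error * xi    # feature weights
    return weights

if __name__ == "__main__":
    # Toy two-dimensional, linearly separable example (made-up data).
    X = np.array([[1.0, 2.0], [2.0, 3.0], [4.0, 1.0], [5.0, 2.0]])
    y = np.array([0, 0, 1, 1])
    print(train_sgd(X, y))
```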
The aim of the article is to show how a classification problem can be solved with a single neuron (the perceptron). Because this classification method is limited to linearly separable sets, and it is often not known in advance whether the sets are separable, a variant of the perceptron algorithm is proposed that stops automatically when no more optimal weights can be found, so that other classification algorithms can then be applied for a more accurate classification.
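One possible form of such an automatic stop is sketched below, assuming the stopping criterion is a run of epochs without improvement in the misclassification count; the `patience` parameter and the returned separability flag are hypothetical illustrations, not taken from the article:

```python
import numpy as np

def train_with_early_stop(X, y, learning_rate=0.1, max_epochs=1000, patience=10):
    """Perceptron training that stops when no better weights are being found.

    Stops when the misclassification count has not improved for
    `patience` consecutive epochs, or when all points are classified
    correctly. Returns the best weights and a separability flag.
    """
    weights = np.zeros(X.shape[1] + 1)
    best_weights, best_errors, stale = weights.copy(), len(y) + 1, 0
    for _ in range(max_epochs):
        errors = 0
        for xi, target in zip(X, y):
            prediction = 1 if weights[0] + np.dot(weights[1:], xi) >= 0.0 else 0
            error = target - prediction
            if error != 0:
                errors += 1
                weights[0] += learning_rate * error
                weights[1:] += learning_rate * error * xi
        if errors < best_errors:
            best_weights, best_errors, stale = weights.copy(), errors, 0
        else:
            stale += 1
        if best_errors == 0 or stale >= patience:
            break  # converged, or no more optimal weights are being found
    return best_weights, best_errors == 0
```

If the returned flag is false, the sets were not separated perfectly within the allotted epochs, which is the cue to fall back on another classification algorithm.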