Square Neurons, Power Neurons, and Their Learning Algorithms

Liu, Ying (2018) Square Neurons, Power Neurons, and Their Learning Algorithms. American Journal of Computational Mathematics, 08 (04). pp. 296-313. ISSN 2161-1203


Abstract

In this paper, we introduce the concepts of square neurons and power neurons, along with new learning algorithms based on them. First, we briefly review the basic idea of the Boltzmann Machine, specifically that the invariant distributions of the Boltzmann Machine generate Markov chains. We then review the ABM (Attrasoft Boltzmann Machine). Next, we review the θ-transformation and its completeness, i.e. any function can be expanded by the θ-transformation. The invariant distribution of the ABM is a θ-transformation; therefore, an ABM can simulate any distribution. We review linear neurons and their associated learning algorithm. We then discuss the problems of the exponential neurons used in the ABM, which are unstable, and of the linear neurons, which do not discriminate wrong answers from right answers as sharply as exponential neurons do. Finally, we introduce the concepts of square neurons and power neurons, and we discuss the advantages of the learning algorithms based on them, which combine the stability of linear neurons with the sharp discrimination of exponential neurons.
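The stability-versus-discrimination trade-off described in the abstract can be illustrated with a minimal sketch. The function names and exact forms below are assumptions for illustration, not the paper's definitions: an exponential neuron separates scores sharply but grows without bound, a linear neuron is stable but separates weakly, and square or power neurons sit in between.

```python
import math

# Hypothetical activation functions (forms assumed for illustration,
# not taken from the paper).
def exponential_neuron(x):
    return math.exp(x)   # sharp discrimination, but grows without bound

def linear_neuron(x):
    return x             # stable, but weak discrimination

def square_neuron(x):
    return x * x         # stable for moderate x, sharper than linear

def power_neuron(x, p):
    return x ** p        # generalizes the square neuron (p = 2)

# Compare how strongly each neuron separates a "right" answer
# scoring 3 from a "wrong" answer scoring 2 (ratio of activations):
right, wrong = 3.0, 2.0
print(exponential_neuron(right) / exponential_neuron(wrong))  # e ≈ 2.718
print(linear_neuron(right) / linear_neuron(wrong))            # 1.5
print(square_neuron(right) / square_neuron(wrong))            # 2.25
print(power_neuron(right, 4) / power_neuron(wrong, 4))        # 5.0625
```

The square and power neurons give a separation ratio closer to the exponential neuron's while remaining polynomially bounded, which is the advantage the abstract attributes to them.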

Item Type: Article
Subjects: OA Open Library > Mathematical Science
Depositing User: Unnamed user with email support@oaopenlibrary.com
Date Deposited: 17 Jun 2023 07:21
Last Modified: 12 Jan 2024 05:01
URI: http://archive.sdpublishers.com/id/eprint/1069
