
C++ Neural Networks and Fuzzy Logic
by Valluru B. Rao
M&T Books, IDG Books Worldwide, Inc.
ISBN: 1558515526   Pub Date: 06/01/95
  



Glossary

A

Activation
The weighted sum of the inputs to a neuron in a neural network.
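For illustration, a minimal C++ sketch of computing an activation (not taken from the book's listings):

    #include <cstddef>
    #include <vector>

    // Activation: the weighted sum of a neuron's inputs.
    double activation(const std::vector<double>& inputs,
                      const std::vector<double>& weights) {
        double sum = 0.0;
        for (std::size_t i = 0; i < inputs.size(); ++i)
            sum += inputs[i] * weights[i];
        return sum;
    }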
Adaline
Adaptive linear element; a single-neuron machine trained with the LMS (Delta) rule.
Adaptive Resonance Theory
Theory developed by Grossberg and Carpenter for categorization of patterns, and to address the stability–plasticity dilemma.
Algorithm
A step-by-step procedure to solve a problem.
Annealing
A process for preventing a network from being drawn into a local minimum.
ART
(Adaptive Resonance Theory) ART1 is the result of the initial development of this theory for binary inputs. Further developments led to ART2 for analog inputs. ART3 is the latest.
Artificial neuron
The basic processing element of an artificial neural network, designed to mimic the activity of a biological neuron.
Associative memory
Activity of associating one pattern or object with itself or another.
Autoassociative
Making a correspondence of one pattern or object with itself.

B

Backpropagation
A training algorithm for feedforward neural networks in which the errors at the output layer are propagated back to the preceding layer and used in learning. If that preceding layer is not the input layer, the errors at this hidden layer are in turn propagated back to the layer before it.
BAM
Bidirectional Associative Memory network model.
Bias
A value added to the activation of a neuron.
Binary digit
A value of 0 or 1.
Bipolar value
A value of –1 or +1.
Boltzmann machine
A neural network in which the outputs are determined with probability distributions. Trained and operated using simulated annealing.
Brain-State-in-a-Box
Anderson’s single-layer, laterally connected neural network model. It can work with inputs that have noise in them or are incomplete.

C

Cauchy machine
Similar to the Boltzmann machine, except that a Cauchy distribution is used for probabilities.
Cognitron
The forerunner to the Neocognitron. A network developed to recognize characters.
Competition
A process in which a winner is selected from a layer of neurons by some criterion. Competition implies inhibition, reflected in negative values being assigned to some connection weights.
Connection
A means of passing inputs from one neuron to another.
Connection weight
A numerical label associated with a connection and used in a weighted sum of inputs.
Constraint
A condition expressed as an equation or inequality, which has to be satisfied by the variables.
Convergence
Termination of a process with a final result.
Crisp
The opposite of fuzzy—usually a specific numerical quantity or value for an entity.

D

Delta rule
A rule for modifying a connection weight using both the output of the source neuron and the error at the destination neuron. It is also called the LMS rule.
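A minimal C++ sketch of the update, assuming error is the desired output minus the actual output (not from the book's listings):

    #include <cstddef>
    #include <vector>

    // Delta (LMS) rule: w_i += eta * error * x_i for each input x_i.
    void delta_rule_update(std::vector<double>& weights,
                           const std::vector<double>& inputs,
                           double error,  // desired output minus actual output
                           double eta) {  // learning rate
        for (std::size_t i = 0; i < weights.size(); ++i)
            weights[i] += eta * error * inputs[i];
    }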

E

Energy function
A function of the outputs and weights of a neural network, used to characterize the state of the system, e.g., a Lyapunov function.
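For example, the energy function commonly used for a Hopfield network with weights w_{ij} and outputs x_i is (a standard form, stated here for reference rather than quoted from the book):

E = -\frac{1}{2} \sum_{i} \sum_{j \neq i} w_{ij} x_i x_j

Each update of the network's state leaves E unchanged or lowers it, which is what makes it a Lyapunov function.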
Excitation
Providing positive weights on connections so that the inputs they carry increase a neuron's activation and encourage it to fire.
Exemplar
An example of a pattern or object used in training a neural network.
Expert system
A set of formalized rules that enable a system to perform like an expert.

F

FAM
Fuzzy Associative Memory network. Makes associations between fuzzy sets.
Feedback
The process of relaying information in the direction opposite to the original flow, e.g., from a later layer back to an earlier one.
Fit vector
A vector of values of degree of membership of elements of a fuzzy set.
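For example, a fuzzy set over the elements {cold, cool, warm, hot} might be described by the fit vector (0.9, 0.6, 0.2, 0.0), where each entry is that element's degree of membership in the set.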
Fully connected network
A neural network in which every neuron has connections to all other neurons.
Fuzzy
As related to a variable, the opposite of crisp. A fuzzy quantity represents a range of values as opposed to a single numeric value, e.g., “hot” vs. 89.4°.
Fuzziness
Different concepts having an overlap to some extent. For example, descriptions of fair and cool temperatures may have an overlap of a small interval of temperatures.
Fuzzy Associative Memory
A neural network model to make association between fuzzy sets.
Fuzzy equivalence relation
A fuzzy relation (relationship between fuzzy variables) that is reflexive, symmetric, and transitive.
Fuzzy partial order
A fuzzy relation (relationship between fuzzy variables) that is reflexive, antisymmetric, and transitive.

G

Gain
Sometimes a numerical factor used to enhance the activation, and sometimes a connection serving the same purpose.
Generalized Delta rule
The rule used in backpropagation training, in which hidden-layer weights are modified using error propagated back from the output layer.
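In standard notation (stated here for reference, not quoted from the book), the generalized delta rule changes the weight on the connection from neuron i to neuron j by

\Delta w_{ij} = \eta \, \delta_j \, x_i

where \eta is the learning rate, x_i is the output of neuron i, and \delta_j is (t_j - y_j) f'(net_j) at an output neuron, or f'(net_j) \sum_k \delta_k w_{jk} at a hidden neuron.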
Global minimum
A point where the value of a function is no greater than the value at any other point in the domain of the function.

H

Hamming distance
The number of places in which two binary vectors differ from each other.
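A minimal C++ sketch, assuming two binary vectors of equal length (not from the book's listings):

    #include <cstddef>
    #include <vector>

    // Hamming distance: the number of positions at which two
    // equal-length binary vectors differ.
    int hamming_distance(const std::vector<int>& a, const std::vector<int>& b) {
        int count = 0;
        for (std::size_t i = 0; i < a.size(); ++i)
            if (a[i] != b[i]) ++count;
        return count;
    }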
Hebbian learning
A learning algorithm in which Hebb’s rule is used. The change in connection weight between two neurons is taken as a constant times the product of their outputs.
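A minimal C++ sketch of the weight change (not from the book's listings):

    // Hebbian learning: the weight change between two neurons is a constant
    // (the learning rate) times the product of their outputs.
    double hebbian_delta(double output_i, double output_j, double learning_rate) {
        return learning_rate * output_i * output_j;
    }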
Heteroassociative
Making an association between two distinct patterns or objects.
Hidden layer
An array of neurons positioned in between the input and output layers.
Hopfield network
A single layer, fully connected, autoassociative neural network.

I

Inhibition
The attempt by one neuron to diminish the chances of firing by another neuron.
Input layer
An array of neurons to which an external input or signal is presented.
Instar
A neuron that has no connections going from it to other neurons; it only receives inputs.

L

Lateral connection
A connection between two neurons that belong to the same layer.
Layer
An array of neurons positioned similarly in a network for its operation.
Learning
The process of finding an appropriate set of connection weights to achieve the goal of the network operation.
Linearly separable
A property of two subsets of a set of points: they are linearly separable if a linear barrier (a hyperplane) can be placed between them.
LMS rule
Least mean squared error rule, with the aim of minimizing the average of the squared error. Same as the Delta rule.
Local minimum
A point where the value of the function is no greater than the value at any other point in its neighborhood.
Long-term memory (LTM)
Encoded information that is retained for an extended period.
Lyapunov function
A function of the state of a system that is bounded below and whose value decreases with every change in the state of the system; used to show that the system converges.

M

Madaline
A multiple-Adaline: a neural network whose input layer consists of Adaline units.
Mapping
A correspondence between elements of two sets.

N

Neural network
A collection of processing elements arranged in layers, and a collection of connection edges between pairs of neurons. Input is received at one layer, and output is produced at the same or at a different layer.
Noise
Distortion of an input.
Nonlinear optimization
Finding the best solution for a problem that has a nonlinear function in its objective or in a constraint.

O

On center off surround
Assignment of excitatory weights to connections to nearby neurons and inhibitory weights to connections to distant neurons.
Orthogonal vectors
Vectors whose dot product is 0.
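For example, the bipolar vectors (1, -1, 1, -1) and (1, 1, 1, 1) are orthogonal. A minimal C++ check for integer-valued vectors (not from the book's listings):

    #include <cstddef>
    #include <vector>

    // Two vectors are orthogonal if their dot product is zero.
    bool orthogonal(const std::vector<int>& a, const std::vector<int>& b) {
        int dot = 0;
        for (std::size_t i = 0; i < a.size(); ++i)
            dot += a[i] * b[i];
        return dot == 0;
    }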
Outstar
A neuron that has no incoming connections; it only sends outputs to other neurons.

P

Perceptron
A neural network for pattern classification; a single-layer perceptron can learn only linearly separable patterns.
Plasticity
Ability to be stimulated by new inputs and learn new mappings or modify existing ones.

R

Resonance
The responsiveness of two neurons in different layers in categorizing an input. An equilibrium in two directions.

S

Saturation
A condition limiting the frequency with which a neuron can fire.
Self-organization
A process of partitioning the output layer neurons to correspond to individual patterns or categories, also called unsupervised learning or clustering.
Short-term memory (STM)
The storage of information that does not endure long after removal of the corresponding input.
Simulated annealing
An optimization algorithm in which changes that decrease the energy (or cost) are always accepted, while changes that increase it are accepted with a probability that falls as a temperature parameter is gradually lowered; this allows escape from local minima.
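A minimal C++ sketch of the Metropolis acceptance rule that drives simulated annealing (a standard formulation, not from the book's listings):

    #include <cmath>
    #include <cstdlib>

    // Accept a proposed change: always if it lowers the energy, otherwise
    // with probability exp(-delta_e / temperature).
    bool accept_change(double delta_e, double temperature) {
        if (delta_e <= 0.0) return true;
        double r = std::rand() / (double)RAND_MAX;  // uniform value in [0, 1]
        return r < std::exp(-delta_e / temperature);
    }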
Stability
Convergence of a network operation to a steady-state solution.
Supervised learning
A learning process in which the exemplar set consists of pairs of inputs and desired outputs.

T

Threshold value
A value against which the activation of a neuron is compared to determine whether the neuron fires. Sometimes a bias value is added to the activation so that the threshold can be taken as zero in determining the neuron's output.
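A minimal C++ sketch of the firing decision (not from the book's listings):

    // The neuron fires (outputs 1) if its activation reaches the threshold.
    // If a bias is added to the activation, the threshold can be taken as zero.
    int fires(double activation, double threshold) {
        return (activation >= threshold) ? 1 : 0;
    }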
Training
The process of helping a neural network learn, either by providing input/output pairs of stimuli (supervised training) or input stimuli alone (unsupervised training), and allowing the weight updates to occur.

U

Unsupervised learning
Learning in the absence of external information on outputs, also called self-organization or clustering.

V

Vigilance parameter
A parameter used in Adaptive Resonance Theory. It is used to selectively prevent the activation of a subsystem in the network.

W

Weight
A number associated with a neuron or with a connection between two neurons, which is used in aggregating the outputs to determine the activation of a neuron.


Copyright © IDG Books Worldwide, Inc.