Neural Networks
Artificial neural networks are among the most powerful learning models. They have the versatility to approximate a wide range of complex functions representing multidimensional input-output maps. Neural networks also have inherent adaptability and can perform robustly even in noisy environments.
An Artificial Neural Network (ANN) is an information processing paradigm inspired by the way biological nervous systems, such as the brain, process information. The key element of this paradigm is the novel structure of the information processing system: a large number of highly interconnected simple processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons; this is true of ANNs as well. ANNs can process information at great speed owing to their massive parallelism.
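The learning process described above, adjusting connection weights from examples, can be sketched with a single artificial neuron (a perceptron). The data, learning rate, and threshold activation below are illustrative assumptions, not details from the text.

```python
def step(x):
    # Threshold activation: the neuron fires (1) or stays silent (0)
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=10, lr=0.5):
    # samples: list of (inputs, target) pairs
    n = len(samples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            output = step(sum(w * x for w, x in zip(weights, inputs)) + bias)
            error = target - output
            # Adjust each "synaptic" weight in proportion to its input
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Illustrative example: learn the logical AND function from four samples
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After a few passes over the examples the weights settle so that only the input (1, 1) drives the neuron's summed input above threshold, which is exactly the "learning by example" the paragraph describes.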
Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyse. This expert can then be used to provide projections given new situations of interest and answer "what if" questions. Other advantages include:
1. Adaptive learning: an ability to learn how to do tasks based on the data given for training or initial experience.
2. Self-organisation: an ANN can create its own organisation or representation of the information it receives during learning time.
3. Real-time operation: ANN computations may be carried out in parallel, and special hardware devices are being designed and manufactured which take advantage of this capability.
4. Fault tolerance via redundant information coding: partial destruction of a network leads to a corresponding degradation of performance; however, some network capabilities may be retained even with major network damage.
Biological Neural Networks
Much is still unknown about how the brain trains itself to process information, so theories abound. In the human brain, a typical neuron collects signals from others through a host of fine structures called dendrites. The neuron sends out spikes of electrical activity through a long, thin strand known as an axon, which splits into thousands of branches. At the end of each branch, a structure called a synapse converts the activity from the axon into electrical effects that inhibit or excite activity in the connected neurons.
Artificial Neural Networks
Artificial neural networks are represented by a set of nodes, often arranged in layers, and a set of weighted directed links connecting them. The nodes are equivalent to neurons, while the links denote synapses. The nodes are the information processing units, and the links act as communication media.
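The structure just described, nodes in layers joined by weighted links, can be sketched as a forward pass through the network. The layer sizes, weight values, and sigmoid activation below are illustrative assumptions.

```python
import math

def sigmoid(x):
    # Smooth activation squashing a node's summed input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, layers):
    # layers: list of weight matrices; layers[k][j] holds the weights of
    # the links feeding node j of layer k (one weight per incoming link)
    activations = inputs
    for weights in layers:
        activations = [
            sigmoid(sum(w * a for w, a in zip(node_weights, activations)))
            for node_weights in weights
        ]
    return activations

# A 2-input network: a hidden layer of 2 nodes and an output layer of 1 node
hidden = [[0.5, -0.5], [0.3, 0.8]]   # 2 nodes, 2 incoming links each
output = [[1.0, -1.0]]               # 1 node, 2 incoming links
result = forward([1.0, 0.0], [hidden, output])
```

Each node sums the signals arriving over its weighted links and passes the total through its activation function, mirroring how a neuron integrates synaptic inputs.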