3.1 Supervised Learning
Learning is necessary whenever we want a computer to perform a
task that cannot be programmed by conventional means. For example,
it is not possible to directly write a computer program that recognizes
speech. Instead, it is necessary to collect a large number of example
speech signals with their associated content and to use a supervised
learning algorithm to approximate the input-output relationship implied
by the training examples, i.e., to create a model and then apply this
model to the test data.
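As a rough illustration of this workflow, the sketch below fits a model on synthetic labelled training data and then evaluates it on held-out test data; the nearest-centroid classifier and the toy 2-D data are illustrative assumptions, not part of the system described here.

```python
# A minimal sketch of the supervised-learning workflow described above,
# using synthetic 2-D data and a simple nearest-centroid classifier
# (all names and data here are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)

# Collect "labelled examples": two Gaussian clusters standing in for
# feature vectors extracted from signals with attached labels.
X_train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
X_test = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y_test = np.array([0] * 20 + [1] * 20)

# "Create a model": here the model is just the per-class mean (centroid).
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

# "Use the model on the test data": assign each test point to the
# nearest centroid and measure accuracy.
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
y_pred = dists.argmin(axis=1)
print("test accuracy:", (y_pred == y_test).mean())
```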
3.2 Feedforward Neural Network
The feedforward neural network (FNN) is the most basic and most
commonly used artificial neural network. It consists of a number of
artificial neurons, called units, arranged in layers. Deep feedforward
networks are capable of representing the complex functions needed to
achieve high performance on difficult problems such as vision and
speech. FNNs have achieved success in a number of different areas,
most notably in large-vocabulary continuous speech recognition, where
they are directly responsible for improvements over previous highly
tuned, state-of-the-art systems. Recurrent neural networks extend
feedforward networks with an internal memory that stores information
about arbitrary input sequences across layers.
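The sketch below illustrates the forward pass of such a layered network in plain NumPy; the layer sizes, the ReLU nonlinearity, and the random weights are illustrative choices rather than the architecture used in this work.

```python
# A minimal sketch of a feedforward network's forward pass: units arranged
# in layers, each layer applying an affine map followed by a nonlinearity.
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, params):
    """Propagate input x through a stack of (W, b) layers."""
    h = x
    for W, b in params[:-1]:
        h = relu(W @ h + b)           # hidden layers use a ReLU nonlinearity
    W_out, b_out = params[-1]
    return W_out @ h + b_out          # linear output layer

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 2]                  # input -> two hidden layers -> output
params = [(rng.normal(0, 0.1, (m, n)), np.zeros(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

x = rng.normal(size=4)
print("network output:", forward(x, params))
```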
3.3 Back Propagation
Back propagation is a method used for training neural networks and is
a supervised learning technique. Back propagation minimizes the error
by applying an optimizer that computes partial derivatives of the error
with respect to the weights and biases and updates them so that the
total error decreases.
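The following sketch shows back propagation on a toy one-hidden-layer regression network: the partial derivatives of the squared error are computed layer by layer and used to update the weights and biases by gradient descent. The network size, learning rate, and data are illustrative assumptions.

```python
# A minimal sketch of backpropagation for a one-hidden-layer network on a
# toy regression task: compute the squared error, propagate its partial
# derivatives back through the layers, and update weights and biases by
# gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # toy inputs
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3      # toy targets

W1, b1 = rng.normal(0, 0.1, (5, 3)), np.zeros(5)
W2, b2 = rng.normal(0, 0.1, (1, 5)), np.zeros(1)
lr = 0.05

for step in range(500):
    # Forward pass.
    h = np.tanh(X @ W1.T + b1)                # hidden activations, shape (100, 5)
    pred = h @ W2.T + b2.reshape(1, -1)       # predictions, shape (100, 1)
    err = pred[:, 0] - y
    loss = 0.5 * np.mean(err ** 2)

    # Backward pass: partial derivatives of the loss w.r.t. each parameter.
    d_pred = err[:, None] / len(X)            # dL/dpred
    dW2 = d_pred.T @ h
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2
    d_pre = d_h * (1.0 - h ** 2)              # tanh'(z) = 1 - tanh(z)^2
    dW1 = d_pre.T @ X
    db1 = d_pre.sum(axis=0)

    # Gradient-descent update of weights and biases.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```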
3.4 Restricted Boltzmann Machine
A restricted Boltzmann machine (RBM) is a stochastic artificial neural
network that learns a probability distribution over its set of inputs.
RBMs are used in many types of applications, such as dimensionality
reduction, classification, collaborative filtering, and feature learning.
They can be trained in either a supervised or an unsupervised way,
depending on the task to be performed. A restricted Boltzmann machine
alone is not powerful enough for generating sequences, so it is combined
with recurrent neural networks to build a more powerful model for
sequence generation.
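A minimal sketch of an RBM trained with one-step contrastive divergence (CD-1) on random binary data is given below; the layer sizes, learning rate, and data are illustrative assumptions, and a real sequence model (such as an RNN-RBM) would be trained on the actual task inputs.

```python
# A minimal sketch of a binary restricted Boltzmann machine trained with
# one step of contrastive divergence (CD-1) on random binary data.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 4, 0.1
W = rng.normal(0, 0.01, (n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sample(p):
    return (rng.random(p.shape) < p).astype(float)

data = (rng.random((200, n_visible)) < 0.5).astype(float)

for epoch in range(20):
    for v0 in data:
        # Positive phase: hidden probabilities given the data.
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = sample(p_h0)
        # Negative phase: one Gibbs step down to the visible layer and up again.
        p_v1 = sigmoid(h0 @ W.T + b_v)
        v1 = sample(p_v1)
        p_h1 = sigmoid(v1 @ W + b_h)
        # CD-1 parameter updates.
        W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
        b_v += lr * (v0 - v1)
        b_h += lr * (p_h0 - p_h1)

print("learned weight norm:", np.linalg.norm(W))
```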