Winter Internship Report
Designing a Part Of Speech Tagger
NIT Patna, Bihar
Under the Guidance of
Dr. A. K. Singh
Department of Computer Science & Engineering
INDIAN INSTITUTE OF TECHNOLOGY
(BANARAS HINDU UNIVERSITY)
VARANASI – 221005
Artificial Intelligence could be taken as the superset of machine learning, which is itself a superset of deep learning. Put plainly, it is the technology that gives a machine human-like intelligence.
Natural Language Processing is the branch of Artificial Intelligence that deals with communicating with a machine/intelligent system in a natural language such as English or Hindi.
Machine learning gives a computer the ability to learn without being explicitly programmed for that particular task. Essentially, a system is trained on past data so that it can predict outputs for the present and the future.
It has two sub-branches –
Machine learning is the superset of deep learning, in which the machines generate their features by themselves, essentially forming algorithms that mimic the human brain.
It is implemented through neural networks, whose basic functional unit is called the perceptron.
The basic structure of a perceptron: at first, the weights are randomly assigned to the inputs.
The perceptron then compares its output with the expected output and changes the weights correspondingly.
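The weight-update loop described above can be sketched as follows; this is a minimal illustration using the classic perceptron learning rule, with the AND function as an assumed example task (the learning rate and epoch count are illustrative choices, not values from the report):

```python
# Minimal perceptron sketch: random initial weights, then repeated
# compare-and-correct updates, as described in the text above.
import random

def step(z):
    # Step activation: fires (1) when the weighted sum is non-negative.
    return 1 if z >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1, seed=0):
    rng = random.Random(seed)
    # At first the weights (and bias) are randomly assigned.
    w = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    b = rng.uniform(-1, 1)
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out        # compare output with the expected output
            w[0] += lr * err * x1     # change the weights correspondingly
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Usage: learn the AND function (linearly separable, so training converges).
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data]
# preds → [0, 0, 0, 1]
```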
A neural network with several hidden layers constitutes a deep network.
Feed-forward networks are networks that are not cyclic in nature, i.e. the outputs are independent of each other.
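A feed-forward pass can be sketched in a few lines: activations flow strictly from input to output with no cycles, so each output depends only on the current input. The weights below are illustrative assumptions, not trained values:

```python
# Minimal feed-forward sketch: one hidden layer, no cycles.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def feed_forward(x, w_hidden, w_out):
    # Activations move forward only: input -> hidden -> output.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    return [sigmoid(sum(wi * hi for wi, hi in zip(row, hidden))) for row in w_out]

# Usage with arbitrary example weights (two inputs, two hidden units, one output).
y = feed_forward([1.0, 0.0],
                 w_hidden=[[0.4, -0.2], [0.3, 0.7]],
                 w_out=[[0.6, -0.5]])
```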
In a convolutional neural network, a neuron in a layer is connected only to a small region of the layer before it. It is a feed-forward neural network inspired by the visual cortex.
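The "small region" idea can be sketched with a one-dimensional convolution: each output value depends only on a short window of the input, and the same kernel (shared weights) slides across it. The kernel values are an illustrative assumption:

```python
# Minimal sketch of local connectivity: each output neuron sees only a
# small window of the input, computed with one shared kernel.

def conv1d(signal, kernel):
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# Usage: a [-1, 1] kernel acts as a simple edge detector.
edges = conv1d([0, 0, 1, 1, 1, 0, 0], [-1, 1])
# edges → [0, 1, 0, 0, -1, 0]  (nonzero exactly where the signal changes)
```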
A recurrent neural network (RNN) is a neural network in which the present output depends on the previous outputs (it can be understood by analogy to dynamic programming).
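The recurrence that makes the present output depend on previous ones can be sketched as a hidden state carried forward through time; the weights below are fixed illustrative numbers, not trained values:

```python
# Minimal RNN sketch: the hidden state h carries past information forward,
# so the current output depends on earlier inputs.
import math

def rnn_step(x, h_prev, w_x=0.5, w_h=0.8, b=0.0):
    # New hidden state mixes the current input with the previous state.
    return math.tanh(w_x * x + w_h * h_prev + b)

def run_rnn(xs):
    h = 0.0
    states = []
    for x in xs:              # the same cell is applied at every time step
        h = rnn_step(x, h)
        states.append(h)
    return states

# Usage: after a single nonzero input, later states stay nonzero —
# the network "remembers" the past even when the current input is 0.
states = run_rnn([1.0, 0.0, 0.0])
```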
Basic structure of a RNN
There are some limitations with RNNs. When the change in the weights becomes very small, the network effectively stops learning; this is known as the vanishing-gradient problem.
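The limitation hinted at above — weight updates shrinking toward nothing — can be sketched numerically: backpropagating through many time steps multiplies the gradient by a per-step factor, and when that factor is below 1 (as with a saturated tanh unit) the gradient decays exponentially. The recurrent weight and state value here are illustrative assumptions:

```python
# Minimal sketch of the vanishing gradient: repeated multiplication by a
# per-step factor below 1 drives the gradient toward zero.

def gradient_through_time(steps, w_h=0.8, h=0.9):
    # d/dz tanh(z) = 1 - tanh(z)^2; for a saturated state h it is small.
    factor = w_h * (1 - h * h)   # 0.8 * 0.19 = 0.152 per time step
    g = 1.0
    for _ in range(steps):
        g *= factor
    return g

g_short = gradient_through_time(2)    # still a usable signal
g_long = gradient_through_time(50)    # astronomically small
```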