This paper deals with advances in the field of natural language processing. Most text-prediction tasks have been addressed for various languages, but not for Malayalam. Numerous frameworks employing a range of techniques have been developed for other languages, yet only a few of these models are suited to Malayalam. In this paper, the BERT algorithm is applied to the Malayalam language. BERT is pre-trained bidirectionally on two tasks: Next Sentence Prediction and Masked Language Modelling. This paper focuses mainly on Next Sentence Prediction.
Keywords: BERT, Malayalam, Next Sentence Prediction.
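As an illustrative sketch only (not the authors' implementation), the Next Sentence Prediction pre-training data described above can be constructed as follows. The packing of two segments into one `[CLS] … [SEP] … [SEP]` sequence and the 50/50 negative-sampling scheme follow the original BERT recipe; the whitespace tokenizer and the sampling of negatives from the same sentence list are simplifying assumptions made here for brevity:

```python
import random

def make_nsp_pairs(sentences, seed=0):
    """Build BERT-style NSP examples from an ordered list of sentences.

    For each adjacent pair, with probability 0.5 keep the true next
    sentence (label "IsNext"); otherwise substitute a randomly chosen
    sentence (label "NotNext"), as in BERT pre-training. Real BERT
    samples negatives from a different document; here we sample from
    the same list for simplicity.
    """
    rng = random.Random(seed)
    examples = []
    for i in range(len(sentences) - 1):
        a = sentences[i]
        if rng.random() < 0.5:
            b, label = sentences[i + 1], "IsNext"
        else:
            b, label = rng.choice(sentences), "NotNext"
        # BERT packs both segments into a single input sequence:
        # [CLS] tokens_a [SEP] tokens_b [SEP]
        tokens = ["[CLS]"] + a.split() + ["[SEP]"] + b.split() + ["[SEP]"]
        # Segment ids: 0 for sentence A (incl. [CLS] and first [SEP]),
        # 1 for sentence B (incl. final [SEP])
        segment_ids = [0] * (len(a.split()) + 2) + [1] * (len(b.split()) + 1)
        examples.append((tokens, segment_ids, label))
    return examples
```

The resulting token/segment pairs would then be fed, after subword tokenization, to the BERT encoder, whose `[CLS]` output is classified as IsNext or NotNext.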