Self-Attention GRU Networks for Fake Job Classification


Authors : Ankit Kumar

Volume/Issue : Volume 6 - 2021, Issue 11 - November

Google Scholar : http://bitly.ws/gu88

Scribd : https://bit.ly/32Qc7tw

This paper analyses the Employment Scam Aegean Dataset and compares various machine learning algorithms, including Logistic Regression, Decision Tree, Random Forest, XGBoost, K-Nearest Neighbor, Naïve Bayes, and Support Vector Classifier, on the task of fake job classification. The paper also proposes two self-attention enhanced Gated Recurrent Unit networks, one with a vanilla RNN architecture and the other with a bidirectional architecture, for distinguishing fake job postings from real ones. The proposed framework combines Gated Recurrent Units with a multi-head self-attention mechanism to improve long-term retention within the network. Compared with the other algorithms, the two GRU models proposed in this paper obtain better results.

Keywords : Fake Job Classification; Text Classification; Gated Recurrent Unit; Recurrent Neural Networks.
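
The sketch below illustrates the kind of architecture the abstract describes: a bidirectional GRU encoder whose outputs are passed through multi-head self-attention before a binary (real vs. fake) classification head. The paper does not publish code, so the framework (PyTorch), layer sizes, head count, and pooling choice here are assumptions for illustration, not the authors' exact configuration.

# Minimal sketch, assuming PyTorch; hyperparameters are hypothetical.
import torch
import torch.nn as nn


class AttentionBiGRUClassifier(nn.Module):
    """Bidirectional GRU encoder + multi-head self-attention + binary head."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_heads=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional GRU: output feature size is 2 * hidden_dim.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # Multi-head self-attention over the GRU outputs, intended to help the
        # model retain information from distant tokens in long job postings.
        self.attention = nn.MultiheadAttention(embed_dim=2 * hidden_dim,
                                               num_heads=num_heads,
                                               batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # single logit: fake (1) vs. real (0)
        )

    def forward(self, token_ids):
        x = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        h, _ = self.gru(x)                     # (batch, seq_len, 2*hidden_dim)
        attn_out, _ = self.attention(h, h, h)  # self-attention: Q = K = V = h
        pooled = attn_out.mean(dim=1)          # average over the sequence
        return self.classifier(pooled).squeeze(-1)


# Usage on dummy data (hypothetical vocabulary size and batch).
model = AttentionBiGRUClassifier(vocab_size=20000)
dummy_batch = torch.randint(1, 20000, (8, 200))  # 8 postings, 200 tokens each
probs = torch.sigmoid(model(dummy_batch))        # probability of being fake

The vanilla (unidirectional) variant mentioned in the abstract would differ only in setting bidirectional=False and using hidden_dim as the attention and classifier input size.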
