Comparative Analysis of Gradient Boosting and Transformer-Based Models for Binary Classification in Tabular Data


Author : Jebaraj Vasudevan

Volume/Issue : Volume 10 - 2025, Issue 3 - March


Google Scholar : https://tinyurl.com/3x3p86xk

Scribd : https://tinyurl.com/53x4mxx2

DOI : https://doi.org/10.38124/ijisrt/25mar416




Abstract : This study compares the binary classification performance of a gradient boosting model (XGBoost) and a Transformer-based model with multi-head self-attention on tabular data. While the two methods exhibit broadly similar overall performance, the Transformer model outperforms XGBoost on recall by about 8%, suggesting it is better suited to recall-sensitive applications such as fraud detection in payment processing and medical diagnostics.

Keywords : Transformer, Gradient Boosting, XGBoost, Tabular Data.
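The comparison described in the abstract can be sketched as a standard tabular evaluation loop: fit a boosted-tree classifier on a train split and report recall alongside accuracy on a held-out test split. The sketch below uses scikit-learn's `GradientBoostingClassifier` as a stand-in for XGBoost and a synthetic imbalanced dataset in place of the Telco churn data; both substitutions, and all hyperparameters, are assumptions for illustration, not the paper's actual setup.

```python
# Hedged sketch of the evaluation protocol: train a gradient boosting
# classifier on an imbalanced tabular binary task and report accuracy
# and recall (the metric the abstract highlights).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a tabular dataset with a minority positive class.
X, y = make_classification(
    n_samples=2000, n_features=20, weights=[0.8, 0.2], random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0
)

# Stand-in for XGBoost; the real study would use the xgboost library.
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print(f"accuracy: {accuracy_score(y_test, pred):.3f}")
print(f"recall:   {recall_score(y_test, pred):.3f}")
```

A Transformer baseline would be evaluated on the same splits with the same two metrics, so the recall gap reported in the abstract falls directly out of this loop.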

References :

  1. T. Chen and C. Guestrin, "XGBoost: A Scalable Tree Boosting System," 2016.
  2. X. Huang, A. Khetan, M. Cvitkovic, and Z. Karnin, "TabTransformer: Tabular Data Modeling Using Contextual Embeddings," 2020.
  3. "Telco Customer Churn," Kaggle. [Online]. Available: https://www.kaggle.com/datasets/blastchar/telco-customer-churn/data.
  4. A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, and A. Gomez, "Attention Is All You Need," 2017.

