Mathematical Foundations of Regression Analysis: A Study of Linear and Logistic Models


Authors : Mritunjay Mukherjee

Volume/Issue : Volume 10 - 2025, Issue 9 - September


Google Scholar : https://tinyurl.com/4z5uyuk3

Scribd : https://tinyurl.com/effr5cz4

DOI : https://doi.org/10.38124/ijisrt/25sep997



Abstract : Regression models form the backbone of modern statistical inference and predictive analytics. This paper presents a rigorous mathematical examination of two fundamental approaches: linear regression and logistic regression. Beginning with the formulation of each model, we derive their objective functions—the least squares criterion for linear regression and the log-likelihood for logistic regression. Closed-form solutions for linear regression are contrasted with the iterative optimization required for logistic regression, highlighting the importance of gradient-based methods. Special emphasis is placed on demonstrating how these mathematical principles can be applied to real-life datasets.
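To make the contrast concrete, below is a minimal numerical sketch (not taken from the paper) in Python with NumPy on synthetic data: ordinary least squares is solved in closed form via the normal equations, while a logistic model is fitted iteratively by gradient ascent on the average log-likelihood. The variable names, learning rate, and iteration count are illustrative assumptions, not the paper's implementation.

import numpy as np

# Synthetic data: an intercept column plus one standard-normal feature (illustrative only).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])

# --- Linear regression: closed-form least squares (normal equations) ---
# beta_hat solves (X^T X) beta = X^T y, the minimizer of ||y - X beta||^2.
true_beta = np.array([2.0, -1.5])
y = X @ true_beta + rng.normal(scale=0.5, size=n)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print("OLS estimate:", beta_hat)

# --- Logistic regression: no closed form; iterate with gradient ascent ---
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Binary labels drawn from a logistic model with known coefficients.
y_bin = (rng.uniform(size=n) < sigmoid(X @ np.array([0.5, 2.0]))).astype(float)

w = np.zeros(X.shape[1])
learning_rate = 0.5
for _ in range(2000):
    p = sigmoid(X @ w)                # predicted probabilities
    grad = X.T @ (y_bin - p) / n      # gradient of the average log-likelihood
    w += learning_rate * grad         # ascent step toward the maximum-likelihood estimate
print("Logistic estimate:", w)

The closed-form step finishes with a single linear solve, whereas the logistic fit only approaches the maximum-likelihood estimate as the gradient iterations proceed, which is exactly the distinction between the two estimation strategies described in the abstract.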

