MONTE-CARLO STUDY OF SOME ROBUST ESTIMATORS: THE SIMPLE LINEAR REGRESSION CASE
Keywords: Robust estimation, Monte-Carlo, Lognormal and Cauchy

Abstract
In this study, the Least Trimmed Squares (LTS), Theil's pairwise median (Theil) and Bayesian (BAYES) estimation methods are compared with the Ordinary Least Squares Estimator (OLSE) via Monte-Carlo simulation. Variance, bias, mean square error (MSE) and relative mean square error (RMSE) are computed and used to evaluate and rank the estimators' performance. The simple linear regression model is investigated under the condition that the error term is drawn from one of three distributions: unit normal, lognormal and Cauchy. Theil's non-parametric estimation procedure was found to have the strongest and most reliable performance, and the second-best results were obtained from the LTS method. Although the Bayesian estimators are affected when the data deviate from normality, the results establish that in some cases they outperform all other competitors even under non-normal conditions (especially under the standard lognormal distribution), except when the error is drawn from a heavy-tailed distribution (lognormal and Cauchy). The OLSE is reliable only as long as the normality assumptions hold.
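
The simulation design summarised above can be illustrated with a short script. The following Python sketch uses assumed values for the sample size, true coefficients and number of replications (none of which are given in the abstract), draws errors from the unit normal, standard lognormal and Cauchy distributions, fits the OLSE and Theil's pairwise-median slope to each replicate, and reports the bias and MSE of the slope estimates; the LTS and Bayesian estimators are omitted for brevity.

# Minimal Monte-Carlo sketch of the comparison described above, restricted to
# OLSE and Theil's pairwise-median estimator. Sample size, true coefficients
# and replication count are assumed values, not taken from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
n, reps = 30, 1000            # assumed sample size and number of replications
beta0, beta1 = 1.0, 2.0       # assumed true intercept and slope
x = np.linspace(0.0, 10.0, n)

error_draws = {
    "unit normal": lambda: rng.standard_normal(n),
    "lognormal":   lambda: rng.lognormal(0.0, 1.0, n),    # standard lognormal
    "Cauchy":      lambda: rng.standard_cauchy(n),
}

for dist, draw in error_draws.items():
    slopes = {"OLSE": [], "Theil": []}
    for _ in range(reps):
        y = beta0 + beta1 * x + draw()
        slopes["OLSE"].append(np.polyfit(x, y, 1)[0])       # least-squares slope
        slopes["Theil"].append(stats.theilslopes(y, x)[0])  # pairwise-median slope
    for method, est in slopes.items():
        est = np.asarray(est)
        bias = est.mean() - beta1
        mse = np.mean((est - beta1) ** 2)
        print(f"{dist:12s} {method:5s} bias={bias:+8.3f}  MSE={mse:10.3f}")

Under the heavy-tailed error distributions the OLS summary statistics are dominated by extreme replicates, while the pairwise-median slope remains stable, which mirrors the qualitative ranking reported in the abstract.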
References
Adewole, A. I., and Fashoranbaku, O. A. (2021). Determination of Quantile Range of Optimal Hyperparameters using Bayesian Estimation. Tanzania Journal of Science, 47(1), 981-987. ISSN 0856-1761.
Agullo, J. (2001). A new algorithm for computing the Least Trimmed Squares regression estimator. Computational Statistics and Data Analysis.
Bayes, T. (1763). An essay towards solving a problem in the doctrine of chances. Philosophical Transactions, London.
Birkes, D., and Dodge, Y. (1993). Alternative Methods of Regression. New York, NY: Wiley.
Dietz, E. J. (1987). A comparison of robust estimators in simple linear regression. Communications in Statistics - Simulation and Computation.
Evans, M., Hastings, N., and Peacock, B. (1993). Statistical Distributions (2nd edition). New York, NY: Wiley.
Hussain, S. S., and Sprent, P. (1983). Nonparametric Regression. Journal of the Royal Statistical Society, Series A.
Jinlin, Z., Zhiqiang, G., and Zhihuan, S. (2017). Variational Bayesian Gaussian Mixture Regression for Soft Sensing Key Variables in Non-Gaussian Industrial Processes. IEEE Transactions on Control Systems Technology.
Laplace, P. S. (1800). Seances des Ecoles Normales.
Meenai, Y. A., and Yasmeen, F. (2008). Nonparametric regression analysis. Proceedings of the 8th Islamic Countries Conference of Statistical Sciences, Lahore, Pakistan.
Tipping, M. E., and Bishop, C. M. (1999). Probabilistic Principal Component Analysis. Journal of the Royal Statistical Society, Series B.
Rousseeuw, P. J., and Leroy, A. M. (1987). Robust Regression and Outlier Detection. New York, NY: Wiley.
Chulani, S. (1999). Bayesian analysis of empirical software engineering cost models. IEEE Transactions on Software Engineering.
Theil, H. (1950). A rank-invariant method of linear and polynomial regression analysis. Indagationes Mathematicae.
Bolstad, W. M. (2010). Understanding Computational Bayesian Statistics. Wiley Series in Computational Statistics. Hoboken, NJ: John Wiley & Sons.
Bolstad, W. M. (2004). Introduction to Bayesian Statistics. Hoboken, NJ: John Wiley & Sons.
Wolpert, R. L., and Mengersen, K. L. (2004). Adjusted Likelihoods for Synthesizing Empirical Evidence from Studies that Differ in Quality and Design: Effects of Environmental Tobacco Smoke. Statistical Science.

License
Copyright (c) 2024 Ayoade Adewole

This work is licensed under a Creative Commons Attribution 4.0 International License.