Balanced Bayesian LASSO for heavy tails

Daniel F Linder, Viral Panchal, Hani Samawi, Duchwan Ryu

Research output: Contribution to journal › Article

Abstract

Regression procedures are not only hindered by large p and small n; they can also suffer when outliers are present or the data-generating mechanism is heavy tailed. Since penalized estimators such as the least absolute shrinkage and selection operator (LASSO) handle the large p, small n setting by encouraging sparsity, we combine a LASSO-type penalty with the absolute deviation loss function, instead of the standard least squares loss, to handle outliers and heavy tails. The model is cast in a Bayesian setting, and a Gibbs sampler is derived to efficiently sample from the posterior distribution. We compare our method to existing methods in a simulation study, as well as on a prostate cancer data set and a base deficit data set from trauma patients.
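The loss–penalty combination the abstract describes can be sketched as a frequentist LAD-LASSO objective: absolute-deviation loss plus an L1 penalty. The function below is an illustrative minimizer only, using a generic derivative-free optimizer; it does not reproduce the paper's Bayesian formulation or its Gibbs sampler, and the function name and tuning choices are this sketch's own.

```python
import numpy as np
from scipy.optimize import minimize


def lad_lasso(X, y, lam):
    """Minimize sum_i |y_i - x_i'b| + lam * sum_j |b_j|.

    Illustrative sketch of the LAD-LASSO objective (absolute-deviation
    loss with an L1 penalty). The paper instead samples from the
    posterior of a Bayesian model; this point estimate is not that.
    """
    p = X.shape[1]

    def objective(b):
        # LAD loss is robust to outliers/heavy tails; the L1 penalty
        # encourages sparsity as in the standard LASSO.
        return np.sum(np.abs(y - X @ b)) + lam * np.sum(np.abs(b))

    # Nelder-Mead handles the nonsmooth objective for small p.
    res = minimize(objective, np.zeros(p), method="Nelder-Mead",
                   options={"maxiter": 20000})
    return res.x
```

With heavy-tailed noise (e.g. Student-t with 2 degrees of freedom), the LAD loss keeps the fit stable where a least-squares LASSO can be pulled toward extreme observations.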

Original language: English (US)
Pages (from-to): 1115-1132
Number of pages: 18
Journal: Journal of Statistical Computation and Simulation
Volume: 86
Issue number: 6
DOI: 10.1080/00949655.2015.1053886
State: Published - Apr 12 2016


Keywords

  • Lasso
  • heavy tail
  • loss function
  • outlier
  • regression
  • sparsity

ASJC Scopus subject areas

  • Statistics and Probability
  • Modeling and Simulation
  • Statistics, Probability and Uncertainty
  • Applied Mathematics

Cite this

Balanced Bayesian LASSO for heavy tails. / Linder, Daniel F; Panchal, Viral; Samawi, Hani; Ryu, Duchwan.

In: Journal of Statistical Computation and Simulation, Vol. 86, No. 6, 12.04.2016, p. 1115-1132.

@article{b37c44b807004ea2a9793a526ac1bad6,
title = "Balanced Bayesian LASSO for heavy tails",
abstract = "Regression procedures are not only hindered by large p and small n, but can also suffer in cases when outliers are present or the data generating mechanisms are heavy tailed. Since the penalized estimates like the least absolute shrinkage and selection operator (LASSO) are equipped to deal with the large p small n by encouraging sparsity, we combine a LASSO type penalty with the absolute deviation loss function, instead of the standard least squares loss, to handle the presence of outliers and heavy tails. The model is cast in a Bayesian setting and a Gibbs sampler is derived to efficiently sample from the posterior distribution. We compare our method to existing methods in a simulation study as well as on a prostate cancer data set and a base deficit data set from trauma patients.",
keywords = "Lasso, heavy tail, loss function, outlier, regression, sparsity",
author = "Linder, {Daniel F} and Viral Panchal and Hani Samawi and Duchwan Ryu",
year = "2016",
month = "4",
day = "12",
doi = "10.1080/00949655.2015.1053886",
language = "English (US)",
volume = "86",
pages = "1115--1132",
journal = "Journal of Statistical Computation and Simulation",
issn = "0094-9655",
publisher = "Taylor and Francis Ltd.",
number = "6",
}

TY  - JOUR
T1  - Balanced Bayesian LASSO for heavy tails
AU  - Linder, Daniel F
AU  - Panchal, Viral
AU  - Samawi, Hani
AU  - Ryu, Duchwan
PY  - 2016/4/12
Y1  - 2016/4/12
N2  - Regression procedures are not only hindered by large p and small n, but can also suffer in cases when outliers are present or the data generating mechanisms are heavy tailed. Since the penalized estimates like the least absolute shrinkage and selection operator (LASSO) are equipped to deal with the large p small n by encouraging sparsity, we combine a LASSO type penalty with the absolute deviation loss function, instead of the standard least squares loss, to handle the presence of outliers and heavy tails. The model is cast in a Bayesian setting and a Gibbs sampler is derived to efficiently sample from the posterior distribution. We compare our method to existing methods in a simulation study as well as on a prostate cancer data set and a base deficit data set from trauma patients.
AB  - Regression procedures are not only hindered by large p and small n, but can also suffer in cases when outliers are present or the data generating mechanisms are heavy tailed. Since the penalized estimates like the least absolute shrinkage and selection operator (LASSO) are equipped to deal with the large p small n by encouraging sparsity, we combine a LASSO type penalty with the absolute deviation loss function, instead of the standard least squares loss, to handle the presence of outliers and heavy tails. The model is cast in a Bayesian setting and a Gibbs sampler is derived to efficiently sample from the posterior distribution. We compare our method to existing methods in a simulation study as well as on a prostate cancer data set and a base deficit data set from trauma patients.
KW  - Lasso
KW  - heavy tail
KW  - loss function
KW  - outlier
KW  - regression
KW  - sparsity
UR  - http://www.scopus.com/inward/record.url?scp=84953838480&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84953838480&partnerID=8YFLogxK
U2  - 10.1080/00949655.2015.1053886
DO  - 10.1080/00949655.2015.1053886
M3  - Article
AN  - SCOPUS:84953838480
VL  - 86
SP  - 1115
EP  - 1132
JO  - Journal of Statistical Computation and Simulation
JF  - Journal of Statistical Computation and Simulation
SN  - 0094-9655
IS  - 6
ER  -