Fully robust versions of the elastic net estimator are introduced for linear
and logistic regression. The algorithms to compute the estimators are based on
the idea of repeatedly applying the non-robust classical estimators to data
subsets only. It is shown how outlier-free subsets can be identified
efficiently, and how appropriate tuning parameters for the elastic net
penalties can be selected. A final reweighting step improves the efficiency of
the estimators. Simulation studies compare with non-robust and other competing
robust estimators and reveal the superiority of the newly proposed methods.
This is also supported by a reasonable computation time and by good performance
in real data examples.
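The core idea of repeatedly applying a non-robust estimator to clean data subsets can be sketched as follows for the linear regression case. This is an illustrative reconstruction, not the authors' implementation: the naive coordinate-descent elastic net, the single random starting subset, the C-step-style subset updates, and the fixed tuning parameters (`lam`, `alpha`, subset size `h`) are all simplifying assumptions made for the sketch; the actual estimators use more elaborate subset searches, tuning-parameter selection, and a reweighting step.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in coordinate descent for the lasso part."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net(X, y, lam, alpha=0.5, n_iter=200):
    """Naive coordinate descent for the elastic net (no intercept, fixed sweeps).

    lam is the overall penalty strength, alpha the lasso/ridge mixing weight.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed from the fit
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            denom = (X[:, j] @ X[:, j]) / n + lam * (1.0 - alpha)
            beta[j] = soft_threshold(rho, lam * alpha) / denom
    return beta

def trimmed_elastic_net(X, y, h, lam, alpha=0.5, n_csteps=20, seed=0):
    """C-step-style iteration: fit the classical elastic net on a subset,
    then keep the h observations with the smallest absolute residuals.

    A single random start is used here for brevity; robust algorithms of this
    type typically run many starts to avoid bad local optima.
    """
    rng = np.random.default_rng(seed)
    subset = rng.choice(len(y), size=h, replace=False)
    for _ in range(n_csteps):
        beta = elastic_net(X[subset], y[subset], lam, alpha)
        resid = np.abs(y - X @ beta)
        new_subset = np.argsort(resid)[:h]
        if np.array_equal(np.sort(new_subset), np.sort(subset)):
            break  # subset is stable: the C-steps have converged
        subset = new_subset
    # refit so the returned coefficients match the returned subset
    beta = elastic_net(X[subset], y[subset], lam, alpha)
    return beta, subset
```

Because the elastic net is only ever fitted on the current subset, observations with large residuals (the potential outliers) are trimmed away in each step, which is what makes the resulting estimate resistant to contamination while still producing sparse coefficients.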