Abstract
Quantile regression is a popular method with a wide range of scientific applications, but its computation is challenging. Hunter and Lange proposed an MM algorithm for solving optimization problems in parametric quantile regression models. For nonparametric and semiparametric quantile regression, their algorithm can be applied to estimate unknown quantile functions in a pointwise manner. However, the resulting estimates may suffer from drawbacks such as nonsmoothness, discontinuity points, and instability at extreme quantile levels. To remedy these issues, we propose a new MM algorithm and show that it yields continuous, smoother estimated quantile functions with faster computation. We systematically study the new MM algorithm using the local linear quantile regression model. We prove that the proposed algorithm preserves the monotone descent property in an asymptotic sense. We then extend it to several popular nonparametric and semiparametric quantile regression models. For semiparametric models, we propose new efficient backfitting algorithms based on the new MM algorithm. Compared to traditional backfitting algorithms, the new procedures significantly reduce the computational cost of fully iterative backfitting. The performance of the proposed algorithms is demonstrated via extensive simulation studies and a real data example. Supplementary materials for this article are available online.
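To illustrate the kind of iteration the abstract refers to, the following is a minimal sketch (not the authors' implementation) of an MM algorithm of the Hunter–Lange type for parametric linear quantile regression. It majorizes an epsilon-perturbed check loss by a quadratic surrogate, so each MM step reduces to a weighted least-squares solve; the function name, default tolerances, and stopping rule are illustrative assumptions.

```python
import numpy as np

def mm_quantile_regression(X, y, tau, eps=1e-6, max_iter=200, tol=1e-8):
    """Sketch of a Hunter-Lange-style MM iteration for linear quantile
    regression at level tau (an illustrative implementation, not the
    authors' code).

    The eps-perturbed check loss is majorized by a quadratic surrogate,
    so each iteration solves a weighted least-squares problem with
    weights w_i = 1 / (eps + |r_i|) built from the current residuals.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting value
    for _ in range(max_iter):
        r = y - X @ beta                   # current residuals
        w = 1.0 / (eps + np.abs(r))        # MM weights from the majorizer
        # Normal equations of the quadratic surrogate:
        #   X' W X beta = X' W y + (2*tau - 1) * X' 1
        A = X.T @ (w[:, None] * X)
        b = X.T @ (w * y) + (2.0 * tau - 1.0) * X.sum(axis=0)
        beta_new = np.linalg.solve(A, b)
        if np.max(np.abs(beta_new - beta)) < tol:  # parameter change small
            return beta_new
        beta = beta_new
    return beta
```

With an intercept-only design, the fitted coefficient approximates the sample tau-quantile of y, which gives a quick sanity check of the iteration.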
Supplementary Materials
Appendix:
The Appendix contains all the proofs for this article. (appendix.pdf)
Code:
MATLAB code implementing the global MM algorithm. (code.zip)
Motorcycle dataset:
The motorcycle dataset is provided in the R package “MASS” as “mcycle”.
Disclosure Statement
The authors report there are no competing interests to declare.
Acknowledgments
The authors thank the editor, the associate editor, and two anonymous referees for their insightful comments, which have helped us substantially improve the quality of the article.