Research Article

Locally time-varying parameter regression

Pages 269-300 | Received 08 Jan 2023, Accepted 04 Mar 2024, Published online: 10 Apr 2024
 

Abstract

I discuss a framework that allows dynamic sparsity in time-varying parameter regression models. The conditional variances of the innovations to the time-varying parameters are themselves time varying and can be set exactly to zero adaptively via thresholding. The resulting model lets the dynamics of the time-varying parameters mix over different frequencies of parameter change in a data-driven way, permitting great flexibility while achieving model parsimony. A convenient strategy is discussed for inferring whether each coefficient is static or dynamic and, if dynamic, how frequently the parameter changes. An MCMC scheme is developed for model estimation. The performance of the proposed approach is illustrated in studies of both simulated and real economic data.

JEL codes:

Acknowledgments

I would like to thank Professor Esfandiar Maasoumi (the editor) and three referees for many invaluable comments that have greatly improved the article. All remaining errors are my own. The views in this article are solely the author’s responsibility and are not related to the company the author works for.

Disclosure statement

The author reports that there are no competing interests to declare.

Notes

1. In my experiments, allowing an individualized AR coefficient ρj for each latent variable zj,t appears to over-parametrize the model and tends to produce numerically unstable estimates of the stationary probability of a zero process variance, which is a key parameter for model inference in this article.

2. Examples include Frühwirth-Schnatter and Wagner (2010), Belmonte et al. (2014), Bitto and Frühwirth-Schnatter (2019), Cadonna et al. (2020), and Hauzenberger et al. (2020).

3. Another example is Huber et al. (2019), which applies a thresholding approximation to the MCMC draws of the mixture innovation model to speed up computation but suffers from convergence issues (Dufays et al. 2021).

4. See Kowal et al. (2019), which documents the superior performance of the dynamic horseshoe model in comparisons with other dynamic shrinkage models, such as the dynamic normal-gamma approach of Kalli and Griffin (2014).

5. In the decomposition equation, note that the union of θq and Θ∖θq is Θ.

6. Alternative approaches to simulating the latent states from a linear Gaussian state space system include Frühwirth-Schnatter (1994), Rue (2001), and McCausland et al. (2011).

7. The proposed sampler for the block {β, z, θq} can also be derived via the partially collapsed Gibbs sampler approach of Park and van Dyk (2009). I thank a reviewer for pointing this out.

8. Producing 1,000 posterior draws takes about 114 seconds in MATLAB R2020b on a standard desktop computer with a 3.0 GHz Intel Core i5 CPU.

9. The IF is computed by the initial monotone sequence method of Geyer (1992). A smaller IF value implies less correlated and hence better-mixed posterior draws.
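As a concrete illustration, the following MATLAB sketch computes the IF of a scalar chain with Geyer's initial monotone sequence estimator; the function name and the brute-force autocorrelation loop are illustrative choices rather than the article's actual code.

    function IF = inefficiency_factor(x)
    % Inefficiency factor (integrated autocorrelation time) of an MCMC chain,
    % estimated with Geyer's (1992) initial monotone sequence method.
    % Input:  x  - vector of posterior draws of a scalar parameter
    % Output: IF - estimated inefficiency factor (IF = 1 for iid draws)
    x = x(:) - mean(x);
    n = numel(x);
    rho = zeros(n, 1);                          % sample autocorrelations, lags 0..n-1
    for k = 0:n-1
        rho(k+1) = (x(1:n-k)' * x(k+1:n)) / (x' * x);
    end
    M   = floor(n/2) - 1;
    Gam = rho(1:2:2*M+1) + rho(2:2:2*M+2);      % Gamma(m) = rho(2m) + rho(2m+1)
    cut = find(Gam <= 0, 1);                    % initial positive sequence:
    if ~isempty(cut)                            % keep terms while Gamma(m) > 0
        Gam = Gam(1:cut-1);
    end
    Gam = cummin(Gam);                          % initial monotone sequence
    IF  = 2 * sum(Gam) - 1;                     % IF = 1 + 2*sum_{k>=1} rho(k)
    end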

10. The log growth rate is computed as the first difference of the logarithm of the U.S. industrial production index at the last month of each calendar quarter, multiplied by 100/3.
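A minimal MATLAB sketch of this computation, assuming ip is a monthly vector of the industrial production index (INDPRO) and dates is a matching datetime vector (both names are illustrative):

    isQtrEnd = ismember(month(dates), [3 6 9 12]);   % last month of each calendar quarter
    logIPq   = log(ip(isQtrEnd));                    % log index at quarter ends
    growth   = 100/3 * diff(logIPq);                 % quarterly log growth rate, as in the note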

11. The series names are INDPRO, TB3MS, and GS10.

12. Trace plots for the parameters log(v²), log(q/(1−q)), and log((1+ρ)/(1−ρ)) are provided in Appendix I.

13. Such a finding is consistent with previous studies such as Estrella et al. (2003).

14. Additional results of the point-wise posterior mean of ω are provided in Appendix I.

16. The results for the full set of regressors are provided in Appendix J.

17. To avoid the risk of a near-singular sample covariance matrix, one can add ϵIK to A in the (i+1)th draw, where ϵ is a small positive number (e.g., 1e-6).
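A one-line MATLAB sketch of this safeguard, assuming A is the K x K sample covariance matrix used in the (i+1)th draw:

    A = A + 1e-6 * eye(K);    % small ridge term keeps A safely away from singularity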

18. Sampling from the GIG distribution is done by adapting the MATLAB function gigrnd, written by Enes Makalic and Daniel Schmidt, which implements an algorithm from Devroye (2014).

19. The gamma distribution is parameterized such that the mean of G(a, b) is ab.
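If G(a, b) denotes shape a and scale b (so the mean is ab), this matches MATLAB's gamrnd(a, b); a quick sanity check:

    a = 2; b = 3;
    draws = gamrnd(a, b, 1e6, 1);   % G(a, b) draws with shape a and scale b
    mean(draws)                     % approximately a*b = 6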
