This experimental paper from 2009 by Akhmatskaya, Bou-Rabee and Reich in the Journal of Computational Physics takes a look at what people in statistics would call HMC with partial momentum refreshment. The paper got my attention mainly because I met Elena Akhmatskaya at MCMSki.
The ‘generalized’ in the paper refers to the fact that the momenta are not resampled at each new HMC step, but rather according to

$p' = \cos(\phi)\, p + \sin(\phi)\, \xi$

where $p$ is the old momentum, and we mix it with another Gaussian RV $\xi$ of the same distribution, for some parameter $\phi$. We recover HMC for $\phi = \pi/2$, as in that case the momentum is always completely refreshed.
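Just to convince myself that this mixing leaves the momentum distribution untouched (because $\cos^2\phi + \sin^2\phi = 1$), here is a tiny numpy check; the value of $\phi$ is just something I picked, not anything from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
phi = 0.3                                   # arbitrary mixing angle; phi = pi/2 would give plain HMC
p_old = rng.standard_normal(100_000)        # old momenta, standard normal
xi = rng.standard_normal(100_000)           # fresh noise from the same distribution
p_new = np.cos(phi) * p_old + np.sin(phi) * xi
print(p_new.mean(), p_new.var())            # ~0 and ~1: the Gaussian momentum distribution is preserved
```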
Also, they are looking at a so-called shadow HMC variant, which really samples from a Hamiltonian system that is not the original one, but close to it (the shadow Hamiltonian, which the leapfrog integrator conserves almost exactly, which is why acceptance rates go up). Correction for sampling from the wrong distribution is done by importance weighting (when she was telling me that I thought “I knew IS could work in high dimensions!”, as they sample in up to 10,000 dimensions and do better than HMC!).
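As far as I can tell, the importance weighting itself is the completely standard self-normalized kind: the chain targets something proportional to $\exp(-\tilde H)$ for the shadow Hamiltonian $\tilde H$, while we want expectations under $\exp(-H)$, so each sample gets a weight proportional to $\exp(\tilde H - H)$. A minimal numpy sketch (my own notation, not the paper's):

```python
import numpy as np

def shadow_is_estimate(f_vals, H_vals, H_shadow_vals):
    """Self-normalized importance sampling estimate of E[f] under exp(-H),
    given samples drawn (approximately) from exp(-H_shadow).
    All three arguments are per-sample arrays."""
    log_w = H_shadow_vals - H_vals      # log importance weight: H_shadow - H
    log_w -= log_w.max()                # subtract the max for numerical stability
    w = np.exp(log_w)
    return np.sum(w * f_vals) / np.sum(w)
```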
Now the main gist of the paper is that they were trying to get rid of momentum flips in the case of a rejected proposal, as it was previously thought that momentum flips lead to a worse sampler. The intuition would be that a momentum flip leads you back to where you started. However, the finding is that, to the contrary, their version without the flip (which still satisfies a detailed balance criterion) does worse in actual estimation tasks than the conservative Generalized Shadow HMC (GSHMC) sampler (a rough sketch of a flip-on-rejection step is below). I like it when things stay easy. But as this paper does not describe the actual GSHMC method in detail, I will have to go back to the original paper, as my interest is sparked.
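To make the flip business concrete, here is a rough numpy sketch of a single generalized HMC step with partial momentum refreshment and a momentum flip on rejection; the function names, the leapfrog details and the cos/sin parametrization are my shorthand for the general idea, not the paper's actual GSHMC scheme (which additionally uses the shadow Hamiltonian and importance weights).

```python
import numpy as np

def ghmc_step(q, p, U, grad_U, eps, n_leap, phi, rng):
    """One generalized HMC step: partial momentum refreshment, a leapfrog
    trajectory, and a Metropolis test with a momentum flip on rejection."""
    # 1) partially refresh the momentum with fresh Gaussian noise
    xi = rng.standard_normal(p.shape)
    p = np.cos(phi) * p + np.sin(phi) * xi
    # 2) leapfrog trajectory of n_leap steps with step size eps
    q_new, p_new = q.copy(), p.copy()
    p_new -= 0.5 * eps * grad_U(q_new)
    for _ in range(n_leap):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    p_new += 0.5 * eps * grad_U(q_new)   # undo the extra half kick from the last loop iteration
    # 3) Metropolis test on the change in total energy
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    if rng.random() < np.exp(-dH):
        return q_new, p_new              # accept the proposal
    return q, -p                         # reject: keep the old state but flip the momentum
```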