
Simpler (classical) and faster (quantum) algorithms for Gibbs partition functions

Srinivasan Arunachalam1, Vojtech Havlicek1,2, Giacomo Nannicini1, Kristan Temme1, and Pawel Wocjan1

1IBM Quantum, IBM T.J. Watson Research Center
2School of Mathematics, University of Bristol


Abstract

We present classical and quantum algorithms for approximating partition functions of classical Hamiltonians at a given temperature. Our work has two main contributions: first, we modify the classical algorithm of Stefankovic, Vempala and Vigoda (J. ACM, 56(3), 2009) to improve its sample complexity; second, we quantize this new algorithm, improving upon the previously fastest quantum algorithm for this problem, due to Harrow and Wei (SODA 2020).
The conventional approach to estimating partition functions requires approximating the means of Gibbs distributions at a set of inverse temperatures that form the so-called cooling schedule. The length of the cooling schedule directly affects the complexity of the algorithm. Combining our improved version of the algorithm of Stefankovic, Vempala and Vigoda with the paired-product estimator of Huber (Ann. Appl. Probab., 25(2), 2015), our new quantum algorithm uses a shorter cooling schedule than previously known. This length matches the optimal length conjectured by Stefankovic, Vempala and Vigoda. The quantum algorithm also achieves a quadratic advantage in the number of required quantum samples compared to the number of random samples drawn by the best classical algorithm, and its computational complexity has quadratically better dependence on the spectral gap of the Markov chains used to produce the quantum samples.
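The cooling-schedule approach described in the abstract can be sketched in a few lines: the ratio Z(β_max)/Z(0) is written as a telescoping product of ratios Z(β_{i+1})/Z(β_i), each of which is a mean over the Gibbs distribution at β_i. The tiny Ising chain, the linear schedule, and the exact sampler below are illustrative choices for a runnable demonstration, not the estimators analyzed in the paper (whose contribution is, in part, choosing the schedule adaptively so that it is short).

```python
# Minimal sketch of partition-function estimation via a cooling schedule,
# assuming a tiny Ising chain small enough to enumerate exactly.
import itertools
import numpy as np

rng = np.random.default_rng(0)

n = 6  # number of spins; 2^6 states, so exact enumeration is cheap
states = np.array(list(itertools.product([-1, 1], repeat=n)))
# Classical Hamiltonian: nearest-neighbor ferromagnetic couplings.
energies = np.array([-np.sum(s[:-1] * s[1:]) for s in states])

def exact_Z(beta):
    """Exact partition function, for comparison only."""
    return np.sum(np.exp(-beta * energies))

def gibbs_energy_samples(beta, m):
    """Draw m energy values from the exact Gibbs distribution at beta.
    (In the actual algorithms these samples come from a Markov chain.)"""
    p = np.exp(-beta * energies)
    p /= p.sum()
    return energies[rng.choice(len(states), size=m, p=p)]

# A simple, non-adaptive linear cooling schedule from beta=0 to beta=1.
betas = np.linspace(0.0, 1.0, 11)

# Telescoping product: each factor estimates Z(b1)/Z(b0) as
# E_{b0}[ exp(-(b1 - b0) * H(x)) ].
ratio_product = 1.0
for b0, b1 in zip(betas[:-1], betas[1:]):
    e = gibbs_energy_samples(b0, 20000)
    ratio_product *= np.mean(np.exp(-(b1 - b0) * e))

# Z(0) = 2^n is known in closed form, so this estimates Z(1).
estimate = exact_Z(0.0) * ratio_product
print(estimate, exact_Z(1.0))
```

The number of samples drawn per schedule step, times the schedule length, is the sample complexity that both the improved classical algorithm and its quantization reduce.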

Partition functions are the normalizing factors of Gibbs distributions. Estimating them is surprisingly useful: the zero-temperature partition function of a suitably chosen Hamiltonian can, for example, encode the number of colorings or the number of subgraphs of a graph. It is therefore important to understand how quantum algorithms can speed up their computation.
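The counting claim above can be made concrete: take H(x) to be the number of monochromatic edges of a vertex coloring x. Then only proper colorings have zero energy, so Z(β) converges to the number of proper colorings as β grows. The triangle graph and three colors below are an illustrative choice.

```python
# Sketch: the zero-temperature partition function of the "monochromatic
# edge" Hamiltonian counts proper graph colorings.
import itertools
import math

edges = [(0, 1), (1, 2), (0, 2)]  # triangle graph
q, n = 3, 3                        # 3 colors, 3 vertices

def H(coloring):
    """Number of monochromatic edges: zero iff the coloring is proper."""
    return sum(coloring[u] == coloring[v] for u, v in edges)

def Z(beta):
    """Partition function over all q^n colorings."""
    return sum(math.exp(-beta * H(c))
               for c in itertools.product(range(q), repeat=n))

proper = sum(H(c) == 0 for c in itertools.product(range(q), repeat=n))
# At large beta, improper colorings are exponentially suppressed,
# so Z(beta) approaches the proper-coloring count.
print(round(Z(50.0)), proper)  # both 6 for the triangle with 3 colors
```

Brute-force enumeration works only at toy scale; the point of the algorithms in the paper is to approximate such counts from Gibbs samples without enumerating states.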

References

[1] Ivona Bezáková, Daniel Štefankovič, Vijay V. Vazirani, and Eric Vigoda. Accelerating simulated annealing for the permanent and combinatorial counting problems. SIAM Journal on Computing, 37(5):1429–1454, 2008. doi:10.1137/050644033.

[2] Kurt Binder and Dieter W. Heermann. Monte Carlo Simulation in Statistical Physics: An Introduction (Graduate Texts in Physics). Springer, 2019.

[3] Gilles Brassard, Peter Hoyer, Michele Mosca, and Alain Tapp. Quantum amplitude amplification and estimation. Contemporary Mathematics, 305:53–74, 2002.

[4] Shouvanik Chakrabarti, Andrew M. Childs, Shih-Han Hung, Tongyang Li, Chunhao Wang, and Xiaodi Wu. Quantum algorithm for estimating volumes of convex bodies. arXiv:1908.03903, 2019.

[5] Guillaume Desjardins, Yoshua Bengio, and Aaron C. Courville. On tracking the partition function. In Advances in Neural Information Processing Systems, pages 2501–2509, 2011.

[6] Martin Dyer and Alan Frieze. Computing the volume of convex bodies: a case where randomness provably helps. Probabilistic Combinatorics and Its Applications, 44:123–170, 1991.

[7] Martin Dyer, Alan Frieze, and Ravi Kannan. A random polynomial-time algorithm for approximating the volume of convex bodies. Journal of the ACM, 38(1):1–17, 1991. doi:10.1145/102782.102783.

[8] Andrew Gelman and Xiao-Li Meng. Simulating normalizing constants: From importance sampling to bridge sampling to path sampling. Statistical Science, pages 163–185, 1998. doi:10.1214/ss/1028905934.

[9] Paul Glasserman. Monte Carlo Methods in Financial Engineering, volume 53. Springer Science & Business Media, 2013.

[10] Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016. http://www.deeplearningbook.org.

[11] Yassine Hamoudi and Frédéric Magniez. Quantum Chebyshev's inequality and applications. In 46th International Colloquium on Automata, Languages, and Programming (ICALP), volume 132, pages 69:1–69:16, 2019. doi:10.4230/LIPIcs.ICALP.2019.69.

[12] Aram W. Harrow and Annie Y. Wei. Adaptive quantum simulated annealing for Bayesian inference and estimating partition functions. In Proceedings of the Thirty-First Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 193–212, 2020. doi:10.1137/1.9781611975994.12.

[13] Geoffrey E. Hinton, Simon Osindero, and Yee-Whye Teh. A fast learning algorithm for deep belief nets. Neural Computation, 18(7):1527–1554, 2006. doi:10.1162/neco.2006.18.7.1527.

[14] Mark Huber. Approximation algorithms for the normalizing constant of Gibbs distributions. The Annals of Applied Probability, 25(2):974–985, 2015. doi:10.1214/14-AAP1015.

[15] Mark Huber and Sarah Schott. Using TPA for Bayesian inference. Bayesian Statistics, 9:257–282, 2010. doi:10.1093/acprof:oso/9780199694587.003.0009.

[16] Mark Jerrum and Alistair Sinclair. Approximating the permanent. SIAM Journal on Computing, 18(6):1149–1178, 1989. doi:10.1137/0218077.

[17] Mark R. Jerrum, Leslie G. Valiant, and Vijay V. Vazirani. Random generation of combinatorial structures from a uniform distribution. Theoretical Computer Science, 43:169–188, 1986. doi:10.1016/0304-3975(86)90174-X.

[18] Vladimir Kolmogorov. A faster approximation algorithm for the Gibbs partition function. In Proceedings of the 31st Conference on Learning Theory, volume 75 of Proceedings of Machine Learning Research, pages 228–249. PMLR, 2018.

[19] Qiang Liu and Alexander T. Ihler. Bounding the partition function using Hölder's inequality. In Proceedings of the 28th International Conference on Machine Learning (ICML), pages 849–856. Omnipress, 2011.

[20] F. Magniez, A. Nayak, J. Roland, and M. Santha. Search via quantum walk. SIAM Journal on Computing, 40(1):142–164, 2011. doi:10.1137/090745854.

[21] Ashley Montanaro. Quantum speedup of Monte Carlo methods. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 471(2181):20150301, 2015.

[22] Elchanan Mossel and Allan Sly. Exact thresholds for Ising–Gibbs samplers on general graphs. The Annals of Probability, 41(1):294–328, 2013. doi:10.1214/11-AOP737.

[23] Daniel Štefankovič, Santosh Vempala, and Eric Vigoda. Adaptive simulated annealing: A near-optimal connection between sampling and counting. Journal of the ACM, 56(3):1–36, 2009. doi:10.1145/1516512.1516520.

[24] Mario Szegedy. Quantum speed-up of Markov chain based algorithms. In 45th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 32–41. IEEE, 2004.

[25] J. P. Valleau and D. N. Card. Monte Carlo estimation of the free energy by multistage sampling. The Journal of Chemical Physics, 57(12):5457–5462, 1972.

[26] Eric Vigoda. Improved bounds for sampling colorings. Journal of Mathematical Physics, 41(3):1555–1569, 2000. doi:10.1063/1.533196.

[27] Marc Vuffray, Sidhant Misra, Andrey Lokhov, and Michael Chertkov. Interaction screening: Efficient and sample-optimal learning of Ising models. In Advances in Neural Information Processing Systems, pages 2595–2603, 2016.

[28] Pawel Wocjan and Anura Abeyesinghe. Speedup via quantum sampling. Physical Review A, 78(4):042336, 2008. doi:10.1103/PhysRevA.78.042336.

Cited by

[1] Patrick Rall, “Faster Coherent Quantum Algorithms for Phase, Energy, and Amplitude Estimation”, arXiv:2103.09717.

[2] Chen-Fu Chiang, Anirban Chowdhury, and Pawel Wocjan, “Space-efficient Quantization Method for Reversible Markov Chains”, arXiv:2206.06886.

[3] Garrett T. Floyd, David P. Landau, and Michael R. Geller, “Quantum algorithm for Wang-Landau sampling”, arXiv:2208.09543.

[4] Pawel Wocjan and Kristan Temme, “Szegedy Walk Unitaries for Quantum Maps”, arXiv:2107.07365.

[5] Patrick Rall and Bryce Fuller, “Amplitude Estimation from Quantum Signal Processing”, arXiv:2207.08628.

The above citations are from SAO/NASA ADS (last updated successfully 2022-09-05 12:10:09). The list may be incomplete as not all publishers provide suitable and complete citation data.

On Crossref’s cited-by service no data on citing works was found (last attempt 2022-09-05 12:10:06).
