Title
Unifying Presampling via Concentration Bounds
Abstract
Auxiliary-input (AI) idealized models, such as the auxiliary-input random oracle model (AI-ROM) and the auxiliary-input random permutation model (AI-RPM), play a critical role in assessing the non-uniform security of symmetric-key and hash-function constructions. However, obtaining security bounds in these models is often much more challenging. The presampling technique, initially introduced by Unruh (CRYPTO '07) for the AI-ROM and later extended to several other models by Coretti et al. (EUROCRYPT '18), generically reduces security proofs in AI models to proofs in much simpler bit-fixing (BF) models, making it much easier to obtain concrete bounds in AI models. As a result, the presampling technique has led to simpler proofs of many known bounds (e.g., for one-way functions) and has been applied to many settings where the compression technique appears intractable (e.g., Merkle-Damgard hashing). We study the possibility of lifting the presampling technique to the quantum world. To this end,
- We show that such a lifting would resolve a major open problem in quantum computing, which is closely related to the famous Aaronson-Ambainis conjecture (ITCS '11).
- Faced with this barrier, we give a new but equivalent bit-fixing model and a simple proof of the presampling technique for arbitrary oracle distributions in the classical setting, including the AI-ROM and AI-RPM. Our theorem matches the best-known security loss and unifies the previous presampling techniques of Coretti et al. (EUROCRYPT '18) and Coretti et al. (CRYPTO '18).
- Finally, we lift our new classical presampling technique to a novel "quantum bit-fixing" version of presampling. It matches the optimal security loss of classical presampling. Using our techniques, we give the first post-quantum non-uniform security bound for salted Merkle-Damgard hash functions and reprove the tight non-uniform security for function inversion by Chung et al. (FOCS '20).
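A rough sketch of the shape such classical presampling bounds typically take (illustrative only, not quoted from this paper; here S denotes the advice length, T the number of oracle queries, P the number of presampled/bit-fixed oracle points, and constants and logarithmic factors are omitted): the advantage of an (S,T)-attacker in the AI model is bounded by its advantage in the P-bit-fixing model plus an additive term,
\[
\varepsilon_{\mathrm{AI}}(S,T) \;\lesssim\; \varepsilon_{\mathrm{BF}(P)}(S,T) \;+\; \frac{ST}{P},
\]
and for unpredictability applications one may choose P on the order of ST to pay only a constant multiplicative loss,
\[
\varepsilon_{\mathrm{AI}}(S,T) \;\lesssim\; 2\cdot \varepsilon_{\mathrm{BF}(O(ST))}(S,T).
\]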
Year
2021
DOI
10.1007/978-3-030-90459-3_7
Venue
THEORY OF CRYPTOGRAPHY, TCC 2021, PT I
DocType
Conference
Volume
13042
ISSN
0302-9743
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order   Citations   PageRank
Siyao Guo       1       16          4.26
Qian Li         2       0           0.34
Qipeng Liu      3       0           0.34
Jiapeng Zhang   4       0           0.68