On the closest vector problem with a distance guarantee

Daniel Dadush, Oded Regev, Noah Stephens-Davidowitz

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract

We present a new efficient algorithm for the search version of the approximate Closest Vector Problem with Preprocessing (CVPP). Our algorithm achieves an approximation factor of O(n/√(log n)), improving on the previous best of O(n^1.5) due to Lagarias, Lenstra, and Schnorr [1]. We also show, somewhat surprisingly, that only O(n) vectors of preprocessing advice are sufficient to solve the problem (with the slightly worse approximation factor of O(n)). We remark that this still leaves a large gap with respect to the decisional version of CVPP, where the best known approximation factor is O(√(n/log n)) due to Aharonov and Regev [2]. To achieve these results, we show a reduction to the same problem restricted to target points that are close to the lattice, as well as a more efficient reduction to a harder problem, Bounded Distance Decoding with Preprocessing (BDDP). Combining either reduction with the previous best-known algorithm for BDDP by Liu, Lyubashevsky, and Micciancio (LLM) [3] gives our main result. In the setting of CVP without preprocessing, we also give a reduction from (1+ε)γ-approximate CVP to γ-approximate CVP where the target is at distance at most (1 + 1/ε) times the minimum distance (the length of the shortest non-zero vector); this reduction relies on the lattice sparsification techniques of Dadush and Kun [4]. As our final and most technical contribution, we present a substantially more efficient variant of the LLM algorithm (both in terms of running time and amount of preprocessing advice), and, via an improved analysis, show that it can decode up to a distance proportional to the reciprocal of the smoothing parameter of the dual lattice [5]. We show that this decoding radius is never smaller than that of the LLM algorithm, and that it can be larger by a factor of up to Ω̃(√n).
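
For context only (this sketch does not appear in the paper): the classical approach to CVPP behind the Lagarias-Lenstra-Schnorr bound cited above is, roughly, to store a strongly reduced basis as the preprocessing advice and to answer each query with Babai's nearest-plane decoder. A minimal Python sketch of that baseline decoder, assuming the lattice is spanned by the rows of a full-rank basis matrix B, is:

import numpy as np

def gram_schmidt(B):
    # Gram-Schmidt orthogonalization (without normalization) of the rows of B.
    Bstar = np.array(B, dtype=float)
    for i in range(Bstar.shape[0]):
        for j in range(i):
            mu = np.dot(Bstar[i], Bstar[j]) / np.dot(Bstar[j], Bstar[j])
            Bstar[i] -= mu * Bstar[j]
    return Bstar

def babai_nearest_plane(B, t):
    # Babai's nearest-plane decoder: given a basis B (one basis vector per row)
    # and a target t, return a lattice vector close to t by rounding against the
    # Gram-Schmidt vectors, from the last basis vector down to the first.
    Bstar = gram_schmidt(B)
    b = np.array(t, dtype=float)  # running residual: t minus the partial lattice vector
    for j in reversed(range(B.shape[0])):
        c = round(float(np.dot(b, Bstar[j]) / np.dot(Bstar[j], Bstar[j])))
        b -= c * np.asarray(B[j], dtype=float)
    return np.asarray(t, dtype=float) - b  # the lattice vector found

# Toy usage: basis vectors (2, 0) and (1, 3), target (2.3, 2.8).
B = np.array([[2.0, 0.0], [1.0, 3.0]])
t = np.array([2.3, 2.8])
v = babai_nearest_plane(B, t)
print(v, np.linalg.norm(t - v))  # -> [3. 3.] 0.728...

The approximation factor achieved by such a decoder is governed entirely by the quality of the preprocessed basis; the results summarized in the abstract instead rely on different preprocessing advice (O(n) vectors) and a finer analysis, rather than on this basis-plus-rounding template.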

Original language: English (US)
Title of host publication: Proceedings - IEEE 29th Conference on Computational Complexity, CCC 2014
Publisher: IEEE Computer Society
Pages: 98-109
Number of pages: 12
ISBN (Print): 9781479936267
DOI: 10.1109/CCC.2014.18
State: Published - 2014
Event: 29th Annual IEEE Conference on Computational Complexity, CCC 2014 - Vancouver, BC, Canada
Duration: Jun 11 2014 - Jun 13 2014



Keywords

  • BDD
  • BDDP
  • CVP
  • CVPP

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Computational Mathematics

Cite this

Dadush, D., Regev, O., & Stephens-Davidowitz, N. (2014). On the closest vector problem with a distance guarantee. In Proceedings - IEEE 29th Conference on Computational Complexity, CCC 2014 (pp. 98-109). [6875479] IEEE Computer Society. https://doi.org/10.1109/CCC.2014.18
