### Abstract

We study the polynomial reconstruction problem for low-degree multivariate polynomials over 𝔽₂. In this problem, we are given a set of points x ∈ {0, 1}^{n} and target values f(x) ∈ {0, 1} for each of these points, with the promise that there is a polynomial over 𝔽₂ of degree at most d that agrees with f on a 1 − ε fraction of the points. Our goal is to find a degree-d polynomial that has good agreement with f. We show that it is NP-hard to find a polynomial that agrees with f on more than a 1 − 2^{−d} + δ fraction of the points, for any ε, δ > 0. This holds even under the stronger promise that the polynomial fitting the data is in fact linear, while the algorithm is allowed to find a polynomial of degree d. Previously, the only known hardness of approximation (or even NP-completeness) was for the case d = 1, which follows from a celebrated result of Håstad [16]. In the setting of computational learning, our result shows the hardness of non-proper agnostic learning of parities, where the learner is allowed a low-degree polynomial over 𝔽₂ as a hypothesis. This is the first non-proper hardness result for this central problem in computational learning. Our results extend to multivariate polynomial reconstruction over any finite field.
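To make the problem statement concrete (this sketch is not from the paper), the following hypothetical `reconstruct` function solves a tiny instance by brute force: it enumerates every multilinear polynomial of degree at most d over 𝔽₂ and returns one with maximum agreement on the data. The search is exponential in the number of monomials, so it only illustrates the problem, not an efficient algorithm; the hardness result above says no efficient algorithm achieves agreement better than 1 − 2^{−d} + δ unless P = NP.

```python
from itertools import combinations, product

def reconstruct(points, values, d):
    """Exhaustively search for a degree-<=d polynomial over F_2 with
    maximum agreement on the given data. Exponential in the number of
    monomials -- suitable only for tiny n and d."""
    n = len(points[0])
    # Multilinear monomials of degree <= d, each identified by the subset
    # of variables it multiplies (the empty subset is the constant 1).
    monomials = [s for i in range(d + 1) for s in combinations(range(n), i)]

    def evaluate(coeffs, x):
        # Polynomial value at x: XOR (sum mod 2) of the active monomials.
        return sum(c * all(x[j] for j in s)
                   for c, s in zip(coeffs, monomials)) % 2

    best_coeffs, best_agree = None, -1
    for coeffs in product((0, 1), repeat=len(monomials)):
        agree = sum(evaluate(coeffs, x) == v for x, v in zip(points, values))
        if agree > best_agree:
            best_coeffs, best_agree = coeffs, agree
    return best_coeffs, best_agree / len(points)

# Example: f is the OR of two bits -- the parity x1 + x2 with one point
# corrupted. No linear (d = 1) polynomial fits all four points, but the
# degree-2 polynomial x1 + x2 + x1*x2 fits them exactly.
pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
vals = [0, 1, 1, 1]
_, frac1 = reconstruct(pts, vals, 1)   # best degree-1 agreement: 0.75
_, frac2 = reconstruct(pts, vals, 2)   # best degree-2 agreement: 1.0
```

The example mirrors the gap in the theorem on a toy scale: allowing the algorithm a higher degree than the promised polynomial changes how well the data can be fit.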

| Original language | English (US) |
| --- | --- |
| Title of host publication | Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2007 |
| Pages | 349-359 |
| Number of pages | 11 |
| DOIs | 10.1109/FOCS.2007.4389506 |
| State | Published - 2007 |
| Event | 48th Annual Symposium on Foundations of Computer Science, FOCS 2007 - Providence, RI, United States. Duration: Oct 20 2007 → Oct 23 2007 |

### Other

| Other | 48th Annual Symposium on Foundations of Computer Science, FOCS 2007 |
| --- | --- |
| Country | United States |
| City | Providence, RI |
| Period | 10/20/07 → 10/23/07 |

### ASJC Scopus subject areas

- Engineering (all)

### Cite this

Gopalan, Parikshit; Khot, Subhash; Saket, Rishi. **Hardness of reconstructing multivariate polynomials over finite fields.** In *Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2007*, pp. 349-359, 2007. Article 4389506. https://doi.org/10.1109/FOCS.2007.4389506

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Hardness of reconstructing multivariate polynomials over finite fields

AU - Gopalan, Parikshit

AU - Khot, Subhash

AU - Saket, Rishi

PY - 2007

Y1 - 2007

UR - http://www.scopus.com/inward/record.url?scp=46749152630&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=46749152630&partnerID=8YFLogxK

U2 - 10.1109/FOCS.2007.4389506

DO - 10.1109/FOCS.2007.4389506

M3 - Conference contribution

SN - 0769530109

SN - 9780769530109

SP - 349

EP - 359

BT - Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2007

ER -