Differentially-private learning of low dimensional manifolds

Anna Choromanska, Krzysztof Choromanski, Geetha Jagannathan, Claire Monteleoni

Research output: Contribution to journal › Article

Abstract

In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The difficulties one faces in learning in high dimensional spaces are compounded under the constraint of differential privacy. We achieve the dual goals of learning the manifold and maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends the random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity, which would degrade the usefulness of the trees. Instead, we present an alternative way of constructing differentially-private random projection trees that uses low-sensitivity queries that are nevertheless precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset, not on its extrinsic dimension.
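The low-sensitivity queries mentioned in the abstract can be illustrated with the standard Laplace mechanism: a query with global sensitivity Δ is answered with additive Laplace noise of scale Δ/ε. The sketch below is not the authors' algorithm; it is a minimal illustration of one random-projection split in the spirit of the Dasgupta–Freund RP-tree, where only a sensitivity-1 count query is privatized. The `epsilon` parameter and the non-private median cut are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_count(true_count, sensitivity=1.0, epsilon=0.5):
    # Laplace mechanism: a query with the given global sensitivity
    # is released with noise of scale sensitivity / epsilon.
    return true_count + rng.laplace(scale=sensitivity / epsilon)

def rp_split(points, epsilon=0.5):
    # One split of a random projection tree: project the points onto a
    # random unit direction and cut at the median of the projections.
    # (An exact median query has high sensitivity; the paper replaces
    # such queries with low-sensitivity ones. Here, for illustration,
    # only the sensitivity-1 size query is noised.)
    direction = rng.standard_normal(points.shape[1])
    direction /= np.linalg.norm(direction)
    projections = points @ direction
    threshold = np.median(projections)
    left = points[projections <= threshold]
    right = points[projections > threshold]
    noisy_left_size = laplace_count(len(left), epsilon=epsilon)
    return left, right, noisy_left_size
```

A count query has global sensitivity 1 because adding or removing one point changes it by at most 1, so the noise scale 1/ε suffices; queries that depend on exact point positions (such as the median above) can change arbitrarily under a single substitution, which is why a naive private construction fails.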

Original language: English (US)
Pages (from-to): 91-104
Number of pages: 14
Journal: Theoretical Computer Science
Volume: 620
DOI: 10.1016/j.tcs.2015.10.039
State: Published - Mar 21 2016


Keywords

  • Differential-privacy
  • Doubling dimension
  • Low dimensional manifolds
  • Random projection tree

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)

Cite this

Differentially-private learning of low dimensional manifolds. / Choromanska, Anna; Choromanski, Krzysztof; Jagannathan, Geetha; Monteleoni, Claire.

In: Theoretical Computer Science, Vol. 620, 21.03.2016, p. 91-104.

@article{1ce8c0d1042a47beaf50c00aecc6cb18,
title = "Differentially-private learning of low dimensional manifolds",
abstract = "In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The difficulties one faces in learning in high dimensional spaces are compounded under the constraint of differential privacy. We achieve the dual goals of learning the manifold and maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends the random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity, which would degrade the usefulness of the trees. Instead, we present an alternative way of constructing differentially-private random projection trees that uses low-sensitivity queries that are nevertheless precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset, not on its extrinsic dimension.",
keywords = "Differential-privacy, Doubling dimension, Low dimensional manifolds, Random projection tree",
author = "Anna Choromanska and Krzysztof Choromanski and Geetha Jagannathan and Claire Monteleoni",
year = "2016",
month = "3",
day = "21",
doi = "10.1016/j.tcs.2015.10.039",
language = "English (US)",
volume = "620",
pages = "91--104",
journal = "Theoretical Computer Science",
issn = "0304-3975",
publisher = "Elsevier",

}
