Science-advisor
08 February 2025
 
 Article overview



Privacy Preserving Online Convex Optimization
Prateek Jain; Pravesh Kothari; Abhradeep Thakurta
Date: 1 Sep 2011
Abstract: In this paper, we consider the problem of preserving privacy for online convex programming (OCP), an important online learning paradigm. We use the notion of differential privacy as our privacy measure. For this problem, we distill two critical attributes that a private OCP algorithm should have, namely, linearly decreasing sensitivity and a sub-linear regret bound. Assuming these two conditions, we provide a general framework for OCP that preserves privacy while guaranteeing a sub-linear regret bound. We then analyze the Implicit Gradient Descent (IGD) algorithm for OCP in our framework, and show an $\tilde{O}(\sqrt{T})$ regret bound while preserving differential privacy for Lipschitz continuous, strongly convex cost functions. We also analyze the Generalized Infinitesimal Gradient Ascent (GIGA) method, a popular OCP algorithm, in our privacy-preserving framework to obtain an $\tilde{O}(\sqrt{T})$ regret bound, albeit for a slightly more restricted class of strongly convex functions with Lipschitz continuous gradient. We then consider the practically important problem of online linear regression and show $O(\log^{1.5} T)$ regret for the Follow The Leader (FTL) method, while preserving differential privacy. Finally, we empirically demonstrate the effectiveness of our privacy-preserving algorithms for the problems of online linear regression and online logistic regression.
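To make the general recipe concrete, here is a minimal sketch of privacy-preserving online gradient descent for 1-D online linear regression. This is an illustrative assumption, not the paper's exact IGD/GIGA/FTL constructions: it only shows the rough mechanism behind such methods, namely clipping each gradient to bound sensitivity, adding noise whose scale shrinks with the round index $t$ (mirroring the "linearly decreasing sensitivity" requirement), and projecting back onto a bounded feasible set. All parameter names and scales below are hypothetical choices.

```python
import math
import random

def private_online_gradient_descent(data, lipschitz=1.0, noise_scale=0.5,
                                    radius=1.0, seed=0):
    """Illustrative private online gradient descent on squared loss.

    Each round: suffer loss on (x, y), clip the gradient to `lipschitz`
    (bounds per-round sensitivity), add noise that decays as 1/t, take an
    O(1/sqrt(t)) step, and project w back into [-radius, radius].
    Returns the final iterate and the cumulative loss.
    """
    rng = random.Random(seed)
    w = 0.0
    total_loss = 0.0
    for t, (x, y) in enumerate(data, start=1):
        pred = w * x
        total_loss += 0.5 * (pred - y) ** 2          # loss suffered this round
        grad = (pred - y) * x                        # gradient of squared loss
        grad = max(-lipschitz, min(lipschitz, grad)) # clip: bounded sensitivity
        grad += rng.gauss(0.0, noise_scale / t)      # privacy noise, decaying in t
        w -= grad / math.sqrt(t)                     # O(1/sqrt(t)) step size
        w = max(-radius, min(radius, w))             # project onto feasible set
    return w, total_loss
```

Despite the injected noise, the decaying noise scale and step size let the iterate track the best fixed predictor, which is the intuition behind retaining a sub-linear regret bound under privacy.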
Source: arXiv, 1109.0105
Copyright © 2005-2025 - Scimetrica