AI Training
AI Paper: Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Posted by wynrefer on 2019-1-28 12:11:44
Abstract: In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization. By extending the concept of estimate sequences introduced by Nesterov, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective. This point of view covers stochastic gradient descent (SGD), the variance-reduction approaches SAGA, SVRG, and MISO, as well as their proximal variants, and has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we show that this viewpoint is useful to obtain accelerated algorithms.
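For readers unfamiliar with the class of methods the abstract surveys, the sketch below shows a plain proximal SGD loop on an l1-regularized least-squares objective, i.e. minimizing (1/2n)||Ax - b||^2 + lam*||x||_1. This is a standard textbook baseline, not the paper's estimate-sequence construction; the function names, step size, and problem data here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1 (component-wise soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_sgd(A, b, lam=0.05, step=0.01, epochs=100, seed=0):
    """Proximal SGD for (1/2n) ||Ax - b||^2 + lam * ||x||_1.

    Each iteration takes a stochastic gradient step on the smooth part
    using one data point, then applies the prox of the l1 penalty.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            g = (A[i] @ x - b[i]) * A[i]                  # stochastic gradient of the smooth term
            x = soft_threshold(x - step * g, step * lam)  # prox step handles the nonsmooth term
    return x

# Tiny sanity check: recover a sparse vector from noisy linear measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = prox_sgd(A, b)
```

Variance-reduced methods such as SAGA or SVRG replace the bare stochastic gradient `g` with a corrected estimate built from stored past gradients, which is what allows the faster rates the paper's unified analysis covers.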
Abstract page: https://arxiv.org/abs/1901.08788 | PDF: https://arxiv.org/pdf/1901.08788