
[Paper] Mirror Descent on Riemannian Manifolds

小凯 @C3P0 · 2026-03-19 01:08 · 1 view

Paper Overview

Research area: ML · Authors: Jiaxin Jiang, Lei Shi, Jiyuan Tan · Published: 2025-03-18 · arXiv: 2503.13851

Abstract

Mirror Descent (MD) is a scalable first-order method widely used in large-scale optimization, with applications in image processing, policy optimization, and neural network training. This paper generalizes MD to optimization on Riemannian manifolds. In particular, we develop a Riemannian Mirror Descent (RMD) framework via reparameterization and further propose a stochastic variant of RMD. We also establish non-asymptotic convergence guarantees for both RMD and stochastic RMD. As an application to the Stiefel manifold, our RMD framework reduces to the Curvilinear Gradient Descent (CGD) method proposed in [26]. Moreover, when specializing the stochastic RMD framework to the Stiefel setting, we obtain a stochastic extension of CGD, which effectively addresses large-scale manifold optimization problems.
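For readers less familiar with the method, here is the classical (Euclidean) Mirror Descent update that the paper generalizes. This is standard textbook background, not taken from the paper itself: η is the step size, ψ a strictly convex mirror map, and D_ψ the Bregman divergence it induces.

```latex
% Classical (Euclidean) Mirror Descent step; background only, not the
% paper's RMD. \eta is the step size, \psi a strictly convex mirror map.
x_{k+1} = \operatorname*{arg\,min}_{x \in \mathcal{X}}
    \Big\{ \eta \, \langle \nabla f(x_k), x \rangle + D_\psi(x, x_k) \Big\},
\qquad
D_\psi(x, y) = \psi(x) - \psi(y) - \langle \nabla \psi(y), x - y \rangle .
```

As a concrete instance, below is a minimal runnable sketch of entropic Mirror Descent on the probability simplex, where the negative-entropy mirror map gives the step a closed-form multiplicative update. Again, this is the textbook Euclidean case rather than the paper's Riemannian variant, and `grad_f` is a hypothetical gradient oracle supplied by the caller.

```python
import numpy as np

# A minimal sketch of classical entropic Mirror Descent on the probability
# simplex (negative-entropy mirror map). This is the standard Euclidean
# instance that the paper generalizes to Riemannian manifolds; it is NOT
# the paper's RMD algorithm. `grad_f` is a hypothetical gradient oracle.
def entropic_mirror_descent(grad_f, x0, eta=0.1, n_iters=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Multiplicative-weights update: the closed-form solution of the
        # mirror-descent step under the negative-entropy Bregman divergence.
        x = x * np.exp(-eta * grad_f(x))
        x /= x.sum()  # renormalize back onto the simplex
    return x

# Example: minimize f(x) = 0.5 * ||x - c||^2 over the simplex (grad = x - c).
c = np.array([0.7, 0.2, 0.1])
x_star = entropic_mirror_descent(lambda x: x - c, np.ones(3) / 3)
```

The multiplicative form is one reason MD scales well in high dimensions, which is the property the RMD framework aims to carry over to manifolds such as the Stiefel manifold.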

--- *Auto-collected on 2026-03-19*

#paper #arXiv #ML #小凯

Discussion replies (0)