Hiroyuki Kasai

Research interests

Overview
Optimization theory, machine learning, and signal processing (since 2013)
Multimedia coding & delivery protocols & systems (since 1996)
Recent works
Here are some recent works.

Riemannian optimization: overview
Let f be a smooth real-valued function on a Riemannian manifold M. We consider

    min_{w ∈ M} f(w),

where w is the model variable. This problem has many applications; for example, principal component analysis (PCA) and the subspace tracking problem are defined on the Grassmann manifold. The low-rank matrix & tensor completion problem is a promising application concerning the manifold of fixed-rank matrices/tensors. The linear regression problem is also defined on the manifold of fixed-rank matrices. Independent component analysis (ICA) is an example defined on the oblique manifold. This problem can be solved by Riemannian deterministic algorithms (Riemannian steepest descent, conjugate gradient descent, the quasi-Newton method, Newton's method, the trust-region (TR) algorithm, etc.) or by the Riemannian stochastic algorithms mentioned below.
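As a minimal illustration of Riemannian steepest descent, the sketch below works on the unit sphere, one of the simplest Riemannian manifolds, minimizing f(w) = wᵀAw subject to ‖w‖ = 1 (i.e., finding the smallest eigenvector of A). The test matrix, stepsize rule, and function names are illustrative assumptions, not taken from the works above:

```python
import numpy as np

def riemannian_sd_sphere(A, w0, iters=2000):
    """Riemannian steepest descent on the unit sphere S^{n-1},
    minimizing f(w) = w^T A w (converges to a smallest eigenvector of A).
    Stepsize 0.25/||A||_2 is an illustrative choice, not a tuned one."""
    step = 0.25 / np.linalg.norm(A, 2)
    w = w0 / np.linalg.norm(w0)
    for _ in range(iters):
        egrad = 2.0 * A @ w                  # Euclidean gradient of f
        rgrad = egrad - (w @ egrad) * w      # project onto the tangent space at w
        w = w - step * rgrad                 # move along the tangent direction
        w /= np.linalg.norm(w)               # retraction: map back onto the sphere
    return w

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B.T @ B                                  # symmetric positive semidefinite test matrix
w = riemannian_sd_sphere(A, rng.standard_normal(5))
```

The two manifold-specific ingredients are the tangent-space projection (turning the Euclidean gradient into a Riemannian one) and the retraction (here, simple renormalization); other manifolds such as the Grassmannian swap in their own versions of these two operations.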
Riemannian stochastic gradient algorithms and theoretical convergence analysis
We specifically consider

    min_{w ∈ M} f(w) = (1/n) Σ_{i=1}^{n} f_i(w),

where n is the total number of elements, which is generally extremely large. A popular choice is the Riemannian stochastic gradient descent algorithm (RSGD). As RSGD calculates the gradient only for the i-th sample, the complexity per iteration is independent of the size n. However, RSGD is hindered by a slow convergence rate. To address this, we propose the Riemannian stochastic variance reduced gradient algorithm (RSVRG), which reduces the variance of the noisy stochastic gradient. R-SQN-VR has also recently been proposed, which achieves practical improvements for ill-conditioned problems. We also propose the Riemannian stochastic recursive gradient algorithm (RSRG), which does not require the inverse of the retraction between two distant iterates on the manifold. Convergence analyses of these algorithms are provided for both retraction-convex and nonconvex functions under computationally efficient retraction and vector transport operations.
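A minimal sketch of the RSGD idea on the unit sphere, applied to a toy streaming-PCA objective f_i(w) = -(xᵢᵀw)²: each update touches a single sample, so its cost is independent of n. The data, stepsize schedule, and names are assumptions for illustration and do not come from the papers above:

```python
import numpy as np

def rsgd_pca_sphere(X, w0, step0=0.01, epochs=30, seed=1):
    """Riemannian SGD on the unit sphere for the finite-sum problem
    min f(w) = (1/n) sum_i f_i(w), with f_i(w) = -(x_i^T w)^2.
    One sample per update -> per-iteration cost independent of n."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    w = w0 / np.linalg.norm(w0)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            step = step0 / (1.0 + 0.001 * t)    # slowly decaying stepsize (illustrative)
            egrad = -2.0 * (X[i] @ w) * X[i]    # Euclidean gradient of the single f_i
            rgrad = egrad - (w @ egrad) * w     # tangent-space projection at w
            w = w - step * rgrad
            w /= np.linalg.norm(w)              # retraction back onto the sphere
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 8))
X[:, 0] *= 3.0                                  # plant a dominant principal direction
w = rsgd_pca_sphere(X, rng.standard_normal(8))
```

The noisy single-sample gradient is what variance-reduced variants such as RSVRG and RSRG improve upon: they periodically anchor the stochastic gradient to a full (or recursive) gradient estimate so the variance vanishes near a solution.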
Riemannian inexact trust-region algorithm and theoretical analysis
The Riemannian trust-region algorithm (RTR) comes with a global convergence property and a superlinear local convergence rate to a second-order optimal point. However, handling large Hessian matrices is computationally prohibitive in large-scale settings. The proposed algorithm approximates the gradient and the Hessian, in addition to inexactly solving the TR subproblem. To address large-scale finite-sum problems, we specifically propose subsampled algorithms (Sub-RTR) with a fixed bound on the subsampled Hessian and gradient sizes, where the gradient and the Hessian are computed by a random sampling technique.
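The subsampling idea can be sketched in a few lines for a Euclidean finite-sum least-squares loss: the gradient and Hessian fed to the trust-region step are estimated from |S| ≪ n randomly drawn samples. This is a hypothetical illustration of the sampling mechanism only, not the authors' algorithm (which also handles the manifold geometry and the inexact subproblem solve):

```python
import numpy as np

def subsampled_grad_hess(X, y, w, batch, rng):
    """Subsampled gradient and Hessian of the finite-sum loss
    f(w) = (1/n) sum_i (x_i^T w - y_i)^2, estimated from 'batch'
    samples drawn without replacement (batch = n recovers the exact values)."""
    idx = rng.choice(X.shape[0], size=batch, replace=False)
    Xs, ys = X[idx], y[idx]
    r = Xs @ w - ys
    grad = 2.0 * Xs.T @ r / batch       # subsampled gradient estimate
    hess = 2.0 * Xs.T @ Xs / batch      # subsampled Hessian estimate
    return grad, hess

rng = np.random.default_rng(2)
X = rng.standard_normal((1000, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.standard_normal(1000)
w = np.zeros(4)
# 100 of 1000 samples: far cheaper than the full Hessian, at controlled accuracy
g_sub, H_sub = subsampled_grad_hess(X, y, w, batch=100, rng=rng)
```

Keeping the subsample sizes bounded fixes the per-iteration cost; the analysis then has to account for the sampling error in both the model and the (inexact) subproblem solution.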
Riemannian manifold geometry and its application to optimization and machine learning problems
Applications of Riemannian manifold optimization
Low-rank matrix and tensor approximation
Stochastic optimization
