# Low-Rank Matrix Completion

16 papers with code • 0 benchmarks • 0 datasets

**Low-Rank Matrix Completion** is an important problem with applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix given only a small number of its observed entries.
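For intuition, one classic approach replaces missing entries with the current estimate and then shrinks the singular values (the SoftImpute idea). A minimal sketch, assuming a synthetic rank-2 matrix with half its entries observed; all names and parameters here are illustrative, not taken from any paper below:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a ground-truth rank-2 matrix and observe roughly 50% of its entries.
m, n, r = 30, 30, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5  # True where an entry is observed

def soft_impute(M, mask, tau=0.5, iters=200):
    """Iterative soft-thresholded SVD: impute, shrink, repeat."""
    X = np.zeros_like(M)
    for _ in range(iters):
        # Keep observed entries, fill unobserved ones with the estimate.
        Y = np.where(mask, M, X)
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        # Soft-threshold the singular values (nuclear-norm shrinkage).
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
    return X

X = soft_impute(M, mask)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

Because the true matrix is low-rank and enough entries are sampled, the estimate typically comes close to the full matrix, not just the observed part.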

Source: Universal Matrix Completion


# Greatest papers with code

# Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport

In recent years, stochastic variance reduction algorithms have attracted considerable attention for minimizing the average of a large but finite number of loss functions.
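The core variance-reduction trick anchors each stochastic gradient at a periodic full-gradient snapshot. A minimal Euclidean SVRG sketch on an average of least-squares losses (illustrative only; the paper above operates on Riemannian manifolds with retraction and vector transport, which this sketch omits):

```python
import numpy as np

rng = np.random.default_rng(0)

# Average of n losses f_i(x) = 0.5 * (a_i @ x - b_i)^2.
n, d = 200, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def svrg(A, b, lr=0.02, epochs=30):
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()
        full_grad = A.T @ (A @ x_snap - b) / n      # full gradient at snapshot
        for _ in range(n):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ x - b[i])           # stochastic gradient at x
            gi_snap = A[i] * (A[i] @ x_snap - b[i])  # same sample at snapshot
            # Variance-reduced update: unbiased, with shrinking variance.
            x -= lr * (gi - gi_snap + full_grad)
    return x

x_hat = svrg(A, b)
```

The correction term `gi - gi_snap + full_grad` has the same expectation as `gi` but its variance vanishes as the iterate approaches the snapshot, which is what allows a constant step size.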

# Riemannian stochastic variance reduced gradient on Grassmann manifold

In this paper, we propose a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm (R-SVRG) to a compact manifold search space.

# Depth Image Inpainting: Improving Low Rank Matrix Completion with Low Gradient Regularization

The proposed low-gradient regularization is integrated with low-rank regularization into a combined low-rank, low-gradient approach for depth image inpainting.

# Structured Low-Rank Algorithms: Theory, MR Applications, and Links to Machine Learning

In this survey, we provide a detailed review of recent advances in recovering continuous-domain multidimensional signals from a few non-uniform (multichannel) measurements using structured low-rank matrix completion formulations.

# Orthogonal Rank-One Matrix Pursuit for Low Rank Matrix Completion

Numerical results show that our proposed algorithm is more efficient than competing algorithms while achieving similar or better prediction performance.
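The pursuit idea can be sketched as: at each step, take the top singular pair of the residual on the observed entries as a new rank-one basis matrix, then refit all weights by least squares over the observations. A simplified illustration, not the authors' implementation; sizes and names are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, r = 20, 20, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6  # True where an entry is observed

def rank_one_pursuit(M, mask, steps=10):
    """Greedy pursuit: add one rank-one basis matrix per step,
    then refit the weights by least squares on observed entries."""
    bases = []                    # rank-one basis matrices u v^T
    X = np.zeros_like(M)
    for _ in range(steps):
        R = np.where(mask, M - X, 0.0)            # residual on observations
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        bases.append(np.outer(U[:, 0], Vt[0]))    # top singular pair of R
        # Least-squares refit of all weights over the observed entries.
        B = np.stack([bk[mask] for bk in bases], axis=1)
        w, *_ = np.linalg.lstsq(B, M[mask], rcond=None)
        X = sum(wk * bk for wk, bk in zip(w, bases))
    return X

X = rank_one_pursuit(M, mask)
obs_err = np.linalg.norm((X - M)[mask]) / np.linalg.norm(M[mask])
```

Each iteration costs one truncated SVD of the residual plus a small least-squares solve, which is the source of the efficiency claimed in the abstract.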

# Deep Generalization of Structured Low-Rank Algorithms (Deep-SLR)

The main challenge with this strategy is the high computational complexity of matrix completion.

# Adaptive Matrix Completion for the Users and the Items in Tail

In this work, we show that the skewed distribution of ratings in the user-item rating matrix of real-world datasets affects the accuracy of matrix-completion-based approaches.

# Guaranteed Rank Minimization via Singular Value Projection

Minimizing the rank of a matrix subject to affine constraints is a fundamental problem with many important applications in machine learning and statistics.
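Singular Value Projection alternates a gradient step on the observed-entry residual with a hard projection onto rank-r matrices via truncated SVD. A rough sketch under a uniform random sampling assumption; the 1/p step scaling follows the usual analysis, and all identifiers here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, r = 30, 30, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5  # True where an entry is observed

def svp(M, mask, rank, iters=100):
    """Gradient step on the observed residual, then hard rank projection."""
    X = np.zeros_like(M)
    p = mask.mean()  # empirical sampling density, used to scale the step
    for _ in range(iters):
        G = np.where(mask, X - M, 0.0)   # gradient of the data-fit term
        Y = X - G / p                    # gradient step, scaled by 1/p
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        # Project onto the set of rank-`rank` matrices (truncated SVD).
        X = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]
    return X

X_svp = svp(M, mask, r)
svp_err = np.linalg.norm(X_svp - M) / np.linalg.norm(M)
```

Unlike nuclear-norm shrinkage, the rank constraint is enforced exactly at every iteration, which keeps each iterate cheap to store and multiply.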

# A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate.