Olivier Fercoq

ORCID: 0000-0002-3393-9757

According to our database, Olivier Fercoq authored at least 44 papers between 2012 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four (see the sketch below).
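As a rough illustration (not part of the bibliographic record): a collaborative distance such as the Erdős number is the length of the shortest path between two researchers in the co-authorship graph, where an edge links any pair of authors who share a paper. The following Python sketch computes such a distance with a breadth-first search over a small, purely hypothetical co-authorship graph; the author names are placeholders.

from collections import deque

# Hypothetical co-authorship graph: an edge links two researchers who have
# written at least one paper together. Names are placeholders, not real data.
coauthors = {
    "A": {"B"},
    "B": {"A", "C"},
    "C": {"B", "D"},
    "D": {"C"},
}

def collaborative_distance(graph, source, target):
    """Shortest co-authorship path length between source and target (BFS)."""
    if source == target:
        return 0
    seen = {source}
    queue = deque([(source, 0)])
    while queue:
        author, dist = queue.popleft()
        for coauthor in graph.get(author, ()):
            if coauthor == target:
                return dist + 1
            if coauthor not in seen:
                seen.add(coauthor)
                queue.append((coauthor, dist + 1))
    return None  # the two authors are not connected

print(collaborative_distance(coauthors, "A", "D"))  # prints 3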

Bibliography

2023
Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient.
Open J. Math. Optim., January, 2023

Solving stochastic weak Minty variational inequalities without increasing batch size.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

2022
Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems.
Math. Oper. Res., November, 2022

On the Convergence of Stochastic Primal-Dual Hybrid Gradient.
SIAM J. Optim., 2022

Power-aware feature selection for optimized Analog-to-Feature converter.
Microelectron. J., 2022

Escaping limit cycles: Global convergence for constrained nonconvex-nonconcave minimax problems.
Proceedings of the Tenth International Conference on Learning Representations, 2022

2021
Scalable Semidefinite Programming.
SIAM J. Math. Data Sci., 2021

A generic coordinate descent solver for non-smooth convex optimisation.
Optim. Methods Softw., 2021

2020
An adaptive primal-dual framework for nonsmooth convex minimization.
Math. Program. Comput., 2020

Screening Rules and its Complexity for Active Set Identification.
CoRR, 2020

Restarting the accelerated coordinate descent method with a rough strong convexity estimate.
Comput. Optim. Appl., 2020

Feature selection algorithms for flexible analog-to-feature converter.
Proceedings of the 18th IEEE International New Circuits and Systems Conference, 2020

Improved Optimistic Algorithms for Logistic Bandits.
Proceedings of the 37th International Conference on Machine Learning, 2020

Random extrapolation for primal-dual coordinate descent.
Proceedings of the 37th International Conference on Machine Learning, 2020

2019
A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions.
SIAM J. Optim., 2019

Improving Evolutionary Strategies with Generative Neural Networks.
CoRR, 2019

Stochastic Conditional Gradient Method for Composite Convex Minimization.
CoRR, 2019

Stochastic Frank-Wolfe for Composite Convex Minimization.
Proceedings of the Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, 2019

A Conditional-Gradient-Based Augmented Lagrangian Framework.
Proceedings of the 36th International Conference on Machine Learning, 2019

Safe Grid Search with Optimal Complexity.
Proceedings of the 36th International Conference on Machine Learning, 2019

Almost surely constrained convex optimization.
Proceedings of the 36th International Conference on Machine Learning, 2019

Benchmarking GNN-CMA-ES on the BBOB noiseless testbed.
Proceedings of the Genetic and Evolutionary Computation Conference Companion, 2019

2018
A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization.
SIAM J. Optim., 2018

Neural Generative Models for Global Optimization with Gradients.
CoRR, 2018

A Conditional Gradient Framework for Composite Convex Minimization with Applications to Semidefinite Programming.
Proceedings of the 35th International Conference on Machine Learning, 2018

Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2018

2017
Gap Safe Screening Rules for Sparsity Enforcing Penalties.
J. Mach. Learn. Res., 2017

Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization.
Proceedings of the Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 2017

Data sparse nonparametric regression with ε-insensitive losses.
Proceedings of The 9th Asian Conference on Machine Learning, 2017

2016
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent.
SIAM Rev., 2016

Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression.
CoRR, 2016

Joint quantile regression in vector-valued RKHSs.
Proceedings of the Advances in Neural Information Processing Systems 29: Annual Conference on Neural Information Processing Systems 2016, 2016

GAP Safe Screening Rules for Sparse-Group Lasso.
Proceedings of the Advances in Neural Information Processing Systems 29: Annual Conference on Neural Information Processing Systems 2016, 2016

SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization.
Proceedings of the 33rd International Conference on Machine Learning, 2016

Using big steps in coordinate descent primal-dual algorithms.
Proceedings of the 55th IEEE Conference on Decision and Control, 2016

2015
Accelerated, Parallel, and Proximal Coordinate Descent.
SIAM J. Optim., 2015

GAP Safe screening rules for sparse multi-task and multi-class models.
Proceedings of the Advances in Neural Information Processing Systems 28: Annual Conference on Neural Information Processing Systems 2015, 2015

Mind the duality gap: safer rules for the Lasso.
Proceedings of the 32nd International Conference on Machine Learning, 2015

2014
Synchronisation and control of proliferation in cycling cell population models with age structure.
Math. Comput. Simul., 2014

Fast distributed coordinate descent for non-strongly convex losses.
Proceedings of the IEEE International Workshop on Machine Learning for Signal Processing, 2014

2013
Ergodic Control and Polyhedral Approaches to PageRank Optimization.
IEEE Trans. Autom. Control., 2013

Smooth minimization of nonsmooth functions with parallel coordinate descent methods.
CoRR, 2013

Parallel Coordinate Descent for the Adaboost Problem.
Proceedings of the 12th International Conference on Machine Learning and Applications, 2013

2012
PageRank optimization applied to spam detection.
Proceedings of the 6th International Conference on Network Games, 2012
