Ahmet Alacaoglu

Orcid: 0000-0002-2911-7048

According to our database, Ahmet Alacaoglu authored at least 21 papers between 2017 and 2024.



Bibliography

2024
Extending the Reach of First-Order Algorithms for Nonconvex Min-Max Problems with Cohypomonotonicity.
CoRR, 2024

2023
Beyond the Golden Ratio for Variational Inequality Algorithms.
J. Mach. Learn. Res., 2023

Complexity of Single Loop Algorithms for Nonlinear Programming with Stochastic Objective and Constraints.
CoRR, 2023

Variance Reduced Halpern Iteration for Finite-Sum Monotone Inclusions.
CoRR, 2023

Convergence of First-Order Methods for Constrained Nonconvex Optimization with Dependent Data.
Proceedings of the International Conference on Machine Learning, 2023

2022
On the Convergence of Stochastic Primal-Dual Hybrid Gradient.
SIAM J. Optim., 2022

Convergence and Complexity of Stochastic Subgradient Methods with Dependent Data for Nonconvex Optimization.
CoRR, 2022

On the Complexity of a Practical Primal-Dual Coordinate Method.
CoRR, 2022

A Natural Actor-Critic Framework for Zero-Sum Markov Games.
Proceedings of the International Conference on Machine Learning, 2022

Stochastic Variance Reduction for Variational Inequality Methods.
Proceedings of the Conference on Learning Theory, 2022

2021
Adaptation in Stochastic Algorithms: From Nonsmooth Optimization to Min-Max Problems and Beyond.
PhD thesis, 2021

Forward-reflected-backward method with variance reduction.
Comput. Optim. Appl., 2021

Convergence of adaptive algorithms for constrained weakly convex optimization.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

2020
An adaptive primal-dual framework for nonsmooth convex minimization.
Math. Program. Comput., 2020

Convergence of adaptive algorithms for weakly convex constrained optimization.
CoRR, 2020

Conditional gradient methods for stochastically constrained convex minimization.
Proceedings of the 37th International Conference on Machine Learning, 2020

A new regret analysis for Adam-type algorithms.
Proceedings of the 37th International Conference on Machine Learning, 2020

Random extrapolation for primal-dual coordinate descent.
Proceedings of the 37th International Conference on Machine Learning, 2020

2019
An Inexact Augmented Lagrangian Framework for Nonconvex Optimization with Nonlinear Constraints.
Proceedings of the Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, 2019

Almost surely constrained convex optimization.
Proceedings of the 36th International Conference on Machine Learning, 2019

2017
Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization.
Proceedings of the Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 2017

