Saman Babaie-Kafaki

ORCID: 0000-0003-0122-8384

According to our database, Saman Babaie-Kafaki authored at least 48 papers between 2010 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
A class of CG algorithms overcoming jamming of the iterative solving process and its application in image restoration.
J. Comput. Appl. Math., May, 2024

Eigenvalue Analyses on the Memoryless Davidon-Fletcher-Powell Method Based on a Spectral Secant Equation.
J. Optim. Theory Appl., January, 2024

2023
A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions.
Comput. Appl. Math., December, 2023

A nonlinear mixed-integer programming approach for variable selection in linear regression model.
Commun. Stat. Simul. Comput., November, 2023

A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem.
Math. Model. Anal., October, 2023

Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing.
Numer. Algorithms, August, 2023

An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method.
Asia Pac. J. Oper. Res., June, 2023

An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions.
Optim. Methods Softw., May, 2023

A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem.
Optim. Lett., March, 2023

A survey on the Dai-Liao family of nonlinear conjugate gradient methods.
RAIRO Oper. Res., January, 2023

2022
A hybrid quasi-Newton method with application in sparse recovery.
Comput. Appl. Math., September, 2022

Modified conjugate gradient method for solving sparse recovery problem with nonconvex penalty.
Signal Process., 2022

Improved high-dimensional regression models with matrix approximations applied to the comparative case studies with support vector machines.
Optim. Methods Softw., 2022

Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing.
Numer. Algorithms, 2022

2020
A restart scheme for the Dai-Liao conjugate gradient method by ignoring a direction of maximum magnification by the search direction matrix.
RAIRO Oper. Res., 2020

2019
An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems.
RAIRO Oper. Res., 2019

An adaptive nonmonotone trust region algorithm.
Optim. Methods Softw., 2019

Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length.
Numer. Algorithms, 2019

A randomized nonmonotone adaptive trust region method based on the simulated annealing strategy for unconstrained optimization.
Int. J. Intell. Comput. Cybern., 2019

A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm.
Int. J. Comput. Math., 2019

An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix.
4OR, 2019

2018
Two accelerated nonmonotone adaptive trust region line search methods.
Numer. Algorithms, 2018

2017
A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update.
4OR, 2017

2016
A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method.
RAIRO Oper. Res., 2016

On optimality of two adaptive choices for the parameter of Dai-Liao method.
Optim. Lett., 2016

A modified scaling parameter for the memoryless BFGS updating formula.
Numer. Algorithms, 2016

Hybridizations of genetic algorithms and neighborhood search metaheuristics for fuzzy bus terminal location problems.
Appl. Soft Comput., 2016

Descent Symmetrization of the Dai-Liao Conjugate Gradient Method.
Asia Pac. J. Oper. Res., 2016

2015
A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach.
Optim. Methods Softw., 2015

A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods.
Numer. Algorithms, 2015

On Optimality of the Parameters of Self-Scaling Memoryless Quasi-Newton Updating Formulae.
J. Optim. Theory Appl., 2015

2014
A descent family of Dai-Liao conjugate gradient methods.
Optim. Methods Softw., 2014

Two modified three-term conjugate gradient methods with sufficient descent property.
Optim. Lett., 2014

Two modified scaled nonlinear conjugate gradient methods.
J. Comput. Appl. Math., 2014

The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices.
Eur. J. Oper. Res., 2014

A descent extension of the Polak-Ribière-Polyak conjugate gradient method.
Comput. Math. Appl., 2014

On the sufficient descent condition of the Hager-Zhang conjugate gradient methods.
4OR, 2014

2013
A modified two-point stepsize gradient algorithm for unconstrained minimization.
Optim. Methods Softw., 2013

Erratum to: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization.
Optim. Methods Softw., 2013

On the sufficient descent property of the Shanno's conjugate gradient method.
Optim. Lett., 2013

A Note on the Global Convergence of the Quadratic Hybridization of Polak-Ribière-Polyak and Fletcher-Reeves Conjugate Gradient Methods.
J. Optim. Theory Appl., 2013

A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization.
4OR, 2013

2012
A Quadratic Hybridization of Polak-Ribière-Polyak and Fletcher-Reeves Conjugate Gradient Methods.
J. Optim. Theory Appl., 2012

A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei.
Comput. Optim. Appl., 2012

An Efficient and Practically Robust Hybrid Metaheuristic Algorithm for Solving Fuzzy Bus Terminal Location Problems.
Asia Pac. J. Oper. Res., 2012

2011
Two effective hybrid conjugate gradient algorithms based on modified BFGS updates.
Numer. Algorithms, 2011

Two effective hybrid metaheuristic algorithms for minimization of multimodal functions.
Int. J. Comput. Math., 2011

2010
Two new conjugate gradient methods based on modified secant equations.
J. Comput. Appl. Math., 2010
