Zohre Aminifard

Orcid: 0000-0001-8828-0227

According to our database, Zohre Aminifard authored at least 16 papers between 2019 and 2024.

Collaborative distances:
  • Dijkstra number of six.
  • Erdős number of five.

Bibliography

2024
A class of CG algorithms overcoming jamming of the iterative solving process and its application in image restoration.
J. Comput. Appl. Math., May, 2024

Eigenvalue Analyses on the Memoryless Davidon-Fletcher-Powell Method Based on a Spectral Secant Equation.
J. Optim. Theory Appl., January, 2024

2023
A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions.
Comput. Appl. Math., December, 2023

A nonlinear mixed-integer programming approach for variable selection in linear regression model.
Commun. Stat. Simul. Comput., November, 2023

A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem.
Math. Model. Anal., October, 2023

Nonmonotone Quasi-Newton-based conjugate gradient methods with application to signal processing.
Numer. Algorithms, August, 2023

An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method.
Asia Pac. J. Oper. Res., June, 2023

An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions.
Optim. Methods Softw., May, 2023

A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem.
Optim. Lett., March, 2023

2022
A hybrid quasi-Newton method with application in sparse recovery.
Comput. Appl. Math., September, 2022

Modified conjugate gradient method for solving sparse recovery problem with nonconvex penalty.
Signal Process., 2022

Improved high-dimensional regression models with matrix approximations applied to the comparative case studies with support vector machines.
Optim. Methods Softw., 2022

Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing.
Numer. Algorithms, 2022

2020
A restart scheme for the Dai-Liao conjugate gradient method by ignoring a direction of maximum magnification by the search direction matrix.
RAIRO Oper. Res., 2020

2019
Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length.
Numer. Algorithms, 2019

An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix.
4OR, 2019
