Shenglong Zhou

Orcid: 0000-0003-2843-1614

Affiliations:
  • University of Southampton, School of Mathematics, UK
  • Beijing Jiaotong University, Department of Applied Mathematics, China (former)


According to our database, Shenglong Zhou authored at least 25 papers between 2013 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Accretionary Learning With Deep Neural Networks With Applications.
IEEE Trans. Cogn. Commun. Netw., April, 2024

2023
Federated Learning Via Inexact ADMM.
IEEE Trans. Pattern Anal. Mach. Intell., August, 2023

FedGiA: An Efficient Hybrid Algorithm for Federated Learning.
IEEE Trans. Signal Process., 2023

Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications.
J. Comput. Appl. Math., 2023

New Environment Adaptation with Few Shots for OFDM Receiver and mmWave Beamforming.
CoRR, 2023

Few-Shot Learning for New Environment Adaptation.
Proceedings of the IEEE Global Communications Conference, 2023

2022
Computing One-Bit Compressive Sensing via Double-Sparsity Constrained Optimization.
IEEE Trans. Signal Process., 2022

Sparse SVM for Sufficient Data Reduction.
IEEE Trans. Pattern Anal. Mach. Intell., 2022

Support Vector Machine Classifier via $L_{0/1}$ Soft-Margin Loss.
IEEE Trans. Pattern Anal. Mach. Intell., 2022

Semismooth Newton-type method for bilevel optimization: global convergence and extensive numerical experiments.
Optim. Methods Softw., 2022

Exact Penalty Method for Federated Learning.
CoRR, 2022

0/1 Deep Neural Networks via Block Coordinate Descent.
CoRR, 2022

2021
Newton Hard-Thresholding Pursuit for Sparse Linear Complementarity Problem via A New Merit Function.
SIAM J. Sci. Comput., 2021

Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization.
SIAM J. Optim., 2021

Newton method for ℓ0-regularized optimization.
Numer. Algorithms, 2021

Global and Quadratic Convergence of Newton Hard-Thresholding Pursuit.
J. Mach. Learn. Res., 2021

An extended Newton-type algorithm for ℓ2-regularized sparse logistic regression and its efficiency for classifying large-scale datasets.
J. Comput. Appl. Math., 2021

Theoretical and numerical comparison of the Karush-Kuhn-Tucker and value function reformulations in bilevel optimization.
Comput. Optim. Appl., 2021

2020
Robust Euclidean embedding via EDM optimization.
Math. Program. Comput., 2020

Matrix Optimization Over Low-Rank Spectral Sets: Stationary Points and Local and Global Minimizers.
J. Optim. Theory Appl., 2020

2019
Support Vector Machine Classifier via $L_{0/1}$ Soft-Margin Loss.
CoRR, 2019

2018
A Fast Matrix Majorization-Projection Method for Penalized Stress Minimization With Box Constraints.
IEEE Trans. Signal Process., 2018

2015
A half thresholding projection algorithm for sparse solutions of LCPs.
Optim. Lett., 2015

2013
Exact Recovery for Sparse Signal via Weighted ℓ1 Minimization.
CoRR, 2013

New RIC Bounds via ℓq-minimization with 0 < q ≤ 1 in Compressed Sensing.
CoRR, 2013
