Shilpak Banerjee

ORCID: 0000-0003-1036-9576

According to our database, Shilpak Banerjee authored at least 9 papers between 2020 and 2022.

Collaborative distances:
  • Dijkstra number of six.
  • Erdős number of five.

Bibliography

2022
SAU: Smooth Activation Function Using Convolution with Approximate Identities.
Proceedings of the Computer Vision - ECCV 2022, 2022

Smooth Maximum Unit: Smooth Activation Function for Deep Networks using Smoothing Maximum Technique.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022
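The "smoothing maximum technique" named in the title above rests on a standard identity: max(a, b) = (a + b + |a − b|)/2, where the non-smooth |x| can be replaced by a smooth approximation such as x·erf(μx). A minimal sketch of this general idea (the erf-based approximation and the μ parameter are illustrative assumptions, not necessarily the paper's exact parameterization):

```python
import math

def smooth_max(a, b, mu=1e6):
    # max(a, b) = (a + b + |a - b|) / 2; replace |x| with the smooth
    # approximation x * erf(mu * x), which tends to |x| as mu grows.
    x = a - b
    return (a + b + x * math.erf(mu * x)) / 2

# With a large smoothing parameter the result is close to the hard max:
print(smooth_max(2.0, 5.0))   # close to 5.0
```

A small μ yields a differentiable everywhere, ReLU-like curve when one argument is zero, which is why such constructions are attractive as trainable activation functions.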

ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions.
Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, 2022

2021
SMU: smooth activation function for deep networks using smoothing maximum technique.
CoRR, 2021

Orthogonal-Padé Activation Functions: Trainable Activation functions for smooth and faster convergence in deep networks.
CoRR, 2021

TanhSoft - Dynamic Trainable Activation Functions for Faster Learning and Better Performance.
IEEE Access, 2021

EIS - Efficient and Trainable Activation Functions for Better Accuracy and Performance.
Proceedings of the Artificial Neural Networks and Machine Learning - ICANN 2021, 2021

2020
EIS - a family of activation functions combining Exponential, ISRU, and Softplus.
CoRR, 2020

TanhSoft - a family of activation functions combining Tanh and Softplus.
CoRR, 2020
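The title above describes a family built from Tanh and Softplus. One natural member of such a family multiplies the two primitives, giving a bounded gating factor times a smooth ReLU-like term. This is a hedged sketch of that general construction; the name `tanhsoft`, the parameter `alpha`, and the exact combination are illustrative assumptions, not the paper's definition:

```python
import math

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x).
    return math.log1p(math.exp(x))

def tanhsoft(x, alpha=0.87):
    # Hypothetical family member: a tanh gate times softplus.
    # For large positive x it behaves like x; for large negative x it
    # decays to zero, as expected of a ReLU-like activation.
    return math.tanh(alpha * x) * softplus(x)

print(tanhsoft(2.0))
```

Making `alpha` a learnable parameter is what turns such a fixed formula into a trainable activation function in the sense of these papers.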
