Jary Pomponi

Orcid: 0000-0003-3236-3941

According to our database, Jary Pomponi authored at least 22 papers between 2020 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
Adaptive Semantic Token Communication for Transformer-based Edge Inference.
CoRR, May, 2025

Class Incremental Learning with probability dampening and cascaded gated classifier.
Neurocomputing, 2025

Adaptive layer and token selection for efficient fine-tuning of vision transformers.
Neurocomputing, 2025

2024
Conditional computation in neural networks: Principles and research trends.
Intelligenza Artificiale, 2024

Adaptive Layer Selection for Efficient Vision Transformer Fine-Tuning.
CoRR, 2024

Joint or Disjoint: Mixing Training Regimes for Early-Exit Models.
CoRR, 2024

Cascaded Scaling Classifier: class incremental learning with probability scaling.
CoRR, 2024

NACHOS: Neural Architecture Search for Hardware Constrained Early Exit Neural Networks.
CoRR, 2024

Adaptive Semantic Token Selection for AI-native Goal-oriented Communications.
Proceedings of the IEEE Globecom Workshops 2024, 2024

Goal-Oriented Communications Based on Recursive Early Exit Neural Networks.
Proceedings of the 58th Asilomar Conference on Signals, 2024

2023
Continual learning with invertible generative models.
Neural Networks, July, 2023

Rearranging Pixels is a Powerful Black-Box Attack for RGB and Infrared Deep Learning Models.
IEEE Access, 2023

2022
Centroids Matching: an efficient Continual Learning approach operating in the embedding space.
Trans. Mach. Learn. Res., 2022

A Probabilistic Re-Intepretation of Confidence Scores in Multi-Exit Models.
Entropy, 2022

Pixle: a fast and effective black-box attack based on rearranging pixels.
Proceedings of the International Joint Conference on Neural Networks, 2022

2021
Structured Ensembles: An approach to reduce the memory footprint of ensemble methods.
Neural Networks, 2021

Bayesian Neural Networks with Maximum Mean Discrepancy regularization.
Neurocomputing, 2021

Avalanche: an End-to-End Library for Continual Learning.
CoRR, 2021

2020
DeepRICH: learning deeply Cherenkov detectors.
Mach. Learn. Sci. Technol., 2020

Efficient continual learning in neural networks with embedding regularization.
Neurocomputing, 2020

Pseudo-Rehearsal for Continual Learning with Normalizing Flows.
CoRR, 2020