Gianna Paulin

ORCID: 0000-0002-1310-0911

According to our database, Gianna Paulin authored at least 12 papers between 2017 and 2024.

Bibliography

2024
Marsellus: A Heterogeneous RISC-V AI-IoT End-Node SoC With 2-8 b DNN Acceleration and 30%-Boost Adaptive Body Biasing.
IEEE J. Solid State Circuits, January 2024

2023
Marsellus: A Heterogeneous RISC-V AI-IoT End-Node SoC with 2-to-8b DNN Acceleration and 30%-Boost Adaptive Body Biasing.
CoRR, 2023

A 12.4TOPS/W @ 136GOPS AI-IoT System-on-Chip with 16 RISC-V, 2-to-8b Precision-Scalable DNN Acceleration and 30%-Boost Adaptive Body Biasing.
Proceedings of the IEEE International Solid-State Circuits Conference, 2023

ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers.
Proceedings of the IEEE/ACM International Symposium on Low Power Electronics and Design, 2023

2022
Vau Da Muntanialas: Energy-Efficient Multi-Die Scalable Acceleration of RNN Inference.
IEEE Trans. Circuits Syst. I Regul. Pap., 2022

CONVOLVE: Smart and seamless design of smart edge processors.
CoRR, 2022

Soft Tiles: Capturing Physical Implementation Flexibility for Tightly-Coupled Parallel Processing Clusters.
Proceedings of the IEEE Computer Society Annual Symposium on VLSI, 2022

MiniFloat-NN and ExSdotp: An ISA Extension and a Modular Open Hardware Unit for Low-Precision Training on RISC-V Cores.
Proceedings of the 29th IEEE Symposium on Computer Arithmetic, 2022

2021
RNN-Based Radio Resource Management on Multicore RISC-V Accelerator Architectures.
IEEE Trans. Very Large Scale Integr. Syst., 2021

2018
Chipmunk: A systolically scalable 0.9 mm², 3.08 Gop/s/mW @ 1.2 mW accelerator for near-sensor recurrent neural network inference.
Proceedings of the 2018 IEEE Custom Integrated Circuits Conference, 2018

2017
Chipmunk: A Systolically Scalable 0.9 mm², 3.08 Gop/s/mW @ 1.2 mW Accelerator for Near-Sensor Recurrent Neural Network Inference.
CoRR, 2017
