Julian Büchel
ORCID: 0000-0001-9495-7150
According to our database, Julian Büchel authored at least 26 papers between 2018 and 2025.
Collaborative distances:
Bibliography
2025
CoRR, June, 2025
Efficient scaling of large language models with mixture of experts and 3D analog in-memory computing.
Nat. Comput. Sci., January, 2025
Nat. Mac. Intell., 2025
A framework for analog-digital mixed-precision neural network training and inference.
Proceedings of the IEEE International Symposium on Circuits and Systems, 2025
Proceedings of the IEEE International Symposium on Circuits and Systems, 2025
Analog AI Accelerators for Transformer-based Language Models: Hardware, Workload, and Power Performance.
Proceedings of the IEEE International Memory Workshop, 2025
Analog-AI Hardware Accelerators for Low-Latency Transformer-Based Language Models (Invited).
Proceedings of the IEEE Custom Integrated Circuits Conference, 2025
2024
Improving the Accuracy of Analog-Based In-Memory Computing Accelerators Post-Training.
Proceedings of the IEEE International Symposium on Circuits and Systems, 2024
Proceedings of the IEEE International Conference on Software Services Engineering, 2024
2023
Programming Weights to Analog In-Memory Computing Cores by Direct Minimization of the Matrix-Vector Multiplication Error.
IEEE J. Emerg. Sel. Topics Circuits Syst., December, 2023
Using the IBM Analog In-Memory Hardware Acceleration Kit for Neural Network Training and Inference.
CoRR, 2023
2022
ML-HW Co-Design of Noise-Robust TinyML Models and Always-On Analog Compute-in-Memory Edge Accelerator.
IEEE Micro, 2022
A 64-core mixed-signal in-memory compute chip based on phase-change memory for deep neural network inference.
CoRR, 2022
Proceedings of the Tenth International Conference on Learning Representations, 2022
2021
AnalogNets: ML-HW Co-Design of Noise-robust TinyML Models and Always-On Analog Compute-in-Memory Accelerator.
CoRR, 2021
CoRR, 2021
Supervised training of spiking neural networks for robust deployment on mixed-signal neuromorphic processors.
CoRR, 2021
Implementing Efficient Balanced Networks with Mixed-Signal Spike-Based Learning Circuits.
Proceedings of the IEEE International Symposium on Circuits and Systems, 2021
2018