Kitsuya Azuma

TECH BLOG


PAPERS

Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation

IEEE Transactions on Mobile Computing, 2026

We propose SCARLET, a communication-efficient, distillation-based Federated Learning framework. By caching soft labels to avoid redundant transmission and using an enhanced Entropy Reduction Aggregation, it reduces communication cost by up to 50% while maintaining competitive accuracy.
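The caching and sharpening ideas above can be sketched in a few lines. This is an illustrative simplification, not the paper's actual algorithm or API: the `SoftLabelCache` class, the L1 drift threshold, and the `sharpen` temperature scheme are hypothetical names and choices made here to show the intuition of sending a soft label only when it has changed enough to be worth the bandwidth.

```python
import math

def sharpen(probs, temperature=0.5):
    # Temperature-sharpen a probability vector; T < 1 concentrates mass
    # on the dominant class (a common soft-label sharpening scheme).
    logits = [math.log(p + 1e-12) / temperature for p in probs]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

class SoftLabelCache:
    # Cache the last-broadcast soft label per sample; resend an entry only
    # when its L1 drift since the last broadcast exceeds a threshold.
    def __init__(self, threshold=0.05):
        self.threshold = threshold
        self._cache = {}  # sample_id -> last broadcast soft label

    def updates_to_send(self, new_labels):
        # new_labels: {sample_id: probability vector}.
        # Returns only the entries worth transmitting this round.
        to_send = {}
        for sid, label in new_labels.items():
            old = self._cache.get(sid)
            if old is None or sum(abs(a - b) for a, b in zip(label, old)) > self.threshold:
                self._cache[sid] = label
                to_send[sid] = label
        return to_send
```

For example, a label that moves from `[0.5, 0.5]` to `[0.51, 0.49]` (drift 0.02) would be served from the cache rather than retransmitted, while a larger shift would trigger an update.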

TALKS

Beyond Multiprocessing: A Real-World ML Workload Speedup with Python 3.13+ Free-Threading

PyCon JP 2025, Talk

Notebook as a Service in Practice with Lab Servers and Kubeflow

CloudNative Days Summer 2025, LT

An Introduction to Rootless Containers: Secure Container Management Even on Lab Servers

Student & Working Professional LT Meetup #2, LT