PAPERS
Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation
IEEE Transactions on Mobile Computing, 2026
We propose SCARLET, a communication-efficient, distillation-based Federated Learning framework. SCARLET caches soft labels to eliminate redundant transmissions and applies an enhanced Entropy Reduction Aggregation, achieving up to a 50% reduction in communication cost while maintaining competitive accuracy.
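
A minimal sketch of the soft-label caching idea described above (the class name, threshold parameter, and selection rule are illustrative assumptions, not the paper's exact algorithm; the sharpening and Entropy Reduction Aggregation steps are not shown): a client re-uploads a sample's soft label only when its distribution has shifted noticeably since the last round, and the server reuses its cached copy otherwise.

    import numpy as np

    class SoftLabelCache:
        """Client-side cache that suppresses redundant soft-label uploads (illustrative sketch)."""

        def __init__(self, threshold=0.05):
            self.threshold = threshold  # assumed max-abs change required to re-upload
            self.cached = {}            # sample_id -> last uploaded soft-label vector

        def select_for_upload(self, soft_labels):
            """Return only the soft labels that changed enough since the last round.

            soft_labels: dict mapping sample_id -> probability vector (np.ndarray).
            """
            to_upload = {}
            for sid, probs in soft_labels.items():
                prev = self.cached.get(sid)
                if prev is None or np.max(np.abs(probs - prev)) > self.threshold:
                    to_upload[sid] = probs
                    self.cached[sid] = probs
            return to_upload  # omitted ids are served from the server-side cache

Skipping unchanged soft labels is what produces the communication savings; the 50% figure above refers to the paper's reported results, not to this sketch.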
TALKS
Beyond Multiprocessing: A Real-World ML Workload Speedup with Python 3.13+ Free-Threading
PyCon JP 2025, Talk
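
As a hedged illustration of the premise in the title (not the talk's actual benchmark or workload): on a free-threaded CPython 3.13+ build, CPU-bound Python code can scale across threads without the serialization overhead of multiprocessing, whereas on a standard build the GIL keeps such threads effectively sequential.

    import sys
    from concurrent.futures import ThreadPoolExecutor

    def cpu_heavy(n: int) -> int:
        # Pure-Python CPU-bound work; thread-level speedup requires a free-threaded build.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        # sys._is_gil_enabled() exists on free-threaded CPython 3.13+ builds;
        # fall back to assuming the GIL is enabled elsewhere.
        gil = getattr(sys, "_is_gil_enabled", lambda: True)()
        print("GIL enabled:", gil)
        with ThreadPoolExecutor(max_workers=8) as pool:
            results = list(pool.map(cpu_heavy, [2_000_000] * 8))
        print(sum(results))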