Detailed record
Title
Lin, Tao
Sciper ID
243057
Affiliated laboratories
MLO
Publications
Don't Use Large Mini-Batches, Use Local SGD
Dynamic Model Pruning with Feedback
Ensemble Distillation for Robust Model Fusion in Federated Learning
Extrapolation for Large-batch Training in Deep Learning
GA-Par: Dependable Microservice Orchestration Framework for Geo-Distributed Clouds
Hybrid Neural Networks for Learning the Trend in Time Series
On Pitfalls of Test-time Adaptation
On the Loss Landscape of Adversarial Training: Identifying Challenges and How to Overcome Them
Quasi-Global Momentum: Accelerating Decentralized Deep Learning on Heterogeneous Data
Training DNNs with Hybrid Block Floating Point
See all publications (13)
All resources
This record appears in
Authorities > People