
Shyi-Ming Chen & Witold Pedrycz 
Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems 

The book provides timely coverage of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned within the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher-student architectures, and distillation algorithms, along with a wealth of topics including recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is relevant to a broad audience, including researchers and practitioners active in machine learning who pursue fundamental and applied research in advanced learning paradigms.
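For readers unfamiliar with the teacher-student setup mentioned above, the following is a minimal sketch (not taken from the book) of the classic temperature-scaled soft-label distillation loss, in which the student is trained to match the teacher's softened class probabilities; the logits shown are illustrative placeholders.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    Scaled by T^2 so its gradient magnitude stays comparable to the
    hard-label cross-entropy it is usually combined with.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    return (temperature ** 2) * kl.mean()

# Illustrative example: a confident "teacher" guiding a smaller "student".
teacher = np.array([[6.0, 1.0, 0.5]])
student = np.array([[2.0, 1.5, 0.8]])
print(distillation_loss(student, teacher))
```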
€225.32
Language English ● Format EPUB ● ISBN 9783031320958 ● Editors Shyi-Ming Chen & Witold Pedrycz ● Publisher Springer International Publishing ● Published 2023 ● Downloadable 3 times ● Currency EUR ● ID 9060340 ● Copy protection Adobe DRM
Requires a DRM-capable ebook reader
