Abstract: Data-free knowledge distillation (DFKD) is a promising approach for addressing issues related to model compression, privacy and security, and transmission restrictions. Although the existing ...
Abstract: Data augmentation can mitigate overfitting without increasing the size of the model. Existing CutMix-based data augmentation has been proven to significantly ...