Publications

CNN Mixture-of-Depths

Published in ACCV, 2024 [ArXiv]

We introduce Mixture-of-Depths (MoD) for Convolutional Neural Networks (CNNs), a novel approach that enhances the computational efficiency of CNNs by selectively processing channels based on their relevance to the current prediction.
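The channel-selective routing can be sketched as follows. This is a minimal illustration, not the paper's implementation: the router function, the use of global average pooling for relevance scores, and the per-channel `conv_fn` are all assumptions made for the sketch.

```python
import numpy as np

def cnn_mod_block(x, w_router, conv_fn, k):
    """Toy Mixture-of-Depths routing for one CNN block (illustrative only).

    x        : feature map of shape (C, H, W)
    w_router : hypothetical per-channel routing weights, shape (C,)
    conv_fn  : the block's convolution, applied here channel-wise
    k        : number of channels actually processed
    """
    C, H, W = x.shape
    # Router: score each channel's relevance from its global average activation.
    scores = x.mean(axis=(1, 2)) * w_router
    top_k = np.argsort(scores)[-k:]      # the k most relevant channels
    out = x.copy()                       # unselected channels skip the block
    for c in top_k:
        out[c] = conv_fn(x[c])           # only top-k channels are processed
    return out, top_k

# Usage: a 3x3 mean filter stands in for the block's convolution.
def mean3x3(ch):
    padded = np.pad(ch, 1, mode="edge")
    return sum(padded[i:i + ch.shape[0], j:j + ch.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))
out, chosen = cnn_mod_block(x, np.ones(8), mean3x3, k=2)
```

Because unselected channels bypass the convolution entirely, compute scales with `k` rather than with the full channel count.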

Squeeze-and-Remember Block

Published in ICMLA, 2024 [ArXiv]

CNNs lack a dynamic feature retention mechanism similar to the human brain’s memory, limiting their ability to use learned information in new contexts. To bridge this gap, we introduce the “Squeeze-and-Remember” (SR) block, a novel architectural unit that gives CNNs dynamic memory-like functionalities.

Spectral Wavelet Dropout: Regularization in the Wavelet Domain

Published in ICMLA, 2024 [ArXiv]

This work introduces Spectral Wavelet Dropout (SWD), a novel regularization method that includes two variants: 1D-SWD and 2D-SWD. These variants improve CNN generalization by randomly dropping detail frequency bands in the discrete wavelet decomposition of feature maps.
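The band-dropping idea can be sketched with a single-level Haar transform. The paper works on multi-level decompositions of CNN feature maps; this one-level, one-dimensional version is only meant to show the mechanism.

```python
import numpy as np

def spectral_wavelet_dropout_1d(x, p, rng):
    """Sketch of 1D-SWD using a single-level Haar DWT (illustrative only).

    x : 1D feature vector with even length
    p : probability of dropping the detail (high-frequency) band
    """
    # Forward Haar transform: approximation and detail coefficients.
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    # Randomly zero out the detail frequency band.
    if rng.random() < p:
        detail = np.zeros_like(detail)
    # Inverse Haar transform back to the feature domain.
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(0)
x = np.array([4.0, 2.0, 1.0, 3.0])
dropped = spectral_wavelet_dropout_1d(x, p=1.0, rng=rng)  # detail band zeroed
kept    = spectral_wavelet_dropout_1d(x, p=0.0, rng=rng)  # exact reconstruction
```

With the detail band dropped, each pair of neighbors collapses to its average; with `p=0` the transform round-trips exactly, so the layer is the identity at inference time.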

Spectral Batch Normalization: Normalization in the Frequency Domain

Published in IJCNN, 2023 [ArXiv]

Regularization is a set of techniques used to improve the generalization ability of deep neural networks. In this paper, we introduce spectral batch normalization (SBN), a novel, effective method that improves generalization by normalizing feature maps in the frequency (spectral) domain.
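A minimal sketch of the idea follows. The paper's exact formulation (learned affine parameters, running statistics for inference, etc.) is not reproduced here; normalizing the real and imaginary spectrum parts separately across the batch is an assumption of this sketch.

```python
import numpy as np

def spectral_batch_norm(x, eps=1e-5):
    """Sketch of spectral batch normalization (no affine params, train mode).

    x : batch of 2D feature maps, shape (N, H, W)
    Each frequency component is standardized across the batch dimension.
    """
    f = np.fft.fft2(x)                    # to the frequency domain
    def standardize(part):
        # Batch statistics per frequency bin.
        mean = part.mean(axis=0, keepdims=True)
        var = part.var(axis=0, keepdims=True)
        return (part - mean) / np.sqrt(var + eps)
    # Normalize real and imaginary parts separately.
    f_norm = standardize(f.real) + 1j * standardize(f.imag)
    return np.fft.ifft2(f_norm).real      # back to the spatial domain

rng = np.random.default_rng(1)
x = rng.standard_normal((16, 8, 8))
y = spectral_batch_norm(x)
```

Standard batch normalization standardizes each spatial position or channel; here the same statistics machinery is applied per frequency bin instead, so every frequency component of the output has zero mean across the batch.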

Weight Compander: A Simple Weight Reparameterization for Regularization

Published in IJCNN, 2023 [ArXiv]

Regularization is a set of techniques used to improve the generalization ability of deep neural networks. In this paper, we introduce weight compander (WC), a novel, effective method that improves generalization by reparameterizing each weight in deep neural networks using a nonlinear function.
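The reparameterization pattern can be sketched as below. The compander function used here is a mu-law-style stand-in borrowed from signal processing, not the function from the paper; `mu` and `w_max` are hypothetical parameters of this sketch.

```python
import numpy as np

def compand(w, mu=255.0, w_max=1.0):
    """mu-law-style companding applied to a weight tensor (stand-in function).

    Expands small weights and compresses large ones, discouraging a few
    weights from dominating a layer. Odd and monotone, so signs are kept.
    """
    return w_max * np.sign(w) * np.log1p(mu * np.abs(w) / w_max) / np.log1p(mu)

# The layer stores raw parameters v; the forward pass uses compand(v),
# and gradients flow through the nonlinearity back to v.
rng = np.random.default_rng(2)
v = rng.standard_normal((4, 4)) * 0.1
w_effective = compand(v)
```

The key design point is that the network never sees `v` directly: the nonlinearity reshapes the effective weight distribution while optimization still happens on the unconstrained raw parameters.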