Batch Normalization (BN) (Ioffe and Szegedy 2015) normalizes the features of
an input image using statistics computed over a batch of images, and hence
introduces noise into the gradient of the training loss. Prior work indicates
that this noise benefits the optimization and generalization of deep neural
networks, but that excessive noise harms network performance. In this paper,
we offer a new perspective: a self-attention mechanism can regulate this noise
by enhancing instance-specific information, yielding a better regularization
effect. We therefore propose an attention-based BN called Instance Enhancement
Batch Normalization (IEBN), which recalibrates each channel through a simple
linear transformation. IEBN regulates noise effectively and stabilizes network
training, improving generalization even in the presence of two kinds of noise
attacks during training. Finally, IEBN outperforms BN with only a slight
increase in parameters on image classification tasks across different network
architectures and benchmark datasets.
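The per-channel recalibration described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' reference implementation: the choices to pool the input with global average pooling and to gate the BN output with a sigmoid of a two-parameter-per-channel linear transform (`w`, `b`) are our assumptions about what "a simple linear transformation" on instance-specific statistics looks like.

```python
import numpy as np

def iebn_forward(x, gamma, beta, w, b, eps=1e-5):
    """Sketch of an IEBN-style forward pass (training-mode batch statistics).

    x            : input of shape (N, C, H, W)
    gamma, beta  : standard BN affine parameters, shape (C,)
    w, b         : assumed per-channel parameters of the instance-enhancement
                   linear transform (two extra parameters per channel)
    """
    # Standard batch normalization: per-channel statistics over (N, H, W)
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    bn_out = gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)

    # Instance-specific statistic: global average pooling per sample and channel
    s = x.mean(axis=(2, 3), keepdims=True)          # shape (N, C, 1, 1)

    # Simple linear transformation followed by a sigmoid gate (assumption)
    z = w.reshape(1, -1, 1, 1) * s + b.reshape(1, -1, 1, 1)
    gate = 1.0 / (1.0 + np.exp(-z))                 # values in (0, 1)

    # Rescale the BN output channel-wise with the instance-specific gate
    return gate * bn_out
```

Because the gate depends on each sample's own pooled statistics rather than on batch statistics, it injects instance-specific information into the otherwise batch-coupled BN output, which is the mechanism the abstract credits with regulating batch noise.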