# Stable and Symmetric Filter Convolutional Neural Network

## Abstract

First, we present a proof that convolutional neural networks (CNNs) with max-norm regularization, max-pooling, and ReLU non-linearity are stable to additive noise. Second, we explore the use of symmetric and antisymmetric filters in a baseline CNN model on digit classification, which inherits this stability to additive noise. Experimental results indicate that the symmetric CNN outperforms the baseline model for nearly all training sizes and matches the state-of-the-art deep network when training examples are limited.

## Results

For a transformation $\Phi$ to be stable to additive noise $x'(u) = x(u) + \epsilon(u)$, it must satisfy a Lipschitz continuity condition as defined in [bruna2013invariant], $$||\Phi x-\Phi x'||_2 \leq C \cdot ||x-x'||_2$$ for a constant $C > 0$ and for all $x$ and $x'$, where $\Phi x$ denotes the transformed feature. We have shown that a CNN composed of the following operations satisfies the Lipschitz continuity condition: 1) convolution with max-norm regularization, 2) element-wise ReLU non-linearity, and 3) max-pooling. Next, we explored the use of symmetric and antisymmetric filters in a baseline CNN model on digit classification, which enjoys both the stability to additive noise and the linear-phase property. Experimental results on the MNIST digit recognition task indicate that the symmetric CNN outperforms the baseline model for nearly all training sizes and matches the state-of-the-art deep network when training examples are limited.
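The stability argument rests on each of the three operations being Lipschitz: ReLU and non-overlapping max-pooling are non-expansive (1-Lipschitz) in the $\ell_2$ norm, and convolution with a filter $w$ is Lipschitz with constant at most $||w||_1$ by Young's inequality. The sketch below (not part of the paper's proof; the naive convolution and pooling helpers are illustrative stand-ins) checks this bound empirically on random inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(x, w):
    # Naive valid-mode 2-D correlation.
    H, W = x.shape
    kh, kw = w.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def relu(x):
    # Element-wise ReLU: 1-Lipschitz, since |max(a,0)-max(b,0)| <= |a-b|.
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    # Non-overlapping k x k max-pooling: non-expansive in the l2 norm
    # because the pooling windows are disjoint.
    H, W = x.shape
    H, W = H - H % k, W - W % k
    return x[:H, :W].reshape(H // k, k, W // k, k).max(axis=(1, 3))

def phi(x, w):
    # One CNN stage: convolution -> ReLU -> max-pooling.
    return max_pool(relu(conv2d_valid(x, w)))

# Max-norm regularization keeps the filter norm bounded; here we
# normalize w to unit l2 norm as a stand-in for that constraint.
w = rng.standard_normal((3, 3))
w /= np.linalg.norm(w)

x = rng.standard_normal((12, 12))
x_noisy = x + 0.1 * rng.standard_normal((12, 12))

lhs = np.linalg.norm(phi(x, w) - phi(x_noisy, w))
rhs = np.linalg.norm(x - x_noisy)
C = np.sum(np.abs(w))  # Lipschitz bound for the convolution (Young's inequality)
print(lhs <= C * rhs)  # the bound holds for any x, x'
```

Because ReLU and max-pooling contribute a factor of 1, the single-stage Lipschitz constant is controlled entirely by the filter norm, which is exactly what max-norm regularization bounds during training.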

Trained symmetric and antisymmetric filters for the MNIST digit recognition task are shown below.

## Citation

```bibtex
@inproceedings{yeh2016stable,
  title={Stable and symmetric filter convolutional neural network},
  author={Yeh, Raymond and Hasegawa-Johnson, Mark and Do, Minh N},
  booktitle={2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  pages={2652--2656},
  year={2016},
  organization={IEEE}
}
```