We propose to study equivariance in deep neural networks through parameter symmetries. In particular, given a group $\mathcal{G}$ that acts discretely on the input and output of a standard neural network layer $\phi_{W}$, we show that $\phi_{W}$ is equivariant with respect to the $\mathcal{G}$-action iff $\mathcal{G}$ explains the symmetries of the network parameters $W$. Inspired by this observation, we then propose two parameter-sharing schemes to induce the desirable symmetry on $W$. Our procedure for tying the parameters achieves $\mathcal{G}$-equivariance and, under some conditions on the action of $\mathcal{G}$, it guarantees sensitivity to all other permutation groups outside $\mathcal{G}$.
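As an illustration of the idea (a minimal sketch, not the paper's general construction): take $\mathcal{G} = S_n$, the symmetric group acting by permutation on both the input and output of a linear layer. Tying the weights so that $W = a I + b \mathbf{1}\mathbf{1}^{\top}$, with only two free scalars $a, b$, makes $W$ commute with every permutation matrix, so the layer is equivariant to the $S_n$-action:

```python
import numpy as np

# Sketch: for G = S_n acting by permutation, the parameter-sharing pattern
#   W = a * I + b * 1 1^T
# (one parameter on the diagonal, one off-diagonal) yields a layer
# phi_W(x) = W x that satisfies phi_W(P x) = P phi_W(x) for every
# permutation matrix P, i.e. G-equivariance through tied parameters.

n = 5
a, b = 0.7, -0.2
W = a * np.eye(n) + b * np.ones((n, n))   # tied parameters: only a and b are free

rng = np.random.default_rng(0)
x = rng.standard_normal(n)
P = np.eye(n)[rng.permutation(n)]          # random permutation matrix

lhs = W @ (P @ x)                          # permute input, then apply layer
rhs = P @ (W @ x)                          # apply layer, then permute output
assert np.allclose(lhs, rhs)               # equivariance check
```

The check passes for any $a$, $b$, and any permutation, because both $I$ and $\mathbf{1}\mathbf{1}^{\top}$ commute with every permutation matrix; the symmetry of the tied weight matrix is exactly what makes the layer equivariant.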