This paper presents a general framework for norm-based capacity control of L_{p,q} weight normalized deep neural networks. We establish an upper bound on the Rademacher complexity of this family. With an L_{p,q} normalization where q ≤ p* and 1/p + 1/p* = 1, we discuss properties of a width-independent capacity control, which depends on the depth only through a square root term. We further analyze the approximation properties of L_{p,q} weight normalized deep neural networks. In particular, for an L_{1,∞} weight normalized network, the approximation error can be controlled by the norm of the output layer, and the corresponding generalization error depends on the architecture only through the square root of the depth.
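To make the normalization concrete, the following is a minimal sketch of an L_{p,q} group norm of a weight matrix: a p-norm over each row (one row taken, as an illustrative assumption, to be the incoming weights of a single neuron), followed by a q-norm across rows. The function name `lpq_norm` and the row/column convention are hypothetical choices for this sketch, not notation fixed by the text above.

```python
import numpy as np

def lpq_norm(W, p=2.0, q=1.0):
    """L_{p,q} norm of a weight matrix W.

    Computes the q-norm of the vector of p-norms taken over each row.
    Treating a row as one neuron's incoming weights is an assumption
    made for illustration; conventions differ across papers.
    """
    row_norms = np.sum(np.abs(W) ** p, axis=1) ** (1.0 / p)
    return np.sum(row_norms ** q) ** (1.0 / q)

# Example: 3 hidden units with 2 inputs each.
W = np.array([[3.0, 4.0],
              [0.0, 1.0],
              [1.0, 0.0]])
# Row-wise 2-norms are 5, 1, 1; their 1-norm is 7.
print(lpq_norm(W, p=2, q=1))  # → 7.0
```

Constraining this quantity layer by layer is what makes the capacity bound independent of the layer widths: the norm, not the number of neurons, enters the Rademacher complexity.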