Abstract. For classification with multiple labels, a common approach is to learn a separate classifier for each label. With a kernel-based classifier, there are two options for setting up the kernels: select a specific kernel for each label, or use the same kernel for all labels. In this work, we present a unified framework for multi-label multiple kernel learning, in which the above two approaches arise as two extreme cases. Moreover, our framework allows kernels to be shared partially among multiple labels, enabling flexible degrees of label commonality. We systematically study how the sharing of kernels among multiple labels affects performance, based on extensive experiments on various benchmark data including images and microarray data. Interesting findings concerning efficacy and efficiency are reported.
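As background for the two extremes the abstract contrasts, the following is a minimal sketch (not the paper's method) of per-label kernel classifiers in the binary-relevance style, using scikit-learn's `SVC` on synthetic data; the specific kernels and data are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic multi-label data: 120 examples, 5 features, 3 binary labels.
rng = np.random.RandomState(0)
X = rng.randn(120, 5)
Y = np.column_stack([
    (X[:, 0] + X[:, 1] > 0).astype(int),
    (X[:, 1] - X[:, 2] > 0).astype(int),
    (X[:, 3] > 0).astype(int),
])

# Extreme 1: the same kernel for every label.
shared = [SVC(kernel="rbf", gamma=0.5).fit(X, Y[:, j])
          for j in range(Y.shape[1])]

# Extreme 2: a label-specific kernel for each label
# (kernel choices here are arbitrary, for illustration only).
per_label_kernels = ["rbf", "linear", "poly"]
specific = [SVC(kernel=k).fit(X, Y[:, j])
            for j, k in enumerate(per_label_kernels)]

# Each label's classifier predicts independently.
preds_shared = np.column_stack([clf.predict(X) for clf in shared])
preds_specific = np.column_stack([clf.predict(X) for clf in specific])
print(preds_shared.shape)  # one prediction column per label
```

The framework described in the abstract interpolates between these two setups by letting labels share some, but not necessarily all, of their kernels.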