Abstract We propose Rademacher complexity bounds for multi-class classifiers trained with a two-step semi-supervised model. In the first step, the algorithm partitions the partially labeled data and then identifies dense clusters containing κ predominant classes using the labeled training examples, such that the proportion of their non-predominant classes is below a fixed threshold that stands for clustering consistency. In the second step, a classifier is trained by minimizing a margin empirical loss over the labeled training set and a penalization term measuring the inability of the learner to predict the κ predominant classes of the identified clusters. The resulting data-dependent generalization error bound involves the margin distribution of the classifier, the stability of the clustering technique used in the first step, and Rademacher complexity terms corresponding to the partially labeled training data. Our theoretical results exhibit convergence rates extending those proposed in the literature for the binary case, and experimental results provide empirical evidence that supports the theory.