Neural Processing Letters manuscript No. (will be inserted by the editor)

Global Rademacher Complexity Bounds: From Slow to Fast Convergence Rates

Luca Oneto · Alessandro Ghio · Sandro Ridella · Davide Anguita

Received: date / Accepted: date

Abstract Previous works in the literature showed that the performance estimates of learning procedures can be characterized by a convergence rate ranging between O(n^{-1}) (fast rate) and O(n^{-1/2}) (slow rate). Deriving such results requires some assumptions on the problem; moreover, even when convergence is fast, the constants characterizing the bounds are often quite loose. In this work, we prove new Rademacher Complexity (RC) based bounds which, without any additional assumptions, achieve a fast convergence rate O(n^{-1}) in the optimistic case and a slow rate O(n^{-1/2}) in the general case. At the same time, they are characterized by smaller constants than other state-of-the-art fast-converging RC alternatives in the literature. The results proposed in this work are obtained by exploiting the fundamental work of Talagrand on concentration inequalities for product measures and empirical processes. Furthermore, we extend the results to the semi-supervised learning framework, showing how additional unlabeled samples allow improving the tightness of the derived bound.

Keywords Statistical Learning Theory · Performance Estimation · Rademacher Complexity · Fast Rates

Luca Oneto · Alessandro Ghio · Sandro Ridella
DITEN, University of Genova, Via Opera Pia 11a, I-16145 Genova, Italy
Tel.: +39-(0)10-353 2691
Fax: +39-(0)10-353 2897
E-mail: {Luca.Oneto,Alessandro.Ghio,Sandro.Ridella}@unige.it

Davide Anguita
DIBRIS, University of Genova, Via Opera Pia 13, I-16145 Genova, Italy
Tel.: +39-(0)10-353 2800
Fax: +39-(0)10-353 2897
E-mail: Davide.Anguita@unige.it