Y. Amit and D. Geman, Shape quantization and recognition with randomized trees, Neural Computation, vol.9, pp.1545-1588, 1997.

P. Assouad, Deux remarques sur l'estimation. Comptes Rendus de l'Académie des Sciences, vol.296, pp.1021-1024, 1983.

J.-Y. Audibert and A. B. Tsybakov, Fast learning rates for plug-in classifiers, Ann. Statist, vol.35, issue.2, pp.608-633, 2007.
URL : https://hal.archives-ouvertes.fr/hal-00160849

S. Boucheron, O. Bousquet, and G. Lugosi, Theory of classification: a survey of some recent advances, ESAIM Probab. Stat, vol.9, pp.323-375, 2005.
URL : https://hal.archives-ouvertes.fr/hal-00017923

Z. I. Botev, J. F. Grotowski, and D. P. Kroese, Kernel density estimation via diffusion, The Annals of Statistics, vol.38, pp.2916-2957, 2010.

J. Bretagnolle and C. Huber, Estimation des densités: risque minimax. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete, vol.47, pp.119-137, 1979.

S. Boucheron, G. Lugosi, and P. Massart, Concentration Inequalities: A Nonasymptotic Theory of Independence, 2013.
URL : https://hal.archives-ouvertes.fr/hal-00794821

O. E. Barndorff-Nielsen and D. R. Cox, Asymptotic techniques for use in statistics, Monographs on Statistics and Applied Probability, 1989.

L. Breiman, Random forests. Machine Learning, vol.45, pp.5-32, 2001.

T. Cannings, Nearest neighbour classification in the tails of a distribution, 2013.

A. , Annales de l'Institut Henri Poincare (B) Probability and Statistics, vol.43, pp.763-774, 2007.

K. Chaudhuri and S. Dasgupta, Rates of convergence for nearest neighbor classification, Advances in Neural Information Processing Systems, vol.27, pp.3437-3445, 2014.

F. Cérou and A. Guyader, Nearest neighbor classification in infinite dimension, ESAIM Probab. Stat, vol.10, pp.340-355, 2006.

L. Devroye, On the almost everywhere convergence of nonparametric regression function estimates, Ann. Statist, vol.9, issue.6, pp.1310-1319, 1981.

L. Devroye, L. Györfi, A. Krzyżak, and G. Lugosi, On the strong universal consistency of nearest neighbor regression function estimates, Ann. Statist, vol.22, issue.3, pp.1371-1385, 1994.

L. Devroye, L. Györfi, and G. Lugosi, A probabilistic theory of pattern recognition, Applications of Mathematics, vol.31, Springer-Verlag, 1996.

E. Fix and J. L. Hodges, Discriminatory analysis, nonparametric discrimination: consistency properties, Technical Report 4, USAF School of Aviation Medicine, Randolph Field, Texas, 1951.

Y. Freund and R. E. Schapire, A decision-theoretic generalization of on-line learning and an application to boosting, J. Comput. System Sci, vol.55, issue.1, pp.119-139, 1997; preliminary version in Second Annual European Conference on Computational Learning Theory (EuroCOLT '95), 1995.

L. Györfi, M. Kohler, A. Krzyżak, and H. Walk, A distribution-free theory of nonparametric regression, Springer Series in Statistics, 2002.

A. Goldenshluger and O. Lepski, On adaptive minimax density estimation on R^d, Probab. Theory Related Fields, vol.159, pp.479-543, 2014.
URL : https://hal.archives-ouvertes.fr/hal-01265245

L. Györfi, On the rate of convergence of nearest neighbor rules, IEEE Trans. Inform. Theory, vol.24, issue.4, pp.509-512, 1978.

J. Hüsler, R. Y. Liu, and K. Singh, A formula for the tail probability of a multivariate normal distribution and its applications, J. Multivariate Anal, vol.82, issue.2, pp.422-430, 2002.

P. Hall, B. U. Park, and R. J. Samworth, Choice of neighbor order in nearest-neighbor classification, Ann. Statist, vol.36, issue.5, pp.2135-2152, 2008.

G. Lecué, Simultaneous adaptation to the margin and to complexity in classification, Ann. Statist, vol.35, issue.4, pp.1698-1721, 2007.

H. Lian, Convergence of functional k-nearest neighbor regression estimate with functional responses, Electron. J. Stat, vol.5, pp.31-40, 2011.

S. Loustau and C. Marteau, Minimax fast rates for discriminant analysis with errors in variables, Bernoulli, vol.21, issue.1, pp.176-208, 2015.
URL : https://hal.archives-ouvertes.fr/hal-00660383

W. V. Li and Q. Shao, Gaussian processes: inequalities, small ball probabilities and applications, Stochastic processes: theory and methods, pp.533-597, 2001.

E. Mammen and A. B. Tsybakov, Smooth discrimination analysis, Ann. Statist, vol.27, issue.6, pp.1808-1829, 1999.

P. Reynaud-Bouret, V. Rivoirard, and C. Tuleau-Malot, Adaptive density estimation: a curse of support?, J. Statist. Plann. Inference, vol.141, issue.1, pp.115-139, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00634421

R. Samworth, Optimal weighted nearest neighbour classifiers, Annals of Statistics, vol.40, pp.2733-2763, 2012.

I. Steinwart, Consistency of support vector machines and other regularized kernel classifiers, IEEE Trans. Inform. Theory, vol.51, issue.1, pp.128-142, 2005.

C. J. Stone, Consistent nonparametric regression (with discussion and a reply by the author), Ann. Statist, vol.5, issue.4, pp.595-645, 1977.

A. B. Tsybakov, Optimal aggregation of classifiers in statistical learning, Ann. Statist, vol.32, issue.1, pp.135-166, 2004.
URL : https://hal.archives-ouvertes.fr/hal-00102142

A. B. Tsybakov, Introduction to nonparametric estimation, Springer Series in Statistics, 2009. Revised and extended from the 2004 French original.

V. N. Vapnik, Statistical learning theory, Adaptive and Learning Systems for Signal Processing, Communications, and Control, Wiley, 1998.

E-mail: sebastien.gadat@math.univ-toulouse.fr
Institut de Mathématiques de Toulouse, Université Toulouse 3 Paul Sabatier, 118 route de Narbonne, 31400 Toulouse