Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
35th (2021)
Session ID : 1G4-GS-2c-01

A Study on Regression and Loss Functions for Multiscale k-Nearest Neighbour
*Ruixing CAO, Takuma TANAKA, Akifumi OKUNO, Hidetoshi SHIMODAIRA
Abstract

k-nearest neighbour (k-NN) regression averages the labels over a query ball whose radius r_k grows with k, and this non-zero radius biases the k-NN estimator. To reduce the bias, multiscale k-NN (MS-k-NN) first solves ordinary least squares (OLS) to predict the k-NN estimates at several values k = k_1, k_2, ..., k_V from even-degree polynomials of the radius r_k, and then extrapolates the estimator to r = 0. However, two practical problems remain: (i) the polynomial used for extrapolation is derived from asymptotic theory, so in finite-sample settings the MS-k-NN estimate with even-degree polynomials is not necessarily restricted to the proper range [0, 1]; (ii) OLS implicitly assumes that the k-NN estimators at k = k_1, k_2, ..., k_V are independent, whereas estimators sharing some of the same labels are dependent. To solve these problems, we propose employing sigmoid-based functions and generalized least squares (GLS). We also propose local radial logistic regression (LRLR), which is inspired by MS-k-NN.
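To make the extrapolation step concrete, the following is a minimal sketch of the baseline MS-k-NN estimator described above, not the authors' code: it computes k-NN label averages at several k, regresses them on even powers of the ball radius r_k via OLS, and reads off the fitted value at r = 0. The particular k-values, the polynomial degree, and the binary labels y in {0, 1} are illustrative assumptions.

```python
# Hypothetical sketch of MS-k-NN extrapolation (illustration only).
import numpy as np
from sklearn.neighbors import NearestNeighbors

def ms_knn_estimate(X, y, query, ks=(5, 10, 20, 40), degree=4):
    """Extrapolate k-NN label averages to radius r = 0 by OLS
    on even-degree polynomials of the query-ball radius r_k."""
    nn = NearestNeighbors(n_neighbors=max(ks)).fit(X)
    dist, idx = nn.kneighbors(query.reshape(1, -1))
    dist, idx = dist[0], idx[0]                       # sorted by distance
    r = np.array([dist[k - 1] for k in ks])           # radius r_k of each query ball
    yhat = np.array([y[idx[:k]].mean() for k in ks])  # k-NN estimate at each k
    # Design matrix with even powers of r_k: [1, r^2, r^4, ...]
    B = np.vstack([r ** p for p in range(0, degree + 1, 2)]).T
    beta, *_ = np.linalg.lstsq(B, yhat, rcond=None)
    return beta[0]  # fitted polynomial evaluated at r = 0
```

Note that the returned intercept beta[0] is unconstrained, which illustrates problem (i) above: nothing forces it into [0, 1]. The sigmoid-based functions and GLS proposed in the abstract address this and the dependence issue (ii), respectively.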

© 2021 The Japanese Society for Artificial Intelligence