Host: The Japanese Society for Artificial Intelligence
Name : The 35th Annual Conference of the Japanese Society for Artificial Intelligence
Number : 35
Location : [in Japanese]
Date : June 08, 2021 - June 11, 2021
k-nearest neighbour (k-NN) takes the label average over a query ball, whose radius r<sub>k</sub> increases with larger k; this non-zero radius introduces a bias into the k-NN estimator. To reduce the bias, multiscale k-NN (MS-k-NN) first solves ordinary least squares (OLS) to predict the k-NN estimates at k=k<sub>1</sub>, k<sub>2</sub>, ..., k<sub>V</sub> from even-degree polynomials of the radius r<sub>k</sub>, and then extrapolates the estimator to r=0. However, two practical problems remain: (i) the polynomial used for extrapolation is derived from asymptotic theory, so in finite-sample situations the MS-k-NN estimator with even-degree polynomials is not necessarily confined to the proper range [0, 1]; (ii) OLS implicitly assumes that the k-NN estimators at k=k<sub>1</sub>, k<sub>2</sub>, ..., k<sub>V</sub> are independent, whereas estimators sharing some of the same labels are dependent. To solve these problems, we propose employing sigmoid-based functions and generalized least squares. We also propose local radial logistic regression (LRLR), which is inspired by MS-k-NN.
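The baseline MS-k-NN procedure described above can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it assumes Euclidean distance, a hypothetical function name `ms_knn_estimate`, an arbitrary choice of k values, and plain OLS via `numpy.linalg.lstsq` (i.e., before the paper's proposed sigmoid/GLS corrections).

```python
import numpy as np

def ms_knn_estimate(X, y, query, ks=(5, 10, 20, 40), degree=2):
    """Multiscale k-NN sketch: extrapolated label estimate at `query`.

    Computes the k-NN label average at several values of k, regresses
    those estimates on even powers of the k-th neighbour radius r_k by
    OLS, and returns the intercept, i.e. the value extrapolated to r=0.
    All parameter choices here are illustrative assumptions.
    """
    d = np.linalg.norm(X - query, axis=1)   # Euclidean distances
    order = np.argsort(d)
    radii, estimates = [], []
    for k in ks:
        idx = order[:k]
        radii.append(d[order[k - 1]])       # radius r_k of the query ball
        estimates.append(y[idx].mean())     # ordinary k-NN label average
    r = np.asarray(radii)
    # design matrix with even-degree polynomial terms: [1, r^2, r^4, ...]
    A = np.vstack([r ** (2 * j) for j in range(degree + 1)]).T
    coef, *_ = np.linalg.lstsq(A, np.asarray(estimates), rcond=None)
    return coef[0]  # extrapolation to r = 0
```

Note that nothing constrains the returned intercept to lie in [0, 1], which illustrates problem (i), and the k-NN averages for nested neighbourhoods share labels, which illustrates the dependence in problem (ii).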