The K-nearest neighbours (KNN) classifier is a type of instance-based learning, or lazy learning, in which the function is approximated only locally and all computation is deferred until classification. KNN is therefore among the simplest of all discriminative approaches, yet it remains especially effective in low-dimensional feature spaces.
The input to the KNN algorithm consists of the K closest training samples in the feature space, and the output is a class label l. An observation (or test sample) f(y) is classified by a majority vote of its neighbours: it is assigned the class most common among its K nearest neighbours. In the case K = 1, the observation f(y) is simply assigned the class of its single nearest neighbour.
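The majority-vote rule above can be sketched as follows; this is a minimal illustration (the function name, the toy data, and the use of Euclidean distance are assumptions, not part of the original text):

```python
from collections import Counter
import math

def knn_classify(train_x, train_y, query, k):
    """Classify `query` by a majority vote over its k nearest training samples."""
    # Euclidean distance from the query to every training sample.
    dists = [math.dist(x, query) for x in train_x]
    # Indices of the k closest training samples.
    nearest = sorted(range(len(train_x)), key=lambda i: dists[i])[:k]
    # Majority vote among the classes of those neighbours.
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

train_x = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
train_y = ["a", "a", "a", "b", "b", "b"]
print(knn_classify(train_x, train_y, (0.5, 0.5), 3))  # prints "a"
```

With K = 1 the same function reduces to nearest-neighbour assignment, since the vote contains a single class.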
In order to estimate the potentials, we treat the class of each neighbour as a vote for the most likely class of the observation. If the number of neighbours having class l is \(K_l\), we can define the probability of the association potentials as:
\(P(l \mid f(y)) = \frac{K_l}{K}\)
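This estimate is just the fraction of the K neighbours carrying each class label. A small sketch of the computation, under the same assumptions as before (hypothetical names, toy data, Euclidean distance):

```python
from collections import Counter
import math

def knn_class_probabilities(train_x, train_y, query, k):
    """Estimate P(l | f(y)) = K_l / K from the k nearest neighbours of `query`."""
    dists = [math.dist(x, query) for x in train_x]
    nearest = sorted(range(len(train_x)), key=lambda i: dists[i])[:k]
    # K_l: number of the k neighbours having class l.
    counts = Counter(train_y[i] for i in nearest)
    return {label: counts[label] / k for label in counts}

train_x = [(0, 0), (1, 0), (0, 1), (5, 5)]
train_y = ["a", "a", "a", "b"]
print(knn_class_probabilities(train_x, train_y, (0.4, 0.4), 4))
# prints {'a': 0.75, 'b': 0.25}
```

The returned values sum to 1 over the classes present among the neighbours, so they can be used directly as association-potential probabilities.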