
Limitations of K-nearest Neighbor Classification

Words: 541 | Page: 1 | 3 min read

Published: Apr 30, 2020

K-Nearest Neighbor (KNN) is one of the most popular algorithms for pattern recognition. Many researchers have found that the KNN algorithm achieves very good performance in experiments on different data sets. The traditional KNN classification algorithm has three limitations: (i) high calculation complexity, since all the training samples are used for classification; (ii) performance that depends solely on the training set; and (iii) the need to select k.


Nearest neighbor search is one of the most popular learning and classification techniques, introduced by Fix and Hodges, and it has proved to be a simple and powerful recognition algorithm. Cover and Hart showed that the decision rule performs well even when no explicit knowledge of the data is available. A simple generalization of this method is the K-NN rule, in which a new pattern is classified into the class with the most members among its K nearest neighbors.
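The K-NN rule described above can be sketched in a few lines. This is a minimal illustration, not an optimized implementation; it assumes numeric feature vectors and uses Euclidean distance as the similarity measure.

```python
from collections import Counter
import math

def knn_classify(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training samples."""
    # Compute the distance from the query to every training sample
    # (this exhaustive scan is exactly limitation (i) discussed below).
    dists = [(math.dist(x, query), y) for x, y in zip(train, labels)]
    dists.sort(key=lambda d: d[0])
    top_k = [y for _, y in dists[:k]]
    # Majority vote among the k nearest neighbors.
    return Counter(top_k).most_common(1)[0][0]

# Toy example: two well-separated clusters.
train = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_classify(train, labels, [0.5, 0.5], k=3))  # -> a
```

Note that classification requires scanning the entire training set for every query, which is the source of the complexity limitation discussed next.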

The traditional KNN text classification has three limitations:


  1. High calculation complexity: To find the k nearest neighbor samples, all the similarities between the test sample and every training sample must be calculated. When the number of training samples is small this cost is acceptable, but if the training set contains a huge number of samples, the KNN classifier needs much more time to calculate the similarities. This problem can be addressed in three ways: reducing the dimensionality of the feature space; using a smaller training set; or using an improved algorithm that accelerates the neighbor search;
  2. Dependency on the training set: The classifier is generated using only the training samples and does not use any additional data. This makes the algorithm depend excessively on the training set; it needs recalculation even if there is a small change to the training set;
  3. No weight difference between samples: All the training samples are treated equally; there is no distinction between samples backed by a small amount of data and those backed by a large amount. This does not match real data, where samples commonly have an uneven distribution.
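One common mitigation for the third limitation is distance-weighted voting, where nearer neighbors contribute more than distant ones instead of every neighbor counting equally. The sketch below assumes numeric features and Euclidean distance, with a simple inverse-distance weight; other weighting schemes are possible.

```python
from collections import defaultdict
import math

def weighted_knn_classify(train, labels, query, k=3):
    """Distance-weighted k-NN: closer neighbors get larger votes."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    votes = defaultdict(float)
    for d, y in dists[:k]:
        # Inverse-distance weight; the small epsilon avoids division by zero
        # when the query coincides with a training sample.
        votes[y] += 1.0 / (d + 1e-9)
    return max(votes, key=votes.get)

# One very close "a" outweighs two distant "b" neighbors.
train = [[0.0, 0.0], [2.0, 0.0], [2.1, 0.0]]
labels = ["a", "b", "b"]
print(weighted_knn_classify(train, labels, [0.1, 0.0], k=3))  # -> a
```

With a plain (unweighted) majority vote, the same three neighbors would have produced "b", so the weighting visibly changes the decision when samples are unevenly distributed.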

The efficiency of kNNC depends largely upon the effective selection of the k nearest neighbors. The limitation of conventional kNNC is that once we choose the criteria for selecting the k nearest neighbors, the criteria remain unchanged. This characteristic of kNNC is not suitable for many cases where we want to make a correct prediction or classification in real life. An instance is described in the database by a number of attributes and the corresponding values of those attributes, so the similarity between any two instances is determined by the similarity of their attribute values. But in real-life data, when we describe two instances and try to find the similarity between them, similarities in different attributes do not carry the same weight with respect to a particular classification. Moreover, as more training data arrives over time, it may happen that similarity in a particular attribute value carries more or less importance than before. For example, say we are trying to predict the outcome of a soccer game based on previous results. In that prediction, the venue and the weather play a very important role in the outcome of the game. But if in the future all soccer games are played in indoor stadiums, then the weather is no longer going to have the same effect on the outcome of the game.
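The idea of attributes carrying different importance can be expressed as a weighted distance function. The sketch below is illustrative only: the attribute weights are assumed inputs that would in practice be learned or updated as new training data arrives, and setting a weight to zero removes that attribute's influence entirely (as with weather for indoor games).

```python
import math

def weighted_distance(x, y, attr_weights):
    """Euclidean distance in which each attribute has its own importance weight.

    `attr_weights` is a hypothetical per-attribute weight vector; relearning it
    over time lets the classifier adapt when an attribute loses relevance.
    """
    return math.sqrt(sum(w * (a - b) ** 2
                         for a, b, w in zip(x, y, attr_weights)))

# Equal weights reduce to ordinary Euclidean distance.
print(weighted_distance([0, 0], [3, 4], [1, 1]))  # -> 5.0
# Zeroing the second attribute's weight makes it irrelevant to similarity.
print(weighted_distance([0, 0], [3, 4], [1, 0]))  # -> 3.0
```

Plugging such a distance into the neighbor search turns fixed selection criteria into adaptable ones, which is exactly the flexibility the paragraph above argues conventional kNNC lacks.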

This essay was reviewed by Dr. Charlotte Jacobson
