In this tutorial, we will learn our next machine-learning model: nearest neighbors, best known in its k-nearest neighbors (k-NN) form. It belongs to the family of supervised learning algorithms, specifically the category of instance-based learning. Unlike many models that involve complex mathematical equations and intricate training procedures, k-NN relies on a beautifully simple principle: similar instances are likely to share common characteristics.
Traditional machine learning models go through a training phase in which they learn patterns and relationships from the provided data. In contrast, k-NN has no distinct training phase: it simply memorizes the entire training dataset, storing it for later use when making predictions. This is why it is called a lazy learning algorithm. Nearest neighbors can be used for both supervised tasks (classification and regression) and unsupervised tasks (such as nearest-neighbor search).
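As a quick illustration of this lazy-learning behavior, here is a minimal sketch using scikit-learn's `KNeighborsClassifier` on the built-in Iris dataset. The dataset, the choice of `k=5`, and the train/test split are just example choices, not part of the tutorial's own setup; the point is that `fit()` essentially stores the training data, while all the distance computation happens at prediction time.

```python
# Minimal k-NN sketch: "fitting" mostly memorizes the training data,
# and the real work (distance computation + voting) happens at predict time.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Example dataset and split (illustrative choices)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# k = 5 neighbors; fit() stores (X_train, y_train) for later lookup
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

# Each prediction finds the 5 closest training points and takes a
# majority vote over their labels
accuracy = knn.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Note that because every prediction scans (or searches an index of) the stored training set, k-NN trades cheap training for more expensive inference.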
Let’s get started and learn about our next model called nearest neighbors.
Prerequisites:
- Python, NumPy, scikit-learn, Pandas, and Matplotlib.
- Linear Algebra For Machine Learning.
- Statistics And Probability Theory.