Support Vector Machines
Support Vector Machines (SVMs) are a class of machine learning algorithms used for classification and regression tasks. In a classification problem, an SVM aims to find an optimal hyperplane that separates the different classes of data points with the largest possible margin. The hyperplane is a decision boundary that maximizes the distance to the nearest data points from each class, which are known as the support vectors.

SVMs can handle both linearly separable and non-linearly separable data by employing various kernel functions. The idea is to transform the input data into a higher-dimensional feature space using the kernel trick, which allows the algorithm to find a linear separation in the transformed space. This transformation enables SVMs to handle complex decision boundaries that are not linear in the original feature space.

During the training phase, an SVM finds the optimal hyperplane by solving a quadratic optimization problem: minimize the classification error while maximizing the margin, where the margin is the perpendicular distance between the hyperplane and the support vectors. By maximizing the margin, SVMs tend to produce a more robust and generalized model.

In addition to binary classification, SVMs can be extended to handle multi-class problems using techniques such as one-vs-one or one-vs-all classification. SVMs also have a formulation for regression tasks, known as Support Vector Regression (SVR), where the objective is to fit the data while keeping deviations within a certain margin.

SVMs have several advantages, such as their ability to handle high-dimensional data, effective generalization, and resistance to overfitting. However, they can be computationally expensive, particularly with large datasets, and selecting an appropriate kernel and its parameters can be challenging.
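To make the ideas above concrete, here is a minimal sketch of training an SVM classifier with scikit-learn. The synthetic dataset, variable names, and parameter choices (kernel, C) are illustrative assumptions, not part of the original article.

# Minimal SVM classification sketch using scikit-learn (illustrative assumptions throughout).
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two well-separated clusters stand in for a real dataset.
X, y = make_blobs(n_samples=200, centers=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# C trades off a wider margin against misclassified training points.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)

print("Test accuracy:", clf.score(X_test, y_test))
print("Support vectors per class:", clf.n_support_)

The support vectors reported at the end are exactly the nearest points from each class that define the margin described above.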
The following are some useful YouTube videos:
Part 1
A rundown of what SVMs are and what they can be used for, covering all of the necessary background to get started with using SVMs.
Support Vector Machines Main Ideas: https://www.youtube.com/watch?v=efR1C6CvhmE
Part 2
A mathematical description of the Polynomial Kernel and how SVMs use this type of kernel to classify data. The video describes how the Polynomial Kernel separates data in a set, finite number of dimensions.

The Polynomial Kernel: https://www.youtube.com/watch?v=Toet3EiSFcM
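As a quick illustration (not from the video), the polynomial kernel can be written as k(x, y) = (gamma * <x, y> + coef0) ** degree. The sketch below checks that formula by hand against scikit-learn's implementation; all parameter values and sample points are arbitrary assumptions.

# Polynomial kernel sketch: (gamma * <x, y> + coef0) ** degree (values are illustrative).
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel
from sklearn.svm import SVC

x = np.array([[1.0, 2.0]])
y = np.array([[3.0, 4.0]])
degree, coef0, gamma = 2, 1.0, 1.0

manual = (gamma * (x @ y.T) + coef0) ** degree            # computed by hand
library = polynomial_kernel(x, y, degree=degree, gamma=gamma, coef0=coef0)
print(manual, library)                                    # both print the same value

# The same kernel plugged into an SVM classifier:
clf = SVC(kernel="poly", degree=degree, gamma=gamma, coef0=coef0)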
Part 3
A mathematical description of the Radial Kernel and how SVMs use this type of kernel to classify data. The video describes how the RBF kernel effectively classifies data in an infinite number of dimensions.

The Radial (RBF) Kernel: https://www.youtube.com/watch?v=Qc5IyLW_hns
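For reference (again, not from the video), the radial basis function kernel is k(x, y) = exp(-gamma * ||x - y||^2). The sketch below verifies that formula against scikit-learn and shows where gamma plugs into an RBF-kernel SVM; the sample points and gamma value are illustrative assumptions.

# RBF kernel sketch: exp(-gamma * ||x - y||^2) (points and gamma are illustrative).
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

x = np.array([[1.0, 2.0]])
y = np.array([[3.0, 4.0]])
gamma = 0.5

manual = np.exp(-gamma * np.sum((x - y) ** 2))  # computed by hand
library = rbf_kernel(x, y, gamma=gamma)
print(manual, library)                          # identical values

# Larger gamma makes similarity decay faster with distance, giving tighter, more local fits.
clf = SVC(kernel="rbf", gamma=gamma)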