ENG 1001 Math Video - Vectors


Page 1 (0s)

[Audio] Hi Prof Elisa, in this video we will be discussing vectors.

Page 2 (4s)

[Audio] These are the contents that we will be covering in this video.

Page 3 (8s)

[Audio] Vectors are quantities with two independent properties: magnitude and direction. They are usually represented by an arrow whose direction is the same as that of the quantity and whose length is proportional to the quantity's magnitude. Some formulas are shown below, including the magnitude, the dot product, and the cross product.
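The three formulas the slide mentions can be sketched in plain Python; this is a minimal illustration of the standard definitions, not code from the video:

```python
import math

def magnitude(v):
    # |v| = sqrt(v1^2 + v2^2 + ...), i.e. Pythagoras' theorem in n dimensions
    return math.sqrt(sum(c * c for c in v))

def dot(a, b):
    # a . b = sum of component-wise products
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    # cross product, defined for 3-D vectors; result is perpendicular to both
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

print(magnitude((3, 4)))            # 5.0
print(dot((1, 2, 3), (4, 5, 6)))    # 32
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```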

Page 4 (29s)

[Audio] So, what are Support Vector Machines, and how can vectors be incorporated into Support Vector Machines?

Page 5 (37s)

[Audio] A Support Vector Machine (SVM) is a supervised machine learning algorithm usually used for classification. It makes use of the known data points to create a hyperplane that best separates the different classes in a dataset. The support vectors are the data points nearest to the hyperplane, and they are critical in determining the margin around the hyperplane. A kernel is a function that helps transform the dataset into a form that can be separated, typically through the dot product or other mathematical operations. The algorithm first plots the known data points, determines which points are the support vectors, and then fixes the hyperplane based on the resulting margin. So how does an SVM find the margin?
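The separating hyperplane described above boils down to a simple decision rule once w and b are known. The sketch below shows only that classification step (training, i.e. finding w and b, is what the SVM optimisation does and is omitted here); the hyperplane and points are made-up examples:

```python
def classify(w, b, x):
    # decision function of a linear classifier: sign(w . x + b)
    # points on one side of the hyperplane get +1, the other side -1
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

# hyperplane x1 - x2 = 0 (the line y = x), with normal w = (1, -1) and b = 0
w, b = (1.0, -1.0), 0.0
print(classify(w, b, (3.0, 1.0)))  # 1: below the line y = x
print(classify(w, b, (1.0, 3.0)))  # -1: above the line y = x
```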

Page 6 (1m 22s)

[Audio] Let us assume we have a hyperplane whose equation is y = x, and that the vector W is normal to the hyperplane. Point A is at (0, 2), and the angle theta is the angle between W and A. With that, we can find our margin.

Page 7 (1m 41s)

[Audio] First, we find the magnitude of W, which is the square root of 2, using Pythagoras' theorem. Next, we find the unit vector V in the direction of W, which is W divided by the length of W. Then we need the shortest distance from point A to the hyperplane; we can use the dot product of A with V to find that distance, as in the formula above. Finally, we multiply that distance by 2 to get the margin.
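The steps on this slide can be reproduced numerically. One assumption is needed that the transcript leaves implicit: a normal to the line y = x with magnitude sqrt(2) is taken here as W = (-1, 1) (any scalar multiple would work the same way up to sign):

```python
import math

# hyperplane y = x through the origin; assumed normal W = (-1, 1)
W = (-1.0, 1.0)
A = (0.0, 2.0)

norm_w = math.sqrt(W[0] ** 2 + W[1] ** 2)   # |W| = sqrt(2) by Pythagoras
V = (W[0] / norm_w, W[1] / norm_w)          # unit vector in the direction of W
distance = A[0] * V[0] + A[1] * V[1]        # A . V = shortest distance from A to the line
margin = 2 * distance                       # double it for the full margin

print(norm_w)    # sqrt(2) ~ 1.4142
print(distance)  # 2/sqrt(2) = sqrt(2) ~ 1.4142
print(margin)    # 2*sqrt(2) ~ 2.8284
```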

Page 8 (2m 12s)

[Audio] Moving on, the main purpose of SVM is the classification of data. To classify data, it identifies a hyperplane within a high-dimensional space that effectively separates the data into distinct classes. The goal is to maximise the distance between the hyperplane and the closest data points.
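The "distance between the hyperplane and the closest data point" can be written as |w . x + b| / |w| for each point x; the margin candidate is the minimum over all points. A small sketch with made-up points on either side of the line y = x:

```python
import math

def distance_to_hyperplane(w, b, x):
    # unsigned point-to-hyperplane distance: |w . x + b| / |w|
    numerator = abs(sum(wi * xi for wi, xi in zip(w, x)) + b)
    return numerator / math.sqrt(sum(wi * wi for wi in w))

# hypothetical data points around the line y = x (w = (1, -1), b = 0)
points = [(3.0, 1.0), (1.0, 3.0), (2.0, 1.0)]
w, b = (1.0, -1.0), 0.0

# the closest point to the hyperplane determines the margin
closest = min(distance_to_hyperplane(w, b, p) for p in points)
print(closest)  # 1/sqrt(2) ~ 0.7071, from the point (2, 1)
```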

Page 9 (2m 34s)

[Audio] Now, let us go through the practical applications of Support Vector Machines (SVMs) in real-world scenarios. In text classification, each word or n-gram in a document is treated as a feature, creating a high-dimensional vector space. SVMs work well in high-dimensional spaces and effectively separate data points into different categories or classes. SVMs can be applied to image classification, particularly in scenarios like Optical Character Recognition (OCR). Handwritten character recognition is an application where SVMs can classify the characters in images. SVMs work by finding the optimal decision boundary to separate the different characters or objects in an image. SVMs can also be used in face detection. The SVM model is trained with positive samples, like images containing faces, and negative samples, like images without faces. Once trained, it can efficiently classify image regions as faces or non-faces, often creating a bounding box around the detected faces. SVMs are adaptable and can handle both linear and nonlinear problems using kernel functions, which is especially valuable in real-world circumstances where data may not be linearly separable.

Page 10 (3m 58s)

[Audio] Moving on to the advantages of SVM. The first advantage is high accuracy, as SVMs maximise the margins between the different classes. The second advantage is that SVMs are robust to overfitting, meaning they are less prone to overfitting compared with other machine learning algorithms. Next, SVMs can be interpreted easily: the algorithm finds the data points that determine the decision boundary, which makes it easier to understand the importance of different data points in the model. Finally, SVMs rest on a well-studied theory; they are based on solid mathematical foundations and a well-developed theoretical framework, which aids comprehension.

Page 11 (4m 43s)

[Audio] There are three limitations of SVMs that our group concluded. First, SVMs are sensitive to noisy data, because outliers can shift the hyperplane. Secondly, if there are more than two expected output classes, binary classification becomes less efficient. Lastly, an imbalanced dataset, where one class has far more data points than another, can bias the classifier.

Page 12 (5m 9s)

[Audio] To summarise, we have seen how vectors can be incorporated into Support Vector Machines. Recall that these are the formulas used to show how margins are calculated in a Support Vector Machine.

Page 13 (5m 23s)

[Audio] Thank you for your attention, have a great day ahead!

Page 14 (5m 28s)

[Audio] Citations:

Saeed, M. (2022). Method of Lagrange multipliers: The theory behind Support Vector Machines (Part 1: The separable case). MachineLearningMastery.com. Available at: https://machinelearningmastery.com/method-of-lagrange-multipliers-the-theory-behind-support-vector-machines-part-1-the-separable-case/ (Accessed: 22 October 2023)

Support Vector Machine in machine learning (2023). GeeksforGeeks. Available at: https://www.geeksforgeeks.org/support-vector-machine-in-machine-learning/ (Accessed: 21 October 2023)

Sethi, A. (2023). Support Vector Regression tutorial for machine learning. Analytics Vidhya. Available at: https://www.analyticsvidhya.com/blog/2020/03/support-vector-regression-tutorial-for-machine-learning/ (Accessed: 23 October 2023)

Kowalczyk, A. (2020). SVM - Understanding the math - What is a vector? Available at: https://www.svm-tutorial.com/2014/11/svm-understanding-math-part-2/ (Accessed: 28 October 2023)

Kowalczyk, A. (2023). SVM - Understanding the math: the optimal hyperplane. Available at: https://www.svm-tutorial.com/2015/06/svm-understanding-math-part-3/ (Accessed: 28 October 2023)