Support vector machine
A support vector machine (SVM) is a supervised machine learning algorithm widely used for classification and regression problems. It is based on the concept of decision planes that define decision boundaries: a decision plane separates sets of objects with different class memberships.
Overview
SVM works by mapping data to a high-dimensional feature space so that data points can be categorized even when they are not linearly separable in the original space. The data are transformed so that a separator between the categories can be drawn as a hyperplane; the characteristics of new data can then be used to predict the group to which a new record belongs.
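The decision rule described above can be sketched as follows. This is a minimal illustration, not a trained SVM: the weight vector and offset are illustrative assumptions rather than values learned from data.

```python
import numpy as np

# Classify points by which side of a fixed separating hyperplane
# w·x - b = 0 they fall on. w and b here are illustrative, not learned.
w = np.array([1.0, -1.0])  # normal vector to the hyperplane
b = 0.0                    # offset from the origin

def predict(x):
    """Return +1 or -1 depending on the side of the hyperplane."""
    return 1 if np.dot(w, x) - b >= 0 else -1

print(predict(np.array([2.0, 1.0])))  # prints 1
print(predict(np.array([0.0, 3.0])))  # prints -1
```

In a real SVM, `w` and `b` are chosen by the training procedure so that the margin between the classes is as wide as possible.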
History
The origins of SVM can be traced back to the work of Vladimir Vapnik and Alexey Chervonenkis in the 1960s. Initially developed for binary classification, the algorithm has since been extended to multi-class classification and to regression, the latter known as Support Vector Regression (SVR).
Mathematical Formulation
At its core, the SVM algorithm seeks the hyperplane that best separates the classes in feature space. In the linear case, the separating hyperplane is the set of points \(\mathbf{x}\) satisfying:
\[ \mathbf{w} \cdot \mathbf{x} - b = 0 \]
where \(\mathbf{w}\) is the normal vector to the hyperplane and \(b\) is its offset from the origin. The best hyperplane is the one that maximizes the margin to the nearest training points of both classes. SVM uses the hinge loss function and incorporates slack variables to handle non-linearly separable data with a soft margin.
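The hinge loss mentioned above can be sketched directly. For a label \(y \in \{-1, +1\}\) and decision value \(f(\mathbf{x}) = \mathbf{w} \cdot \mathbf{x} - b\), the loss is \(\max(0, 1 - y f(\mathbf{x}))\); a point's slack variable is exactly this quantity, positive whenever the point lies inside the margin or on the wrong side.

```python
# Sketch of the hinge loss used by soft-margin SVM. For label y in {-1, +1}
# and decision value f = w·x - b, the loss is max(0, 1 - y*f): zero when the
# point is correctly classified outside the margin, positive otherwise.
def hinge_loss(y, f):
    return max(0.0, 1.0 - y * f)

print(hinge_loss(+1, 2.0))   # correctly classified, outside margin -> 0.0
print(hinge_loss(+1, 0.5))   # correct side but inside margin       -> 0.5
print(hinge_loss(+1, -1.0))  # misclassified                        -> 2.0
```

The training objective trades off the total hinge loss against the margin width, with a regularization parameter controlling the balance.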
Kernel Trick
The kernel trick lets an SVM operate in a higher-dimensional feature space, where a linear separator suffices, without explicitly computing the transformation: the kernel function evaluates inner products in that space directly. Common kernels include the linear, polynomial, radial basis function (RBF), and sigmoid kernels.
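As a sketch, the RBF kernel \(k(\mathbf{x}, \mathbf{z}) = \exp(-\gamma \lVert \mathbf{x} - \mathbf{z} \rVert^2)\) can be computed directly from the two input points; the value of `gamma` below is an illustrative choice, not a recommended default.

```python
import math

# RBF kernel k(x, z) = exp(-gamma * ||x - z||^2). The result equals an inner
# product in an implicit feature space, so that mapping is never computed.
def rbf_kernel(x, z, gamma=0.5):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel([1.0, 0.0], [1.0, 0.0]))  # identical points -> 1.0
print(rbf_kernel([0.0, 0.0], [2.0, 0.0]))  # squared distance 4 -> exp(-2)
```

Because the kernel depends only on the two inputs, the SVM's decision function can be written entirely in terms of kernel evaluations against the support vectors.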
Applications
SVMs are used in applications such as face detection, handwriting recognition, image classification, and bioinformatics. They are favored for their effectiveness in high-dimensional spaces and their ability to model non-linear decision boundaries through kernels.
Advantages and Disadvantages
Advantages of SVMs include their accuracy in high-dimensional spaces and their versatility through the kernel trick. However, they can be computationally intensive on large datasets, and their performance depends heavily on the choice of kernel and regularization parameters.
See Also
- Machine Learning
- Classification (machine learning)
- Regression analysis
- Kernel (computing)
- Feature space
Credits: Most images are courtesy of Wikimedia Commons, and templates of Wikipedia, licensed under CC BY-SA or similar.
Contributors: Prab R. Tumpati, MD