Introduced in 1992, Support Vector Machines marked a breakthrough in the theory of learning systems. Rooted in the Statistical Learning Theory developed by Vladimir Vapnik, they quickly gained attention from the pattern recognition community for their theoretical and computational merits.
Health Discovery Corporation's SVM technology outperforms even advanced statistical modeling methodologies such as neural networks. Neural networks suffer from a limited ability to handle high-dimensional data and can effectively analyze it in only two or three dimensions at a time. Support Vector Machines, by contrast, can process very large amounts of data and find separations and delineations in high-dimensional spaces.
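The idea of finding separations in higher dimensions can be illustrated with a minimal sketch. This is not Health Discovery Corporation's implementation; it is a toy example showing the principle behind SVM kernels: data that no straight line can separate in its original space can become linearly separable after being mapped into a higher-dimensional feature space. The feature map and data below are hypothetical choices for illustration.

```python
# Illustrative sketch of high-dimensional separation (not the Company's code).
# The XOR-labelled points below cannot be separated by any line in the
# original 2-D plane, but adding a single product feature x*y lifts them
# into 3-D, where the plane z = 0 separates the classes perfectly.

points = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
labels = [+1, -1, -1, +1]  # XOR pattern: coordinates with equal signs -> +1

def feature_map(x, y):
    """Map a 2-D point into 3-D by appending the product feature x*y."""
    return (x, y, x * y)

def classify(x, y):
    """Separate the lifted points with the hyperplane z = 0."""
    _, _, z = feature_map(x, y)
    return 1 if z > 0 else -1

predictions = [classify(x, y) for x, y in points]
print(predictions == labels)  # prints True: the lifted separator is perfect
```

SVMs generalize this idea with kernel functions, which compute inner products in such lifted spaces, including ones of very high or infinite dimension, without ever constructing the feature vectors explicitly.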
Statistical Learning Theory, the backbone of Support Vector Machines, provides a new framework for modeling learning algorithms, merges the fields of machine learning and statistics, and has inspired algorithms that overcome the difficulties described above. A new generation of learning algorithms, or equivalently of statistical methods, has recently been developed on the basis of this theory.
The Company’s SVM technology is commonly considered within the context of artificial intelligence, a branch of computer science concerned with giving computers the ability to perform functions normally associated with human intelligence, such as reasoning and optimization through experience. Machine learning is a type of artificial intelligence that enables the development of algorithms and techniques that allow computers to learn. Pattern recognition is a branch of machine learning with a wide spectrum of applications, including medical diagnosis, bioinformatics, classifying DNA sequences, detecting credit card fraud, stock market analysis, object recognition in computer vision, and robot locomotion.