====== Research Topics ======
  
The MinD research group develops algorithms for data mining and machine learning, with a focus on large-scale data. In particular, MinD has expertise in [[research#Concept Lattices|Concept Lattices]], [[research#Evolutionary Computation|Evolutionary Computation]], [[research#Multi-Agent Systems|Multi-Agent Systems]], [[research#Naïve Bayes|Naïve Bayes]], [[research#Random Forests|Random Forests]], [[research#Support Vector Machines|Support Vector Machines]], [[research#Boosting|Boosting]], [[research#Deep Learning|Deep Learning]], ...
Those methods are used to extract knowledge from [[wp>Big Data]] for:
  * Association rule learning
----
===== Support Vector Machines =====
In machine learning, [[wp>support vector machines]] (SVMs, also support vector networks[1]) are supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis. Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples to one category or the other, making it a non-probabilistic binary linear classifier.
In addition to performing linear classification, SVMs can efficiently perform non-linear classification using what is called the kernel trick, implicitly mapping their inputs into high-dimensional feature spaces.
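As an illustration of the kernel trick, the sketch below fits an RBF-kernel SVM on a toy two-class dataset that is not linearly separable; it assumes scikit-learn is available and is not tied to any MinD code.

<code python>
# Minimal sketch: non-linear SVM classification via the kernel trick
# (assumes scikit-learn; purely illustrative, not MinD code).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two interleaving half-moons: the classes are not linearly separable
# in the original input space.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The RBF kernel implicitly maps inputs into a high-dimensional feature
# space, so a linear separator there yields a non-linear boundary here.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
</code>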
----
===== Boosting =====
[[https://en.wikipedia.org/wiki/Boosting_(machine_learning)|Boosting]] is a machine learning ensemble meta-algorithm, primarily for reducing bias and also variance[1] in supervised learning, and a family of machine learning algorithms that convert weak learners into strong ones.[2] Boosting is based on the question posed by Kearns and Valiant (1988, 1989):[3][4] Can a set of weak learners create a single strong learner? A weak learner is defined as a classifier that is only slightly correlated with the true classification (it can label examples better than random guessing). In contrast, a strong learner is a classifier that is arbitrarily well correlated with the true classification.
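As a concrete sketch of this weak-to-strong idea, the example below compares a single decision stump with an AdaBoost ensemble of stumps on synthetic data; it assumes scikit-learn is available and is purely illustrative.

<code python>
# Minimal sketch: boosting weak learners (decision stumps) into a strong
# classifier with AdaBoost (assumes scikit-learn; purely illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# A depth-1 decision tree ("stump") is a weak learner: only slightly
# better than random guessing on this task.
stump = DecisionTreeClassifier(max_depth=1)
print("weak learner accuracy:", cross_val_score(stump, X, y, cv=5).mean())

# AdaBoost (whose default base learner is a depth-1 tree) re-weights the
# training examples each round so later stumps focus on the examples the
# earlier ones misclassified, combining them into a much stronger ensemble.
boosted = AdaBoostClassifier(n_estimators=200, random_state=0)
print("boosted accuracy:", cross_val_score(boosted, X, y, cv=5).mean())
</code>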
----
===== Deep Learning =====
----