Learning

Feature Extraction
Takes an input and extracts a set of feature/value pairs.  
  • Ex. takes abc@gmail.com and extracts features from it (contains_@ = 1, endsWith_.org = 0, length>10 = 1, etc.)  
Features are typically defined via a feature template (e.g., an "endsWith_x" feature for every suffix x); see the sketch below.
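A minimal sketch of such an extractor for the email example (the feature names and the extract_features helper are illustrative, not from the notes):

```python
def extract_features(x: str) -> dict:
    """Map an input string to a sparse feature vector: feature name -> value."""
    return {
        "contains_@": 1 if "@" in x else 0,
        "endsWith_.org": 1 if x.endswith(".org") else 0,
        "endsWith_.com": 1 if x.endswith(".com") else 0,  # one instance of an "endsWith_x" template
        "length>10": 1 if len(x) > 10 else 0,
    }

print(extract_features("abc@gmail.com"))
# {'contains_@': 1, 'endsWith_.org': 0, 'endsWith_.com': 1, 'length>10': 1}
```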
Feature Engineering
Intuition: extract lots of features that might be relevant. 
Ex. natural language: words, parts-of-speech, capitalization pattern, etc.
Feature Notation
Notation is simply a vector (the feature vector). 
A point in a high-dimensional space
Parameter Tuning
The parameters (weights) are tuned on the training data to improve predictions.
Linear predictors
Have a weight vector w that you dot-product with the given feature vector φ(x) to get a "score" w · φ(x) (a weighted combination of the features). The margin is how correct the prediction was: the score multiplied by the label y (+1/-1), i.e., (w · φ(x)) y. 
  • A binary linear classifier essentially looks at the sign of the score and decides YAY (+1) or NAY (-1); see the sketch below. 
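A small sketch of the score, margin, and sign-based prediction, assuming a sparse dict representation of w and φ(x) (the representation is an assumption for illustration):

```python
def score(w: dict, phi: dict) -> float:
    """w · φ(x): a weighted combination of the features."""
    return sum(w.get(f, 0.0) * v for f, v in phi.items())

def predict(w: dict, phi: dict) -> int:
    """Binary linear classifier: the sign of the score decides +1 or -1."""
    return 1 if score(w, phi) >= 0 else -1

def margin(w: dict, phi: dict, y: int) -> float:
    """(w · φ(x)) * y: positive when the prediction agrees with the label y in {+1, -1}."""
    return score(w, phi) * y
```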
Algorithm 
Perceptron Algorithm
For each training example, predict y' and, if the prediction was a mistake, update the parameters; otherwise do nothing ("if it ain't broke, don't fix it")
  • The update on a mistake: w -> w + φ(x) y (see the sketch below)
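A minimal, self-contained sketch of the perceptron loop described above (the (phi, y) training-data format and num_epochs are assumptions):

```python
def perceptron(train, num_epochs=10):
    """train: list of (phi, y) pairs, phi a dict feature vector, y in {+1, -1}."""
    w = {}
    for _ in range(num_epochs):
        for phi, y in train:
            s = sum(w.get(f, 0.0) * v for f, v in phi.items())  # score w · φ(x)
            y_pred = 1 if s >= 0 else -1
            if y_pred != y:                      # made a mistake
                for f, v in phi.items():         # update: w -> w + φ(x) y
                    w[f] = w.get(f, 0.0) + v * y
            # else: "if it ain't broke, don't fix it"
    return w
```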
Least squares regression
For homework? The w that minimizes the training loss is the mean of y (with a single constant feature φ(x) = 1, the squared-loss minimizer is the mean of the targets); see the check below.
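A quick numerical check of that claim, assuming a single constant feature φ(x) = 1 so the prediction is just w (the target values below are made up):

```python
ys = [2.0, 4.0, 9.0]

def training_loss(w):
    """Average squared loss of the constant predictor w over the targets."""
    return sum((w - y) ** 2 for y in ys) / len(ys)

mean_y = sum(ys) / len(ys)           # 5.0
print(training_loss(mean_y))         # ~8.67
print(training_loss(mean_y - 0.1))   # slightly larger
print(training_loss(mean_y + 0.1))   # slightly larger
```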