Lecture 8 | Machine Learning (Stanford)

Lecture by Professor Andrew Ng for Machine Learning (CS 229) in the Stanford Computer Science department. Professor Ng continues his lecture on support vector machines, covering soft-margin optimization and kernels.

This course provides a broad introduction to machine learning and statistical pattern recognition. Topics include supervised learning, unsupervised learning, learning theory, reinforcement learning, and adaptive control. Recent applications of machine learning, such as robotic control, data mining, autonomous navigation, bioinformatics, speech recognition, and text and web data processing, are also discussed.
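As an illustrative aside (not from the lecture itself), the Gaussian kernel covered here corresponds to an infinite-dimensional feature map phi(x). A minimal Python sketch, assuming one-dimensional inputs and sigma = 1, numerically checks that a Taylor-series feature map truncated at 30 terms reproduces the kernel value:

```python
import math

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    return math.exp(-(x - z) ** 2 / (2 * sigma ** 2))

def phi(x, n_terms=30):
    """Truncated feature map for the 1-D Gaussian kernel with sigma = 1.

    The k-th component is exp(-x^2/2) * x^k / sqrt(k!); the full (infinite)
    map satisfies <phi(x), phi(z)> = exp(-(x - z)^2 / 2) via the Taylor
    series of exp(x z).
    """
    return [math.exp(-x ** 2 / 2) * x ** k / math.sqrt(math.factorial(k))
            for k in range(n_terms)]

x, z = 0.7, -0.3
approx = sum(a * b for a, b in zip(phi(x), phi(z)))  # truncated inner product
exact = gaussian_kernel(x, z)                        # kernel evaluated directly
```

For these inputs the truncated inner product matches the kernel to well below floating-point noise, which is why the kernel trick lets an SVM work with this feature space without ever forming phi(x) explicitly.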

Complete Playlist for the Course:

CS 229 Course Website:

Stanford University:

Stanford University Channel on YouTube:


  1. bloom987654322 says:

    Does anyone know how to find the phi(x) corresponding to the Gaussian
    kernel? This is needed if we want to calculate w after solving the soft
    margin optimization problem. 

  2. kmmbvnr says:

    @bloom987654322 Too late, but I just want to check my conjecture: is
    phi(x) for the Gaussian kernel just the Taylor series of the exponential?

  3. astrophilip says:

    I wish that Prof. Ng had done Gaussian processes before SVMs. Kernel
    methods make much more sense to me in the context of Gaussian processes
    than in the context of SVMs.

  4. shakesbeer00 says:

    It is Coordinate Ascent, since we are maximizing W(alpha). It is more
    commonly called “Coordinate Descent” for minimization problems. 
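
The coordinate ascent mentioned in the last comment can be illustrated on a toy problem (this sketch is not from the lecture; the objective below is an arbitrary strictly concave quadratic standing in for the SVM dual W(alpha)). Each step maximizes over one variable while holding the others fixed:

```python
def W(a1, a2):
    """Toy strictly concave objective standing in for the SVM dual W(alpha)."""
    return -a1 ** 2 - 2 * a2 ** 2 + a1 * a2 + 3 * a1 + 4 * a2

def coordinate_ascent(n_iters=50):
    """Alternate exact one-variable maximizations until convergence."""
    a1 = a2 = 0.0
    for _ in range(n_iters):
        a1 = (a2 + 3) / 2   # argmax over a1 with a2 fixed (set dW/da1 = 0)
        a2 = (a1 + 4) / 4   # argmax over a2 with a1 fixed (set dW/da2 = 0)
    return a1, a2

a1, a2 = coordinate_ascent()  # converges to the global maximizer (16/7, 11/7)
```

Note that in the actual SVM dual, the constraint sum_i alpha_i y_i = 0 means a single alpha cannot be changed in isolation, which is why the SMO algorithm updates two alphas at a time.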

Comments are closed.