Scikit Learn Linear SVC Example Machine Learning Tutorial with Python p. 11

In this sklearn with Python for machine learning tutorial, we cover how to do a basic linear SVC example with scikit-learn.

sample code:
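The sample code itself is not reproduced on this page. For readers following along, here is a minimal sketch of a basic linear SVC fit with scikit-learn; the data points and labels below are invented for illustration.

```python
# A minimal linear SVC sketch; the data points and labels are
# invented for illustration.
import numpy as np
from sklearn import svm

X = np.array([[1, 2], [5, 8], [1.5, 1.8], [8, 8], [1, 0.6], [9, 11]])
y = np.array([0, 1, 0, 1, 0, 1])

clf = svm.SVC(kernel='linear', C=1.0)  # linear-kernel support vector classifier
clf.fit(X, y)

print(clf.predict([[0.58, 0.76]]))    # close to the class-0 cluster -> [0]
print(clf.predict([[10.58, 10.76]]))  # close to the class-1 cluster -> [1]
```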

Bitcoin donations: 1GV7srgR4NJx4vrk7avCmmVQQrqmv87ty6


  1. Soya Bjorlie says:

    The capital X (I think) is used by convention to indicate an array or
    matrix. When it's a simple linear model, y = mx + b, for the vector x. In
    the case of multiple predictors, stats folks use y = X*(beta) + (epsilon)
    for a data matrix X.
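The convention above can be seen directly in code. A small sketch (with invented numbers) contrasting the single-predictor form y = mx + b with the data-matrix form y = X*beta:

```python
# Lower-case x: a single feature vector; capital X: a data matrix
# of shape (n_samples, n_features). Numbers are invented.
import numpy as np

# One predictor: y = m*x + b, with m = 2 and b = 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

# Matrix form: y = X @ beta, where X stacks the predictor and an
# intercept column; least squares recovers beta = [m, b]
X = np.column_stack([x, np.ones_like(x)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # ~ [2.0, 1.0]
```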

  2. Alexandre Hubert says:


    Anyway, great stuff. Thank you for your time and dedication.

  3. Lucas Pelegrino says:

    Hey +sentdex, thanks for the awesome video.

    I haven’t watched your whole series yet (I’m on part 15), but this video
    was the one that made me closer to understanding how to solve my particular
    problem, but I’m not quite there yet. Would you mind giving me some hints?

    If so, let me explain my situation:

    I have an e-commerce catalog with product titles (e.g. ‘iPhone 6’, ‘Smart
    TV DR-7874 with Wi-fi’) and their respective categories (e.g. ‘Smartphones’
    and ‘TV’). I need to predict which category an uncategorized product fits
    best. Here is how I’m imagining I would predict the category “c” of a new
    product “p”.

    The goal is doing something like:
    predict(‘Nokia Lumia 630/635 Anti-glare Screen Protector’) # > Smartphones

    These are the steps I would take:

    1. Select my candidates: my dataset would be composed only of products
    that share at least one word with “p” (maybe using some tf-idf to filter
    useless words).
    2. Create a list of all words (bag of words) from the selected candidates.
    3. Each word would be a feature: 0 would mean that the feature was missing
    in a particular product, and 1 that it was present. I’m kind of lost on
    this step, not sure if that makes sense. (Maybe an inverted word-product
    index might help with these lookups?)
    4. I’m guessing X would be composed of a list of lists, where each inner
    list corresponds to a particular set of features.
    5. Label the data: y would be the categories of the items (this comes
    from my database).

    Sorry for the bad English and the long post. I don’t know if some of the
    steps are going to be necessary, and after step 3 I’m just trying to
    imagine how to get this done; I’m not sure if the approach is correct.
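The steps described above map fairly directly onto scikit-learn. A hedged sketch, with hypothetical titles and categories, where `CountVectorizer(binary=True)` takes care of steps 2–4 (the word list, the 0/1 presence features, and the list-of-lists X):

```python
# Sketch of the pipeline from the comment above: product titles -> binary
# bag of words -> linear SVC. Titles, categories, and the query are
# hypothetical stand-ins.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

titles = [
    'iPhone 6 smartphone', 'Nokia Lumia 630 smartphone', 'Galaxy smartphone case',
    'Smart TV DR-7874 with Wi-fi', 'LED TV 42 inch', 'Smart TV remote',
]
categories = ['Smartphones', 'Smartphones', 'Smartphones', 'TV', 'TV', 'TV']

vectorizer = CountVectorizer(binary=True)   # 1 = word present, 0 = absent
X = vectorizer.fit_transform(titles)        # one row of 0/1 features per product
clf = LinearSVC()
clf.fit(X, categories)

query = vectorizer.transform(['Nokia Lumia 630/635 Anti-glare Screen Protector'])
print(clf.predict(query))
```

The prediction is driven by the overlapping words (“nokia”, “lumia”, “630”), which only appear in Smartphones titles here; step 1 (candidate filtering) could simply be replaced by training on the full catalog.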

  4. RAC C (thiirane) says:

    Wow… This was very helpful. Like you, I am self-taught in Python and am
    only now learning machine learning. I would like to leverage your
    experience in learning scikit-learn, especially for images. I have been
    looking for information on importing Histogram of Oriented Gradients (HOG)
    features into scikit-learn. As you say, the hardest part of machine
    learning is formatting the data to put into svm.SVC. I have a web camera
    with which I have collected hundreds of images of sedans, vans, trucks,
    and SUVs that pass by my house. I have cropped these images and classified
    them as such. I think I need to use this command to get the HOG features
    into a format that can be used by scikit-learn:
    >> fd, hog_image = hog(greyscale, orientations=8, pixels_per_cell=(16, 16),
    cells_per_block=(1, 1), visualise=True)
    where the X values are fd (the HOG feature values for each image). I am
    not sure how to compile all of the images into a single np.array. My
    labels (y values), I guess, would be [1,0,0,0] for sedan, [0,1,0,0] for
    van, and so on. Would you agree? It would be great if you could do a video
    on this topic. What you have done here appears to be applicable in a
    simple way. Anyway, I would be grateful for your thoughts on this.
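One way the stacking question above could be handled (a sketch with random arrays standing in for the cropped images): each `fd` from `hog()` is already a flat 1-D feature vector, so `np.vstack` builds the X matrix. Note that scikit-learn's SVC expects a 1-D label array such as 0/1/2/3, not one-hot vectors like [1,0,0,0].

```python
# Stacking per-image HOG feature vectors into the (n_samples, n_features)
# array scikit-learn expects. The random 64x64 arrays below stand in for
# the cropped greyscale images.
import numpy as np
from skimage.feature import hog
from sklearn import svm

rng = np.random.default_rng(0)
images = [rng.random((64, 64)) for _ in range(4)]

# 1-D integer labels: 0=sedan, 1=van, 2=truck, 3=SUV (not one-hot)
y = np.array([0, 1, 2, 3])

# hog() with visualize left off returns just fd, a flat feature vector
feature_vectors = [
    hog(img, orientations=8, pixels_per_cell=(16, 16), cells_per_block=(1, 1))
    for img in images
]
X = np.vstack(feature_vectors)  # one row per image

clf = svm.SVC(kernel='linear')
clf.fit(X, y)
print(X.shape)  # (4, n_features)
```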

  5. DeadWalker44 says:

    Very cool and useful video series so far; I really enjoy it. I have a
    quick question: how do you do the multi-line commenting @12:47?
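Not the video author's answer, but for reference: Python has no dedicated multi-line comment syntax. The two common workarounds are per-line `#` comments (most editors toggle a whole selection with a shortcut, e.g. Ctrl+/) and unassigned triple-quoted strings:

```python
def demo():
    """
    A triple-quoted string right after a 'def' line becomes the docstring;
    anywhere else, an unassigned string literal is evaluated and discarded,
    so it behaves like a block comment.
    """
    # Option 1: ordinary line comments -- editors can toggle
    # several selected lines at once
    'this bare string is evaluated and thrown away'
    return 42

result = demo()
print(result)  # prints: 42
```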

  6. unique raj says:

    Very nice video, +sentdex. Have you made videos like this for other
    supervised algorithms like decision trees or random forests? Here you have
    not talked about what kind of real features x and y can be, and isn’t it
    important to show the support vectors as well when making the
    classification?

  7. Osama Abbas says:

    Do you have any tutorial I can watch that uses SVM in sklearn with OpenCV
    for classification? Or can you recommend anything else?

Comments are closed.