Feature Selection in Machine Learning | Variable Selection | Dimension Reduction

Feature selection is an important step in the machine learning model-building process. The performance of a model depends on the following:

Choice of algorithm
Feature Selection
Feature Creation
Model Selection

Feature selection is therefore one important driver of good model performance. Feature selection methods are primarily of three types (a short sketch of each follows the list below):

Filter Methods
Wrapper Methods
Embedded Methods
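As a rough illustration of these three families (not taken from the video), the Python sketch below uses scikit-learn on a synthetic dataset; the library, the synthetic data, and the choice of 5 selected features are assumptions made purely for demonstration.

```python
# A minimal sketch of the three families of feature selection methods,
# assuming scikit-learn and a synthetic classification problem
# (make_classification) purely for illustration.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)

# Filter method: score each feature independently of any model (ANOVA F-test).
filter_sel = SelectKBest(score_func=f_classif, k=5).fit(X, y)

# Wrapper method: recursive feature elimination repeatedly fits a model
# and discards the weakest features.
wrapper_sel = RFE(LogisticRegression(max_iter=1000),
                  n_features_to_select=5).fit(X, y)

# Embedded method: an L1-regularized model zeroes out features while it is fitted.
embedded_sel = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1)).fit(X, y)

for name, sel in [("filter", filter_sel), ("wrapper", wrapper_sel),
                  ("embedded", embedded_sel)]:
    print(name, sel.get_support(indices=True))
```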

You will learn a number of techniques, such as variable selection through the correlation matrix, best subset selection, forward stepwise selection, backward stepwise selection, and hybrid methods. You will also learn regularization (shrinkage) methods such as Lasso and Ridge regression, which can also be used for variable selection.
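The sketch below is one hedged way to put those techniques together in Python with pandas and scikit-learn; the diabetes dataset, the 0.9 correlation threshold, and the target of 5 features are illustrative assumptions, not details from the video.

```python
# A minimal sketch of a correlation-matrix filter, forward/backward stepwise
# selection, and Lasso/Ridge shrinkage; dataset and thresholds are illustrative.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LassoCV, LinearRegression, RidgeCV

data = load_diabetes(as_frame=True)
X, y = data.data, data.target

# Correlation-matrix filter: drop one feature from any pair correlated above 0.9.
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
X_filtered = X.drop(columns=to_drop)

# Forward and backward stepwise selection wrapped around a linear model.
forward = SequentialFeatureSelector(LinearRegression(), n_features_to_select=5,
                                    direction="forward").fit(X_filtered, y)
backward = SequentialFeatureSelector(LinearRegression(), n_features_to_select=5,
                                     direction="backward").fit(X_filtered, y)

# Shrinkage: Lasso can zero out coefficients (implicit selection),
# while Ridge only shrinks them toward zero and keeps every feature.
lasso = LassoCV(cv=5).fit(X_filtered, y)
ridge = RidgeCV().fit(X_filtered, y)

print("forward:", forward.get_feature_names_out())
print("backward:", backward.get_feature_names_out())
print("lasso nonzero:", X_filtered.columns[lasso.coef_ != 0].tolist())
print("ridge nonzero count:", int((ridge.coef_ != 0).sum()))
```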

Finally, you will learn the difference between variable selection and dimension reduction.
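In short, variable selection keeps a subset of the original, interpretable columns, while dimension reduction (for example, PCA) constructs new features as combinations of all columns. The brief sketch below assumes scikit-learn and the iris dataset purely for illustration.

```python
# A minimal sketch contrasting variable selection with dimension reduction;
# the iris dataset and k=2 are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)

# Variable selection: keep 2 of the original 4 columns, unchanged.
selected = SelectKBest(f_classif, k=2).fit_transform(X, y)

# Dimension reduction: build 2 new components from combinations of all 4 columns.
reduced = PCA(n_components=2).fit_transform(X)

print(selected.shape, reduced.shape)  # both (150, 2), but with different meanings
```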

Analytics Study Pack :

Analytics University on Twitter :

Analytics University on Facebook :

Logistic Regression in R:

Logistic Regression in SAS:

Logistic Regression Theory:

Time Series Theory :

Time Series ARIMA Model in R :

Survival Model :

Data Science Career :

Machine Learning :

Data Science Case Study :

Big Data & Hadoop & Spark:
