What is Logistic Regression?

Logistic Regression is a classification algorithm. It is used to predict a binary outcome (1/0, Yes/No, True/False) given a set of independent variables. To represent a binary/categorical outcome, we use dummy variables. You can also think of logistic regression as a special case of the generalized linear model where the outcome variable is categorical and the log of odds is used as the dependent variable. In simple words, it predicts the probability of occurrence of an event by fitting data to a logit function.
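As a quick illustration of the logit link, base R's `qlogis()` and `plogis()` convert between a probability and its log-odds (a minimal sketch; the probability 0.8 is an arbitrary example, not from the data below):

```r
# logit and inverse logit in base R
p <- 0.8                  # probability of the event (arbitrary example)
qlogis(p)                 # log-odds: log(0.8 / 0.2) = log(4), about 1.386
plogis(qlogis(p))         # inverse logit recovers the original 0.8
```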

In the previous tutorial we derived the equation of the generalized linear model (click here).

In this tutorial we will learn how to build a GLM model using R.

So let’s start building a logistic model.

We will generate our own data in the R console as shown below:

`> Fr <- c(68,42,42,30, 37,52,24,43, 66,50,33,23, 47,55,23,47, 63,53,29,27, 57,49,19,29)`

`> Temp <- gl(2, 2, 24, labels = c("Low", "High"))`

`> Comfort <- gl(3, 8, 24, labels = c("Hard","Medium","Soft"))`

`> M.user <- gl(2, 4, 24, labels = c("N", "Y"))`

`> Brand <- gl(2, 1, 24, labels = c("X", "M"))`

**The function gl() ('generate levels') is useful when you want to encode long vectors of factor levels. Its three main arguments are `gl(n, k, length)`: `n` is the number of levels, `k` is how many times each level is repeated consecutively, and `length` is the total length of the result. So `gl(2, 2, 24)` produces 2 levels, each repeated 2 at a time, recycled up to a total length of 24.**
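To see what `gl()` actually produces, here is a short sketch using the same arguments as the `Temp` vector above:

```r
# gl(n, k, length): n levels, each repeated k at a time, recycled to 'length'
Temp <- gl(2, 2, 24, labels = c("Low", "High"))
head(Temp, 8)   # Low Low High High Low Low High High
table(Temp)     # 12 observations at each level
```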

Now combine the vectors above into a single data frame:

`> deter <- data.frame(Fr, Temp, Comfort, M.user, Brand)`

Now inspect the `deter` variable:

`> deter`

Now we have our complete dataset.

`> deter.model <- glm(Fr ~ M.user*Temp*Comfort + Brand, family = poisson, data = deter)`

`>summary(deter.model)`

At the bottom of the summary output, note the line `Number of Fisher Scoring iterations: 4`; we will return to this below.

`deter.model1<-glm(terms(Fr ~ M.user*Temp*Comfort+Brand*M.user*Temp,keep.order = TRUE), family = poisson, data = deter)`

`> summary(deter.model1)`

Above we can see two deviances, **Null and Residual**. Here the Null deviance reads as 118.627 on 23 degrees of freedom and the Residual deviance as 5.656 on 8 degrees of freedom. **Deviance is a measure of the goodness of fit of a model; higher numbers indicate a worse fit.**

The null deviance shows how well the response variable is predicted by a model that includes only the intercept (grand mean), whereas the residual deviance shows how well it is predicted once the independent variables are included.

Above, you can see that the addition of 15 (23 − 8 = 15) independent variables decreased the deviance from 118.627 to 5.656, a significant reduction. The Residual deviance is lower than the Null deviance by 112.971, at a cost of fifteen degrees of freedom.
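Whether this drop in deviance is significant can be checked with a chi-squared test, since the deviance reduction is approximately chi-squared distributed with degrees of freedom equal to the number of parameters added (a sketch using the figures reported above):

```r
# deviance drop: 118.627 - 5.656 = 112.971 on 23 - 8 = 15 degrees of freedom
pchisq(118.627 - 5.656, df = 23 - 8, lower.tail = FALSE)
# the tiny p-value says the added predictors improve the fit significantly
```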

**Degrees of freedom**: the degrees of freedom tell you how many independent pieces of information are left to estimate variability, i.e. the number of observations minus the number of parameters estimated.

If your Null deviance is really small, it means that the null (intercept-only) model already explains the data pretty well. The same reading applies to the Residual deviance.
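One way to see the Null deviance directly is to fit the intercept-only model yourself. A self-contained sketch using the same `Fr` counts as above:

```r
# intercept-only (null) model reproduces the Null deviance from summary()
Fr <- c(68,42,42,30, 37,52,24,43, 66,50,33,23,
        47,55,23,47, 63,53,29,27, 57,49,19,29)
null.model <- glm(Fr ~ 1, family = poisson)
deviance(null.model)      # about 118.63, the Null deviance reported earlier
df.residual(null.model)   # 23 degrees of freedom
```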

To check the correlations between the coefficient estimates of `deter.model1`, pass `correlation = TRUE` to `summary()`:

`summary(deter.model1, correlation = TRUE, symbolic.cor = TRUE)`

**Fisher Scoring**

What about the Fisher scoring algorithm? Fisher’s scoring algorithm is a derivative of Newton’s method for solving maximum likelihood problems numerically.

For `deter.model` and `deter.model1` we see that Fisher's scoring algorithm needed four iterations to perform the fit.

This doesn’t really tell you a lot that you need to know, other than the fact that the model did indeed converge, and had no trouble doing it.
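If you do want to watch the iterations happen, `glm()` accepts a `trace` option through `glm.control()`. A sketch on toy Poisson data (the `y` and `x` values here are made up for illustration):

```r
# watch Fisher scoring (IWLS) converge via the trace option
y <- c(2, 3, 6, 7, 8, 9, 10, 12, 15)
x <- 1:9
fit <- glm(y ~ x, family = poisson,
           control = glm.control(trace = TRUE))
# each iteration prints a "Deviance = ..." line as the fit converges
fit$iter   # the count reported as "Number of Fisher Scoring iterations"
```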

**Information Criteria**

The **Akaike Information Criterion (AIC)** provides a method for assessing the quality of your model through comparison of related models. It is based on the deviance, but penalizes you for making the model more complicated. Much like adjusted R-squared, its intent is to prevent you from including irrelevant predictors.

However, unlike adjusted R-squared, the number itself is not meaningful. If you have two or more similar candidate models (where all of the variables of the simpler model occur in the more complex ones), then you should select the model that has the smallest AIC.

So it’s useful for comparing models, but isn’t interpretable on its own.
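As a sketch of how this comparison looks in practice, compare two nested Poisson models with `AIC()` (toy data, not the detergent set above; `z` is a deliberately irrelevant predictor):

```r
# compare nested Poisson models by AIC; lower is better
set.seed(1)
y <- rpois(20, lambda = 5)
x <- seq_len(20)
z <- rnorm(20)                        # an irrelevant predictor
m1 <- glm(y ~ x, family = poisson)
m2 <- glm(y ~ x + z, family = poisson)
AIC(m1, m2)
# the extra term adds 2 to the penalty, so z must cut the deviance
# by more than 2 to lower the AIC
```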

