
Fitting a function

How to fit functions using linear models:

\[Y_i = \beta_0 + \beta_1 X_i + \sum_{k=1}^{d} (X_i - \xi_k)_+ \gamma_k + \epsilon_i\]

Simulated example (source: https://github.com/DataScienceSpecialization/courses). Separate the n values into k+1 spans, k being the number of knots. Create a basis: a …
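A minimal R sketch of this idea, with simulated data and evenly spaced knots (both are my assumptions; the excerpt's own example is cut off):

```r
# Simulated data: a noisy nonlinear signal
set.seed(42)
n <- 500
x <- seq(0, 4 * pi, length.out = n)
y <- sin(x) + rnorm(n, sd = 0.3)

# d evenly spaced knots xi_k over the range of x
d <- 10
knots <- seq(min(x), max(x), length.out = d)

# Basis of truncated-line terms (x - xi_k)_+ , one column per knot
basis <- sapply(knots, function(xi) pmax(0, x - xi))

# An ordinary linear model on x plus the basis fits the curve piecewise
fit <- lm(y ~ x + basis)
plot(x, y, col = "grey", pch = 16)
lines(x, fitted(fit), col = "red", lwd = 2)
```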

Logistic Regression

Logistic regression is a generalized linear model where the outcome is a categorical variable. Logistic regression can be binomial (the outcome variable is binary), ordinal (if the categories are ordered) or multinomial (with more than two unordered categories). Binary Generalized Linear Models. Binary …
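A minimal binomial fit in R, using the built-in mtcars data as a stand-in (the post's own example is not visible in this excerpt):

```r
# Model the probability of a manual transmission (am = 1) from car weight,
# using the canonical logit link
fit <- glm(am ~ wt, data = mtcars, family = binomial)
summary(fit)

# Predicted probability for a hypothetical car weighing 3,000 lbs (wt = 3)
predict(fit, newdata = data.frame(wt = 3), type = "response")
```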

Poisson regression

In statistics, Poisson regression is a generalized linear model form of regression analysis used to model count data and contingency tables. Poisson regression assumes the response variable Y has a Poisson distribution, and assumes the logarithm of its expected value can be modeled by a linear combination of unknown parameters …
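A short Poisson-regression sketch in R on simulated counts (the data here is my own stand-in, not the post's):

```r
# Simulated counts whose rate grows with x: lambda = exp(0.5 + 1.2 x)
set.seed(1)
x <- runif(200, 0, 2)
y <- rpois(200, lambda = exp(0.5 + 1.2 * x))

# Poisson regression with the canonical log link
fit <- glm(y ~ x, family = poisson)
summary(fit)

# Coefficients live on the log scale; exponentiate to read them as rate ratios
exp(coef(fit))
```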

Generalized Linear Models (Intro)

Generalized linear models include linear models, but they go beyond them to handle many of the issues with linear models. Limitations of linear models: additive response models don’t make much sense if the response is discrete (for instance binary data); additive …
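One way to see the containment, as a sketch (not from the post): a GLM with Gaussian errors and identity link reproduces an ordinary linear model exactly.

```r
# An ordinary linear model ...
fit_lm  <- lm(mpg ~ wt, data = mtcars)

# ... is the special case of a GLM with Gaussian family and identity link
fit_glm <- glm(mpg ~ wt, data = mtcars, family = gaussian(link = "identity"))

# Identical coefficient estimates
cbind(lm = coef(fit_lm), glm = coef(fit_glm))
```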

Inference for Multiple Linear Regression

# Load the data
cognitive <- read.csv("http://bit.ly/dasi_cognitive")

Let us start with the full model, thus including all variables:

# Fit the full model and show the summary
cog_full <- lm(kid_score ~ mom_hs + mom_iq + mom_work + …
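A hedged completion of the truncated call above, assuming the remaining predictor in the dasi_cognitive data is mom_age (an assumption based on the course this dataset comes from):

```r
# Load the data
cognitive <- read.csv("http://bit.ly/dasi_cognitive")

# Fit the full model and show the summary
# (mom_age is assumed to be the variable cut off in the excerpt)
cog_full <- lm(kid_score ~ mom_hs + mom_iq + mom_work + mom_age,
               data = cognitive)
summary(cog_full)

# 95% confidence intervals for each slope
confint(cog_full)
```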

Multiple Linear Regression

Conditions for multiple linear regression:
- linear relationship between each (numerical) explanatory variable and the response – checked using scatterplots of y vs. each x, and plots of residuals vs. each x
- nearly normal residuals with mean 0 – checked using a …
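These checks sketched in R (mtcars as a stand-in dataset; the post's data isn't shown in this excerpt):

```r
fit <- lm(mpg ~ wt + hp, data = mtcars)

# Linearity: plot residuals against each explanatory variable
plot(mtcars$wt, resid(fit)); abline(h = 0, lty = 2)
plot(mtcars$hp, resid(fit)); abline(h = 0, lty = 2)

# Nearly normal residuals with mean 0: histogram and normal probability plot
hist(resid(fit))
qqnorm(resid(fit)); qqline(resid(fit))
```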

Model Selection

Scott Zeger: “a model is a lens through which to look at your data”. George Box: “All models are wrong, some are useful.” Collinearity and parsimony. Collinearity: a high correlation between two independent variables, such that the two variables contribute redundant information to the model …
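A quick collinearity check in R (mtcars is my stand-in; the post's example is cut off): displacement and weight are highly correlated, so adding both buys little.

```r
# Pairwise correlations among candidate predictors: disp and wt are ~0.89
cor(mtcars[, c("disp", "wt", "hp")])

# Adjusted R-squared improves only modestly once the redundant predictor enters
summary(lm(mpg ~ wt, data = mtcars))$adj.r.squared
summary(lm(mpg ~ wt + disp, data = mtcars))$adj.r.squared
```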

Linear Regression Intro

The basics of linear regression. Linear regression is one form of regression among others, and probably the most intuitive and easiest one. Regression is used to 1) predict values for which there are no observed values, and …
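A minimal example of the prediction use in R (the built-in cars data is my illustration, not the post's):

```r
# Simple linear regression of stopping distance on speed
fit <- lm(dist ~ speed, data = cars)
coef(fit)

# Predict distances at speeds that were never observed
predict(fit, newdata = data.frame(speed = c(12.5, 23)))
```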