Start Date: 02/23/2020
Course Type: Common Course
Course Link: https://www.coursera.org/learn/linear-regression-business-statistics
Regression analysis is perhaps the single most important business statistics tool used in industry. Regression is the engine behind a multitude of data analytics applications used for many forms of forecasting and prediction. This is the fourth course in the specialization "Business Statistics and Analysis". The course introduces you to the very important tool known as linear regression. You will learn to apply procedures such as dummy variable regression, transforming variables, and interaction effects. All of these are introduced and explained using easy-to-understand examples in Microsoft Excel. The focus of the course is on understanding and application, rather than detailed mathematical derivations.

Note: This course uses the 'Data Analysis' toolbox, which is standard with the Windows version of Microsoft Excel and with the 2016 or later Mac version of Excel. It is not standard with earlier versions of Excel for Mac.

WEEK 1
Module 1: Regression Analysis: An Introduction
This module introduces the linear regression model. We will build a regression model and estimate it using Excel, use the estimated model to infer relationships between variables, and use the model to make predictions. The module also introduces the notions of errors, residuals, and R-square in a regression model. Topics covered include:
• Introducing the linear regression
• Building a regression model and estimating it using Excel
• Making inferences using the estimated model
• Using the regression model to make predictions
• Errors, residuals, and R-square

WEEK 2
Module 2: Regression Analysis: Hypothesis Testing and Goodness of Fit
This module presents the different hypothesis tests you can carry out using the regression output. These tests are an important part of inference, and the module introduces them using Excel-based examples. P-values are introduced along with the goodness-of-fit measures R-square and adjusted R-square. Toward the end of the module we introduce dummy variable regression, which is used to incorporate categorical variables in a regression. Topics covered include:
• Hypothesis testing in a linear regression
• 'Goodness of fit' measures (R-square, adjusted R-square)
• Dummy variable regression (using categorical variables in a regression)

WEEK 3
Module 3: Regression Analysis: Dummy Variables, Multicollinearity
This module continues the application of dummy variable regression. You will learn to interpret regression output in the presence of categorical variables. Worked examples reinforce the concepts introduced. The module also explains what multicollinearity is and how to deal with it. Topics covered include:
• Dummy variable regression (using categorical variables in a regression)
• Interpretation of coefficients and p-values in the presence of dummy variables
• Multicollinearity in regression models

WEEK 4
Module 4: Regression Analysis: Various Extensions
This module extends your understanding of linear regression, introducing techniques such as mean-centering of variables and building confidence bounds for predictions using the regression model. A powerful extension known as 'interaction variables' is introduced and explained using examples. We also study the transformation of variables in a regression, and in that context introduce the log-log and semi-log regression models. Topics covered include:
• Mean-centering of variables in a regression model
• Building confidence bounds for predictions using a regression model
• Interaction effects in a regression
• Transformation of variables
• The log-log and semi-log regression models
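The Module 1 workflow (estimate a model, read off R-square, make a prediction) is done in Excel in the course, but the same calculation can be sketched in a few lines of code. A minimal pure-Python version of simple linear regression, using made-up illustrative data:

```python
# Minimal sketch of the Module 1 workflow for simple linear regression:
# estimate the model by ordinary least squares, compute R-square, and
# make a prediction. The data below are made up for illustration.

def fit_simple_regression(x, y):
    """Return (intercept, slope) from ordinary least squares."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

def r_square(x, y, intercept, slope):
    """R-square: the share of variation in y explained by the model."""
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical data: e.g. advertising spend (x) against sales (y).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = fit_simple_regression(x, y)
print("intercept:", b0, "slope:", b1)
print("R-square:", r_square(x, y, b0, b1))
print("prediction at x=6:", b0 + b1 * 6.0)
```

This mirrors what Excel's 'Data Analysis' regression output reports: the coefficient estimates, and R-square as one minus the ratio of residual to total sum of squares.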
Article | Excerpt |
---|---|
Business statistics | A typical business statistics course is intended for business majors, and covers statistical study, descriptive statistics (collection, description, analysis, and summary of data), probability, and the binomial and normal distributions, test of hypotheses and confidence intervals, linear regression, and correlation. |
Linear regression | In statistics, linear regression is an approach for modeling the relationship between a scalar dependent variable "y" and one or more explanatory variables (or independent variables) denoted "X". The case of one explanatory variable is called "simple linear regression". For more than one explanatory variable, the process is called "multiple linear regression". (This term is distinct from "multivariate linear regression", where multiple correlated dependent variables are predicted, rather than a single scalar variable.) |
Bayesian multivariate linear regression | In statistics, Bayesian multivariate linear regression is a Bayesian approach to multivariate linear regression, i.e., linear regression where the predicted outcome is a vector of correlated random variables rather than a single scalar. |
Bayesian linear regression | In statistics, Bayesian linear regression is an approach to linear regression in which the statistical analysis is undertaken within the context of Bayesian inference. When the regression model has errors that have a normal distribution, and if a particular form of prior distribution is assumed, explicit results are available for the posterior probability distributions of the model's parameters. |
Linear regression | The very simplest case of a single scalar predictor variable "x" and a single scalar response variable "y" is known as "simple linear regression". The extension to multiple and/or vector-valued predictor variables (denoted with a capital "X") is known as "multiple linear regression", also known as "multivariable linear regression". Nearly all real-world regression models involve multiple predictors, and basic descriptions of linear regression are often phrased in terms of the multiple regression model. Note, however, that in these cases the response variable "y" is still a scalar. Another term "multivariate linear regression" refers to cases where "y" is a vector, i.e., the same as "general linear regression". |
Linear regression | Some of the more common estimation techniques for linear regression are summarized below. |
Linear regression | In statistics and numerical analysis, the problem of numerical methods for linear least squares is an important one because linear regression models are one of the most important types of model, both as formal statistical models and for exploration of data sets. The majority of statistical computer packages contain facilities for regression analysis that make use of linear least squares computations. Hence it is appropriate that considerable effort has been devoted to the task of ensuring that these computations are undertaken efficiently and with due regard to numerical precision. |
Nonlinear regression | Nonlinear regression statistics are computed and used as in linear regression statistics, but with the Jacobian matrix J used in place of the design matrix X in the formulas. The linear approximation introduces bias into the statistics, so more caution than usual is required in interpreting statistics derived from a nonlinear model. |
Linear regression | The capital asset pricing model uses linear regression as well as the concept of beta for analyzing and quantifying the systematic risk of an investment. This comes directly from the beta coefficient of the linear regression model that relates the return on the investment to the return on all risky assets. |
Simple linear regression | In statistics, simple linear regression is a linear regression model with a single explanatory variable. That is, it concerns two-dimensional sample points with one independent variable and one dependent variable (conventionally, the "x" and "y" coordinates in a Cartesian coordinate system) and finds a linear function (a non-vertical straight line) that, as accurately as possible, predicts the dependent variable values as a function of the independent variable. |
Linear regression | In linear regression, the relationships are modeled using linear predictor functions whose unknown model parameters are estimated from the data. Such models are called "linear models". Most commonly, the conditional mean of "y" given the value of "X" is assumed to be an affine function of "X"; less commonly, the median or some other quantile of the conditional distribution of "y" given "X" is expressed as a linear function of "X". Like all forms of regression analysis, linear regression focuses on the conditional probability distribution of "y" given "X", rather than on the joint probability distribution of "y" and "X", which is the domain of multivariate analysis. |
Linear regression | The following are the major assumptions made by standard linear regression models with standard estimation techniques (e.g. ordinary least squares): |
Linear regression | Linear regression has many practical uses. Most applications fall into one of the following two broad categories: |
Bayesian linear regression | A similar analysis can be performed for the general case of the multivariate regression and part of this provides for Bayesian estimation of covariance matrices: see Bayesian multivariate linear regression. |
Poisson regression | In statistics, Poisson regression is a generalized linear model form of regression analysis used to model count data and contingency tables. Poisson regression assumes the response variable "Y" has a Poisson distribution, and assumes the logarithm of its expected value can be modeled by a linear combination of unknown parameters. A Poisson regression model is sometimes known as a log-linear model, especially when used to model contingency tables. |
Linear regression | Linear regression models are often fitted using the least squares approach, but they may also be fitted in other ways, such as by minimizing the "lack of fit" in some other norm (as with least absolute deviations regression), or by minimizing a penalized version of the least squares loss function as in ridge regression ("L2"-norm penalty) and lasso ("L1"-norm penalty). Conversely, the least squares approach can be used to fit models that are not linear models. Thus, although the terms "least squares" and "linear model" are closely linked, they are not synonymous. |
Least-angle regression | In statistics, least-angle regression (LARS) is an algorithm for fitting linear regression models to high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani. |
Contrast (statistics) | In statistics, particularly in analysis of variance and linear regression, a contrast is a linear combination of variables (parameters or statistics) whose coefficients add up to zero, allowing comparison of different treatments. |
Linear regression | Numerous extensions of linear regression have been developed, which allow some or all of the assumptions underlying the basic model to be relaxed. |
Logistic regression | The basic setup of logistic regression is the same as for standard linear regression. |
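Several entries above describe multiple linear regression and note that the least squares computation is done numerically by statistical packages. A small sketch of that computation with NumPy, using a hypothetical two-predictor data set; `np.linalg.lstsq` uses an SVD-based solver rather than forming the normal equations directly, which matches the numerical-precision caution in the "numerical methods" entry:

```python
import numpy as np

# Sketch: multiple linear regression by least squares.
# The design matrix X has an intercept column plus two predictors;
# the data are made up, generated by hand from y = 1 + 2*x1 + 0.5*x2
# with no noise, so the fit recovers those coefficients exactly.
X = np.array([
    [1.0, 1.0, 2.0],
    [1.0, 2.0, 1.0],
    [1.0, 3.0, 4.0],
    [1.0, 4.0, 3.0],
    [1.0, 5.0, 6.0],
])
y = np.array([4.0, 5.5, 9.0, 10.5, 14.0])

# Solve min ||X beta - y||_2 via an SVD-based least squares routine.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # ≈ [1.0, 2.0, 0.5]
```

With noisy real-world data the fit would not be exact, but the same call returns the ordinary least squares estimates.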
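The last entry notes that logistic regression keeps the basic setup of linear regression: a linear predictor b0 + b1*x, here passed through the logistic function to model a probability. A minimal sketch fitted by plain gradient ascent on the log-likelihood, with made-up data (production code would use an optimized solver):

```python
import math

# Sketch: logistic regression shares linear regression's setup
# (a linear predictor b0 + b1*x) but maps it through the logistic
# function so the output is a probability in (0, 1).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, y, lr=0.1, steps=5000):
    """Fit b0, b1 by gradient ascent on the average log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(x)
    for _ in range(steps):
        g0 = sum(yi - sigmoid(b0 + b1 * xi) for xi, yi in zip(x, y)) / n
        g1 = sum((yi - sigmoid(b0 + b1 * xi)) * xi for xi, yi in zip(x, y)) / n
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

# Hypothetical data: larger x makes the outcome y = 1 more likely.
x = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
y = [0,   0,   0,   0,   1,   1,   1,   1]
b0, b1 = fit_logistic(x, y)
print("P(y=1 | x=0.5) =", sigmoid(b0 + b1 * 0.5))
print("P(y=1 | x=4.0) =", sigmoid(b0 + b1 * 4.0))
```

The contrast with linear regression is only in the link: the linear predictor is the same, but the fitted value is a probability rather than a conditional mean on the original scale.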