Deep Learning Specialization on Coursera

Course Introduction

Regression Analysis is perhaps the single most important Business Statistics tool used in industry.

Course Tag

Log–Log Plot, Interaction (Statistics), Linear Regression, Regression Analysis

Related Wiki Topic

Article Example
Business statistics A typical business statistics course is intended for business majors, and covers statistical study, descriptive statistics (collection, description, analysis, and summary of data), probability, and the binomial and normal distributions, test of hypotheses and confidence intervals, linear regression, and correlation.
Linear regression In statistics, linear regression is an approach for modeling the relationship between a scalar dependent variable "y" and one or more explanatory variables (or independent variables) denoted "X". The case of one explanatory variable is called "simple linear regression". For more than one explanatory variable, the process is called "multiple linear regression". (This term is distinct from "multivariate linear regression", where multiple correlated dependent variables are predicted, rather than a single scalar variable.)
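A minimal sketch of the two cases described above, using NumPy's least-squares solver; the data and coefficients below are invented for illustration:

```python
# Simple vs. multiple linear regression with NumPy's least-squares solver.
import numpy as np

rng = np.random.default_rng(0)

# Simple linear regression: one explanatory variable x, scalar response y.
x = rng.uniform(0, 10, size=50)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=50)
A = np.column_stack([np.ones_like(x), x])           # design matrix [1, x]
coef_simple, *_ = np.linalg.lstsq(A, y, rcond=None)
print("simple:", coef_simple)                        # close to [2, 3]

# Multiple linear regression: several explanatory variables X, still scalar y.
X = rng.uniform(0, 10, size=(50, 3))
y2 = 1.0 + X @ np.array([0.5, -2.0, 4.0]) + rng.normal(scale=1.0, size=50)
A2 = np.column_stack([np.ones(len(X)), X])
coef_multiple, *_ = np.linalg.lstsq(A2, y2, rcond=None)
print("multiple:", coef_multiple)                    # close to [1, 0.5, -2, 4]
```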
Bayesian multivariate linear regression In statistics, Bayesian multivariate linear regression is a Bayesian approach to multivariate linear regression, i.e. linear regression where the predicted outcome is a vector of correlated random variables rather than a single scalar random variable.
Bayesian linear regression In statistics, Bayesian linear regression is an approach to linear regression in which the statistical analysis is undertaken within the context of Bayesian inference. When the regression model has errors that have a normal distribution, and if a particular form of prior distribution is assumed, explicit results are available for the posterior probability distributions of the model's parameters.
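A minimal sketch of those explicit posterior results, assuming a Gaussian prior on the weights and a known noise variance; the names mu0, Sigma0, and sigma2 are illustrative, not from the excerpt:

```python
# Closed-form posterior for Bayesian linear regression with a conjugate
# Gaussian prior on the weights and known noise variance sigma2.
import numpy as np

def bayes_linreg_posterior(X, y, mu0, Sigma0, sigma2):
    """Prior: w ~ N(mu0, Sigma0). Likelihood: y | X, w ~ N(X w, sigma2 * I).
    Returns the posterior mean and covariance of the weights."""
    Sigma0_inv = np.linalg.inv(Sigma0)
    Sigma_post = np.linalg.inv(Sigma0_inv + (X.T @ X) / sigma2)
    mu_post = Sigma_post @ (Sigma0_inv @ mu0 + (X.T @ y) / sigma2)
    return mu_post, Sigma_post

# Usage on synthetic data with true weights [1, 2]:
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=100)
mu, Sigma = bayes_linreg_posterior(X, y, mu0=np.zeros(2),
                                   Sigma0=10.0 * np.eye(2), sigma2=0.25)
print(mu)   # posterior mean, close to [1, 2]
```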
Linear regression The very simplest case of a single scalar predictor variable "x" and a single scalar response variable "y" is known as "simple linear regression". The extension to multiple and/or vector-valued predictor variables (denoted with a capital "X") is known as "multiple linear regression", also known as "multivariable linear regression". Nearly all real-world regression models involve multiple predictors, and basic descriptions of linear regression are often phrased in terms of the multiple regression model. Note, however, that in these cases the response variable "y" is still a scalar. Another term "multivariate linear regression" refers to cases where "y" is a vector, i.e., the same as "general linear regression".
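A minimal sketch of the scalar-versus-vector distinction drawn here: passing a matrix of responses Y to the same NumPy solver fits one coefficient vector per column, i.e. a multivariate linear regression (data are synthetic):

```python
# Multivariate linear regression: Y has one column per dependent variable,
# and least squares fits each column's coefficient vector simultaneously.
import numpy as np

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(60), rng.normal(size=(60, 2))])
B_true = np.array([[1.0, -1.0],     # each column is one response's coefficients
                   [2.0,  0.5],
                   [0.0,  3.0]])
Y = X @ B_true + rng.normal(scale=0.1, size=(60, 2))
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # shape (3, 2)
print(B_hat.round(2))
```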
Linear regression Some of the more common estimation techniques for linear regression are summarized below.
Linear regression In statistics and numerical analysis, the problem of numerical methods for linear least squares is an important one because linear regression models are one of the most important types of model, both as formal statistical models and for exploration of data sets. The majority of statistical computer packages contain facilities for regression analysis that make use of linear least squares computations. Hence it is appropriate that considerable effort has been devoted to the task of ensuring that these computations are undertaken efficiently and with due regard to numerical precision.
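A minimal sketch of the numerical-precision point, assuming NumPy: forming the normal equations squares the condition number of the design matrix, while a QR factorization works on the matrix directly. The near-collinear data below are contrived to make the contrast visible:

```python
# Normal equations vs. QR factorization on an ill-conditioned design matrix.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
# Third column is almost a copy of the second: near-perfect collinearity.
X = np.column_stack([np.ones(200), x, x + 1e-6 * rng.normal(size=200)])
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(scale=0.01, size=200)

# Normal equations: squares the condition number, amplifying rounding error.
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# QR factorization: operates on X itself, so conditioning is not squared.
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ y)

print(np.linalg.cond(X))        # large condition number
print(beta_normal, beta_qr)     # the QR estimate is typically more reliable here
```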
Nonlinear regression The nonlinear regression statistics are computed and used as in linear regression statistics, but using the Jacobian J in place of the design matrix X in the formulas. The linear approximation introduces bias into the statistics. Therefore, more caution than usual is required in interpreting statistics derived from a nonlinear model.
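A minimal Gauss-Newton sketch for an invented exponential model, showing the Jacobian J taking the role that X plays in the linear least-squares formulas:

```python
# Gauss-Newton iteration for the nonlinear model y = a * exp(b * x).
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 2, 40)
y = 1.5 * np.exp(0.8 * x) + rng.normal(scale=0.05, size=40)

a, b = 1.0, 0.5                          # starting guess for the parameters
for _ in range(20):
    f = a * np.exp(b * x)                # model predictions at current params
    J = np.column_stack([np.exp(b * x),            # df/da
                         a * x * np.exp(b * x)])   # df/db
    r = y - f                            # residuals
    step, *_ = np.linalg.lstsq(J, r, rcond=None)   # solve J @ step ~ r
    a, b = a + step[0], b + step[1]

print(a, b)   # close to the true values 1.5 and 0.8
```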
Linear regression The capital asset pricing model uses linear regression as well as the concept of beta for analyzing and quantifying the systematic risk of an investment. This comes directly from the beta coefficient of the linear regression model that relates the return on the investment to the return on all risky assets.
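A minimal sketch of that beta estimate, assuming NumPy; the returns and the true beta of 1.3 are synthetic:

```python
# CAPM beta: regress the asset's excess returns on the market's excess
# returns; the fitted slope is the beta coefficient.
import numpy as np

rng = np.random.default_rng(5)
market = rng.normal(0.0005, 0.01, size=250)            # daily market excess returns
asset = 1.3 * market + rng.normal(0, 0.005, size=250)  # asset with true beta 1.3

X = np.column_stack([np.ones_like(market), market])
alpha, beta = np.linalg.lstsq(X, asset, rcond=None)[0]
print(beta)   # close to 1.3: systematic risk relative to the market
```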
Simple linear regression In statistics, simple linear regression is a linear regression model with a single explanatory variable. That is, it concerns two-dimensional sample points with one independent variable and one dependent variable (conventionally, the "x" and "y" coordinates in a Cartesian coordinate system) and finds a linear function (a non-vertical straight line) that, as accurately as possible, predicts the dependent variable values as a function of the independent variable.
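For reference, the least-squares estimates for this two-dimensional case have a standard closed form; the beta notation below follows the usual convention rather than the excerpt itself:

```latex
% Closed-form least-squares estimates for the simple linear regression
%   y_i = \beta_0 + \beta_1 x_i + \varepsilon_i
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}
                     {\sum_{i=1}^{n} (x_i - \bar{x})^2},
\qquad
\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}
```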
Linear regression In linear regression, the relationships are modeled using linear predictor functions whose unknown model parameters are estimated from the data. Such models are called "linear models". Most commonly, the conditional mean of "y" given the value of "X" is assumed to be an affine function of "X"; less commonly, the median or some other quantile of the conditional distribution of "y" given "X" is expressed as a linear function of "X". Like all forms of regression analysis, linear regression focuses on the conditional probability distribution of "y" given "X", rather than on the joint probability distribution of "y" and "X", which is the domain of multivariate analysis.
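In symbols, the linear-predictor assumption described here is usually written as follows (standard notation, not spelled out in the excerpt):

```latex
% The conditional mean of y given X is affine in the predictors, with the
% coefficients beta estimated from the data.
y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \varepsilon_i,
\qquad
\operatorname{E}[\, y \mid X \,] = \beta_0 + \beta_1 x_1 + \cdots + \beta_p x_p
```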
Linear regression The following are the major assumptions made by standard linear regression models with standard estimation techniques (e.g. ordinary least squares): weak exogeneity (the predictor variables can be treated as fixed values rather than random variables), linearity (the mean of the response is a linear combination of the parameters), constant variance of the errors (homoscedasticity), independence of the errors, and lack of perfect multicollinearity among the predictors.
Linear regression Linear regression has many practical uses. Most applications fall into one of the following two broad categories: prediction, where a fitted model is used to forecast the response for new values of the explanatory variables, and explanation, where the model is used to quantify the strength of the relationship between the response and the explanatory variables.
Bayesian linear regression A similar analysis can be performed for the general case of the multivariate regression and part of this provides for Bayesian estimation of covariance matrices: see Bayesian multivariate linear regression.
Poisson regression In statistics, Poisson regression is a generalized linear model form of regression analysis used to model count data and contingency tables. Poisson regression assumes the response variable "Y" has a Poisson distribution, and assumes the logarithm of its expected value can be modeled by a linear combination of unknown parameters. A Poisson regression model is sometimes known as a log-linear model, especially when used to model contingency tables.
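A minimal sketch using the GLM interface of statsmodels, assuming that package is available; the data and coefficients are synthetic:

```python
# Poisson regression: the log of the expected count is modeled as a linear
# combination of the parameters (the log link of the Poisson GLM).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
x = rng.uniform(0, 2, size=200)
X = sm.add_constant(x)                 # design matrix with an intercept column
lam = np.exp(0.3 + 1.2 * x)            # log E[Y] = 0.3 + 1.2 x
y = rng.poisson(lam)

result = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(result.params)                   # close to [0.3, 1.2]
```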
Linear regression Linear regression models are often fitted using the least squares approach, but they may also be fitted in other ways, such as by minimizing the "lack of fit" in some other norm (as with least absolute deviations regression), or by minimizing a penalized version of the least squares loss function as in ridge regression (L2-norm penalty) and lasso (L1-norm penalty). Conversely, the least squares approach can be used to fit models that are not linear models. Thus, although the terms "least squares" and "linear model" are closely linked, they are not synonymous.
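A minimal sketch of the penalized fits named above, assuming scikit-learn; the penalty strengths (alpha) are arbitrary:

```python
# Ordinary least squares vs. the L2-penalized (ridge) and L1-penalized
# (lasso) variants of the least squares loss.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=100)

for name, model in [("ols", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),
                    ("lasso", Lasso(alpha=0.1))]:
    model.fit(X, y)
    print(name, np.round(model.coef_, 2))  # lasso drives small coefficients to 0
```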
Least-angle regression In statistics, least-angle regression (LARS) is an algorithm for fitting linear regression models to high-dimensional data, developed by Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani.
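A minimal sketch, assuming scikit-learn's Lars implementation, on synthetic high-dimensional data where only a few predictors matter:

```python
# LARS on a p >> n problem: it selects predictors one at a time.
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(8)
X = rng.normal(size=(50, 200))          # 50 samples, 200 predictors
beta = np.zeros(200)
beta[:3] = [4.0, -3.0, 2.0]             # only three truly active predictors
y = X @ beta + rng.normal(scale=0.1, size=50)

model = Lars(n_nonzero_coefs=3).fit(X, y)
print(np.flatnonzero(model.coef_))      # indices of the selected predictors
```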
Contrast (statistics) In statistics, particularly in analysis of variance and linear regression, a contrast is a linear combination of variables (parameters or statistics) whose coefficients add up to zero, allowing comparison of different treatments.
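A tiny worked example of a contrast, assuming NumPy; the group means are invented:

```python
# A contrast is a linear combination whose coefficients sum to zero, here
# comparing treatment A against the average of treatments B and C.
import numpy as np

means = np.array([5.2, 4.1, 4.5])   # sample means for groups A, B, C
c = np.array([1.0, -0.5, -0.5])     # contrast coefficients, summing to zero
print(c.sum())                       # 0.0, as a contrast requires
print(c @ means)                     # estimated contrast: A vs. mean(B, C)
```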
Linear regression Numerous extensions of linear regression have been developed, which allow some or all of the assumptions underlying the basic model to be relaxed.
Logistic regression The basic setup of logistic regression is the same as for standard linear regression.
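A minimal sketch of that shared setup, assuming scikit-learn: the same linear predictor as in linear regression, but passed through the logistic (sigmoid) function to model a probability:

```python
# Logistic regression reuses the linear predictor X @ beta; the sigmoid maps
# it into (0, 1), where it serves as the probability of the positive class.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
X = rng.normal(size=(200, 2))
logits = 0.5 + X @ np.array([2.0, -1.0])   # linear predictor, as in OLS
p = 1.0 / (1.0 + np.exp(-logits))          # sigmoid maps to (0, 1)
y = rng.binomial(1, p)                     # binary outcomes

model = LogisticRegression().fit(X, y)
print(model.intercept_, model.coef_)       # close to 0.5 and [2, -1]
```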