Concept of Multicollinearity and Autocorrelation

SAS Analytics Linear Regression
7 minutes

Transcript

In this video we will be discussing the concept of multicollinearity and the concept of autocorrelation. Now, what is multicollinearity? Multicollinearity generally occurs when there are high correlations between two or more predictor variables. In other words, one predictor variable can be used to predict another predictor variable. An easy way to detect multicollinearity is to calculate correlation coefficients for all pairs of predictor variables. The predictors in a regression model are often called the independent variables, but this term does not imply that the predictors are themselves statistically independent of one another. In fact, in natural systems, the predictors can be highly intercorrelated; multicollinearity is the term reserved for the case when the intercorrelation of the predictor variables is high. It has been noted that the variance of the estimated regression coefficients depends on the intercorrelation of the predictors.
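A minimal sketch of this pairwise check in SAS (the dataset name mydata and the predictor names x1, x2, x3 are illustrative assumptions, not from the video):

    /* Print the pairwise Pearson correlations among the predictors;
       large absolute correlations flag possible multicollinearity */
    proc corr data=mydata;
      var x1 x2 x3;
    run;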

Therefore, we can say that multicollinearity occurs when the predictor variables, or the independent variables, influence each other, that is, when they are highly correlated with each other. Now, multicollinearity has the following negative effects. The first effect: the variance of the regression coefficients can be inflated so much that the individual coefficients are not statistically significant, even though the overall regression equation is strong and the predictive ability is good. The second effect: the relative magnitudes and even the signs of the coefficients may defy interpretation. The third effect: the values of the individual regression coefficients may change radically with the removal or addition of a predictor variable in the equation; in fact, the sign of a coefficient might even switch. Now, let us discuss the signs of multicollinearity. The first sign is high correlations between pairs of predictor variables. Regression coefficients whose signs or magnitudes do not make good physical sense are another sign of multicollinearity. Statistically non-significant regression coefficients on important predictors of our classical linear regression model are another sign of multicollinearity, as is extreme sensitivity of the sign or magnitude of the regression coefficients to the insertion or deletion of predictor variables.

Now, what is VIF? When there is multicollinearity in your classical linear regression model, that is, when the predictor variables are influenced by each other, the concept of the variance inflation factor (VIF) comes in, because the variance gets inflated. So, the VIF measures how much the variance of the estimated regression coefficients is inflated compared to when the predictor variables are not linearly related. It is used to explain how much multicollinearity, that is, correlation between predictors, exists in a regression analysis. So, the variance inflation factor is a statistic that can be used to identify multicollinearity in a matrix of predictor variables. Variance inflation refers here to the effect of multicollinearity on the variance of the estimated regression coefficients. Multicollinearity depends not just on the bivariate correlations, but also on the multivariate predictability of any one predictor from the other predictors.

Accordingly, the VIF is based on the multiple coefficient of determination of a regression of each predictor in the multivariate linear regression model on all the other predictors. The VIF, or variance inflation factor, can be represented by the following formula: VIF_i = 1 / (1 - R_i^2), where R_i^2 is the multiple coefficient of determination in a regression of the i-th predictor on all the other predictors, and VIF_i is the variance inflation factor associated with the i-th predictor. If the i-th predictor is independent of the other predictors, the variance inflation factor is one, while if the i-th predictor can be almost perfectly predicted from the other predictors, the variance inflation factor approaches infinity and the variance of the estimated regression coefficients is unbounded. Multicollinearity is said to be a problem when the variance inflation factor of one or more predictors becomes very large.
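A minimal sketch of this diagnostic in SAS (again with the illustrative names mydata, y, x1, x2, x3): the VIF and TOL options on the MODEL statement of PROC REG report, for each predictor, the variance inflation factor and the tolerance, which is 1/VIF.

    /* VIF = 1/(1 - Ri^2), where Ri^2 comes from regressing
       predictor i on all the other predictors; TOL prints 1/VIF */
    proc reg data=mydata;
      model y = x1 x2 x3 / vif tol;
    run;
    quit;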

Some researchers use a VIF of five or ten as a critical threshold. The VIF is closely related to a statistic called tolerance, which is 1/VIF. Now, what are the remedies, or the solutions, for a high VIF? The first is to obtain more data so as to reduce the standard errors. Next, obtain better data in which the predictors are less correlated, for example by conducting a designed experiment. Another remedy is to recode the predictors in a way that reduces the correlations. Now, let's discuss the concept of autocorrelation. Autocorrelation is a statistical measure that indicates the degree of correlation of a random variable with itself over time; we could say it measures the relationship between a value in a time series and the values that occur before and after it. So, autocorrelation is a mathematical representation of the degree of similarity between a given time series and a lagged version of itself over successive time intervals.

It is the same as calculating the correlation between two different time series, except that the same time series is used twice, once in its original form and once lagged one or more time periods. Autocorrelation is calculated to detect patterns in the data. According to the assumptions of the classical linear regression model, the error terms should not be correlated with respect to time; that is, the error term at time period t, e_t, should not be correlated with e_(t-1), should not be correlated with e_(t-2), and so on. There should not be any autocorrelation among the error terms. Now, let's move to the concept of the Durbin-Watson test. The Durbin-Watson test is done to check for autocorrelation: my null hypothesis H0 is that there is no autocorrelation, and my alternative hypothesis H1 is that autocorrelation exists. The DW statistic, denoted by D, is equal to the summation of (e_t - e_(t-1))^2 divided by the summation of e_t^2, or approximately D = 2(1 - rho), where rho is my autocorrelation coefficient.
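A minimal sketch of this test in SAS (same illustrative names as above): the DW option on the MODEL statement of PROC REG prints the Durbin-Watson D statistic along with the first-order autocorrelation of the residuals.

    /* Durbin-Watson D for the regression residuals;
       D near 2 means rho near 0, i.e. no first-order autocorrelation */
    proc reg data=mydata;
      model y = x1 x2 x3 / dw;
    run;
    quit;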

Now, if my D value is equal to two, then there is no autocorrelation, that is, rho equals zero. If D equals zero, then the autocorrelation is one, that is, rho equals one, and if D equals four, then the autocorrelation is minus one, that is, rho equals minus one. As a rule of thumb, if the value of my DW statistic lies between 1.5 and 2.5, it denotes that there is no autocorrelation. I'm ending this video here. Thank you, goodbye. See you all in the next one.
