In multiple linear regression analysis, we include at least two independent variables in the model. To obtain the best linear unbiased estimator, we must test the model's assumptions. One of the assumptions that needs to be checked is the absence of multicollinearity.

In this article, I will discuss how to test for multicollinearity and interpret the results. This assumption test determines whether the independent variables included in the model are strongly correlated with one another.

It is essential to test for multicollinearity because strong correlation among the independent variables inflates the variance of the coefficient estimates, making them unstable and difficult to interpret. Multicollinearity can be detected by examining the correlation coefficients between the independent variables or by examining the Variance Inflation Factor (VIF).

The steps of the multicollinearity test are easy to carry out on your own. You can use statistical software, or you can even calculate the correlation coefficient or the VIF by hand to check whether multicollinearity is present.

**Non-Multicollinearity Assumption Hypothesis**

When conducting a multicollinearity test, we need to formulate the hypotheses first. As I explained in an earlier article, the hypotheses consist of the null hypothesis (H_{0}) and the alternative hypothesis (H_{1}).

Formulating the null and alternative hypotheses up front makes the multicollinearity test easier to interpret. We can state the hypotheses as follows:

H_{0}: There is no multicollinearity between independent variables

H_{1}: Multicollinearity occurs between independent variables

**Non-Multicollinearity Assumption Hypothesis Test Criteria**

After formulating the hypotheses, we need to understand the criteria for testing them. As stated above, the multicollinearity test can be conducted using either the correlation coefficient or the Variance Inflation Factor (VIF).

**1. Correlation Coefficient Criteria**

To obtain the correlation coefficient, you must perform a correlation analysis. Because linear regression assumes interval/ratio scale data, the Pearson correlation coefficient is used.

We need to understand that the multicollinearity test aims to determine whether there is a strong correlation between the independent variables. A correlation coefficient close to 1 (absolute value greater than 0.8) indicates multicollinearity between independent variables. Thus, if the correlation coefficient is > 0.8, the null hypothesis (H_{0}) is rejected.
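The correlation criterion above can be sketched in a few lines of code. This is a minimal illustration, not the article's SPSS procedure; the data values are hypothetical and chosen only to show the decision rule.

```python
import numpy as np

# Hypothetical values for two independent variables (illustrative only).
x1 = np.array([12.0, 15.0, 14.0, 18.0, 21.0, 19.0, 23.0, 25.0])
x2 = np.array([30.0, 33.0, 31.0, 38.0, 42.0, 40.0, 45.0, 50.0])

# Pearson correlation coefficient between the two predictors.
r = np.corrcoef(x1, x2)[0, 1]
print(f"r = {r:.3f}")

# Decision rule from the text: reject H0 (no multicollinearity) if |r| > 0.8.
if abs(r) > 0.8:
    print("Reject H0: multicollinearity is indicated")
else:
    print("Accept H0: no multicollinearity")
```

With these strongly correlated hypothetical values, the rule rejects H_{0}; with weakly correlated data it would accept H_{0}.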

**2. Variance Inflation Factor Criteria**

The Variance Inflation Factor (VIF) is widely used to detect multicollinearity because it is convenient. We can also use the tolerance value; the conclusion will be the same, because VIF = 1/Tolerance.

A larger VIF value indicates potential multicollinearity; a small VIF value indicates its absence.

Multicollinearity is indicated if the VIF value is > 10; in that case, the null hypothesis is rejected. Conversely, if the VIF value is < 10, we conclude that there is no multicollinearity.
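The VIF criterion can also be computed by hand, as the article notes. Below is a minimal sketch using hypothetical data: the VIF of one predictor is obtained by regressing it on the other predictor and computing 1/(1 − R²). The variable names and data are assumptions for illustration only.

```python
import numpy as np

# Hypothetical, weakly correlated predictors (illustrative only).
x1 = np.array([10.0, 12.0, 9.0, 15.0, 11.0, 14.0, 13.0, 16.0])
x2 = np.array([40.0, 35.0, 42.0, 38.0, 45.0, 36.0, 44.0, 39.0])

def vif(y, X):
    """VIF of predictor y, from regressing y on the other predictors X."""
    X = np.column_stack([np.ones(len(y)), X])      # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit
    resid = y - X @ beta
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                     # R-squared of that fit
    return 1.0 / (1.0 - r2)

vif_x1 = vif(x1, x2.reshape(-1, 1))
tolerance = 1.0 / vif_x1                           # VIF = 1 / Tolerance
print(f"VIF = {vif_x1:.3f}, Tolerance = {tolerance:.3f}")

# Decision rule from the text: multicollinearity if VIF > 10.
if vif_x1 > 10:
    print("Reject H0: multicollinearity detected")
else:
    print("Accept H0: no multicollinearity")
```

With more than two predictors, the same function is called once per predictor, each time regressing it on all the others.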

In the multicollinearity assumption test, we expect the null hypothesis (H_{0}) to be accepted. If H_{0} is accepted, there is no multicollinearity.

**How to Test Multicollinearity**

Here, I will use statistical software to test multicollinearity. I chose SPSS for this example.

Enter your data, then click Analyze > Regression > Linear to open the Linear Regression window. Move the dependent variable into the Dependent box and the independent variables into the Independent(s) box.

Next, click Statistics to open the Linear Regression: Statistics window. To display the VIF value, check the Collinearity diagnostics box, as shown in this image:

Click Continue, then OK. The SPSS output will appear immediately.

For this multicollinearity test, I used two independent variables. The SPSS results are shown below:

**Interpretation of Analysis Results**

Based on the results above, the VIF value is 2.906. This value is less than 10, so the null hypothesis (H_{0}) is accepted. Thus, there is no multicollinearity between the Income (X_{1}) and Total Population (X_{2}) variables.

Well, that’s how to test for multicollinearity in multiple linear regression. See you in the following article!
