What Is Cointegration? (With Techniques and Examples)

By Indeed Editorial Team

Published May 14, 2022

The Indeed Editorial Team comprises a diverse and talented team of writers, researchers and subject matter experts equipped with Indeed's data and insights to deliver useful tips to help guide your career journey.

Across most industries, researchers collect and analyze various forms of data to gain more industry knowledge. In certain cases, researchers compare two or more variables over a certain period to see how they interact with each other. Understanding what cointegration is can aid your career as a statistician or researcher. In this article, we discuss what it means to cointegrate variables, explore its history to offer more insight, identify various methods for cointegrating data, and provide examples of its application in various industries.

What does cointegration mean?

Cointegration is a statistical method that you can use to test whether two or more time-based variables share a stable, long-run relationship over a certain period. Researchers use this concept to determine whether several variables in a time series move together over time, even when each variable drifts on its own. A time series is a collection of data that has time as one of its variables. When professionals can establish such a long-run relationship between two or more variables, they can make predictions that have many industrial applications. A popular example of cointegrated variables in a time series is the relationship between supply, demand, and product prices.

The method of cointegrating variables in a time-based series emerged in 1987. Before this, the preferred method to evaluate the correlation between time series was linear regression. Many economists noted that linear regression allowed for the chance of arriving at spurious or misleading correlations. Spurious correlations occur when two variables appear to have a relationship due to a hidden third variable or coincidence. To remedy this, the economists Robert Engle and Clive Granger published the cointegrating approach in a 1987 paper. The paper showed how to combine several variables in a time series so that the combination can't deviate from equilibrium over a long period.

Related: Example of Positive Correlation (And How to Calculate It)

Techniques for testing if variables can cointegrate

Here are some of the methods for testing if multiple variables in a time series can cointegrate:

The two-step Engle-Granger method

This is one of the first methods of testing to see if several variables cointegrate. While using this method, statisticians first regress one time-based series on the other and collect the residuals. Then, they analyze these residuals to see if there are unit roots. Next, this method involves using statistical tests like the augmented Dickey-Fuller test to assess whether the residual series is stationary. Where the series contains variables that cointegrate, this method reveals that the residuals are stationary. You can run this method using software such as MATLAB or Stata.

As this method was one of the first ways to test if several variables cointegrate, it has some faults. The Engle-Granger technique doesn't work well when analyzing more than two variables, as a system of several variables may contain more than one cointegrating relationship, which the method can't identify. Similarly, this method is also likely to produce incorrect results if you apply it to two time-based series that depend on each other. As a result, this method has some limits. Still, applying other cointegrating methods can resolve these issues.
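The two steps above can be sketched with simulated data. This is a minimal illustration, not a full implementation: the series, the true slope of 2.0, and the crude AR(1) check on the residuals are all assumptions standing in for a proper augmented Dickey-Fuller test, which uses its own critical values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulate a cointegrated pair: x is a random walk, and y tracks
# 2x plus stationary noise, so y - 2x never drifts far from zero.
x = np.cumsum(rng.normal(size=n))
y = 2.0 * x + rng.normal(size=n)

# Step 1: regress y on x (with an intercept) and collect residuals.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Step 2: a crude stand-in for the augmented Dickey-Fuller test.
# Fit delta_e_t = rho * e_{t-1} + error: for a stationary series
# rho is clearly negative, while for a random walk it sits near zero.
def ar_drift(e):
    lag, diff = e[:-1], np.diff(e)
    return np.dot(lag, diff) / np.dot(lag, lag)

rho_resid = ar_drift(resid)  # strongly negative: residuals are stationary
rho_walk = ar_drift(x)       # near zero: x itself has a unit root
```

Because the residuals behave like a stationary series while each raw series does not, the sketch reaches the same conclusion the Engle-Granger test would: the two variables cointegrate.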

Phillips-Ouliaris test

This method is an upgrade from the Engle-Granger test, as it resolves some of its faults. It addresses the inability of the Engle-Granger method to produce correct results when two time-based series depend on each other. As this method can recognize that relationship, it factors it into the final result. Still, this method also doesn't work well if you apply it to more than two variables. Like the Engle-Granger test, the Phillips-Ouliaris test uses two hypotheses:

  • H0, the null hypothesis, means that the two variables don't cointegrate.

  • H1, the alternative hypothesis, means that the two variables cointegrate.

Related: What is Quantitative Analysis?

Gregory and Hansen test

This cointegrating method is particularly advantageous for working with time-based series that have structural breaks. The original Gregory and Hansen method only works with time-based series that have a single structural break. Still, subsequent improvements on the method, like the Maki and Hatemi-J tests, make it possible to work on time-based series with more than one break. While the Hatemi-J test can resolve time-based series with two structural breaks, the Maki test can resolve time-based series with unlimited structural breaks.

Johansen test

This method seeks to improve on the Engle-Granger and Phillips-Ouliaris methods. The Johansen method allows you to work with more than two variables, while also being able to handle several time-based series that depend on each other. Still, this method works best with large sample sizes, as small ones may generate faulty results. This method uses two subtests, which are the trace test and the maximum eigenvalue test:

Trace test

The trace test is the first subtest when using the Johansen method. It uses two hypotheses to determine how many cointegrating relationships several time-based series share. This subtest proceeds in stages, asking at each stage whether the number of cointegrating relationships, X, is greater than zero, one, and two. For example, you can use the hypotheses this way:

  • H0 means the variable X is less than or equal to n-1.

  • H1 means the variable X is more than n-1.

  • First stage: H0: X = 0 and H1: X > 0.

  • Second stage: H0: X ≤ 1 and H1: X > 1.

  • Third stage: H0: X ≤ 2 and H1: X > 2.
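The staged hypotheses above rest on a standard formula: the trace statistic for each stage is built from the eigenvalues of the estimated system. As a sketch, the eigenvalues below are made-up numbers for a hypothetical three-variable system with 200 observations, not output from real data.

```python
import numpy as np

T = 200                                  # hypothetical sample length
eigvals = np.array([0.25, 0.12, 0.01])   # made-up eigenvalues, largest first

def trace_stat(lams, T, r):
    # Trace statistic for H0: X <= r, computed from the eigenvalues
    # below position r: -T * sum(log(1 - lambda_i)) for i > r.
    return -T * np.sum(np.log(1.0 - lams[r:]))

stats = [trace_stat(eigvals, T, r) for r in range(len(eigvals))]
for r, s in enumerate(stats):
    print(f"H0: X <= {r}, trace statistic = {s:.2f}")
```

Each statistic is compared against a critical value from published tables; testing stops at the first stage where the statistic falls below its critical value, and that stage's count is the estimated number of cointegrating relationships.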

Maximum eigenvalue test

This is the second subtest of the Johansen method. Still, it's not accurate when there are structural breaks in the time-based series. It tests the hypothesis that the number of cointegrating relationships is X, as opposed to the alternative hypothesis that it's X + 1. The two hypotheses are as follows:

  • H0 means the number of cointegrating relationships is X.

  • H1 means the number of cointegrating relationships is X + 1.
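Unlike the trace test, this statistic uses only the single next-largest eigenvalue at each stage. As a sketch, the eigenvalues below are again made-up numbers for a hypothetical three-variable system with 200 observations.

```python
import numpy as np

T = 200                                  # hypothetical sample length
eigvals = np.array([0.25, 0.12, 0.01])   # made-up eigenvalues, largest first

def max_eig_stat(lams, T, r):
    # Maximum eigenvalue statistic for H0: X = r vs H1: X = r + 1,
    # using only the (r+1)-th largest eigenvalue: -T * log(1 - lambda).
    return -T * np.log(1.0 - lams[r])

stats = [max_eig_stat(eigvals, T, r) for r in range(len(eigvals))]
for r, s in enumerate(stats):
    print(f"H0: X = {r} vs H1: X = {r + 1}, statistic = {s:.2f}")
```

As with the trace test, each statistic is compared against a critical value, and testing stops at the first stage where the null hypothesis can't be rejected.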

Examples of cointegrating time-based variables

Various industries cointegrate time-based variables to solve problems or make forecasts. They include:


Economics

Economics is the study of how societies create, transfer, and consume wealth or value. It involves the study of both national and international markets, and how activities in these markets affect society and vice versa. Economists often cointegrate various time-based variables to understand market movements and make predictions. By analyzing how various factors affect the market, economists can make and recommend effective policies to governments and private institutions. For example, an economist may compare the purchase of certain goods with the average income of people in a particular community. These are both time-based variables, which can predict consumer behaviour.

Related: Types of Variables in Statistics and Research (With FAQs)


Finance

The finance industry is vast and involves the study of various financial assets, including stocks, bonds, and other securities. Finance professionals perform various analyses to predict the movement of the financial market and help their clients or employers make a profit. They cointegrate time-based variables to understand what factors affect the value of financial assets and make accurate forecasts. For example, an investment manager may compare changes in dividends and stock prices across a certain period. Similarly, a loan officer may compare changes in income with loan amounts in a certain community.


Medicine

Medicine involves the study, treatment, and prevention of diseases. It also includes the study of the human anatomy to promote wellbeing and prolong life. The medical industry often benefits from cointegrating various time-based variables, as it helps them understand how human behaviour affects health. Medical researchers also cointegrate time-based variables to preempt health crises and recommend policies to governments. For example, medical professionals can compare the rate of certain diseases to the rate of consumption of certain products to understand the relationship between them. Similarly, they may compare the age of patients to the mortality rate of a disease.


Statistics

Statistics involves collecting and analyzing numerical data to find relationships between several variables. This field has its applications in various industries, as it helps reveal how certain variables affect others. Cointegrating time-based variables is a statistical method, which statisticians can apply in various contexts. For example, a statistician working with a pharmacist may analyze the effectiveness of a certain medicine on different demographics over a specific period. Similarly, a statistician working with economists may compare the likelihood to purchase certain goods within specific income brackets in a particular community.


Marketing

Marketing involves the promotion of goods and services to increase sales within a certain market. It involves the study of market behaviour and leverages those insights to help businesses make more profit. Market researchers can cointegrate several time-based variables to develop a better understanding of the market. For example, a market researcher working for a car manufacturer may compare the likelihood of certain demographics to purchase a vehicle. Similarly, the marketing team of a soda company may analyze the likelihood of soda consumption across genders or ages.

Related: How to Become a Brand Strategist (With Eight Steps)


Epidemiology

Epidemiology is the study of the causes and effects of diseases, particularly relating to disease outbreaks and pandemics. Epidemiologists analyze several variables, including social events and human behaviour, to discover how to reduce the incidence of certain diseases. For example, an epidemiologist may compare the incidence of lung cancer with the frequency of smoking. Similarly, they may assess the incidence of heart diseases in certain categories of people. This helps them develop and recommend policies to the government. Additionally, it enables them to give practical medical advice to the general public to reduce their risk for certain diseases.
