Autocorrelation | Time Series Processing
Autocorrelation

The next characteristic we will analyze is autocorrelation.

Autocorrelation measures how strongly future values in a time series depend linearly on past values. Let's look at an example.
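For reference, the sample autocorrelation at lag k is commonly defined as the correlation between the series and a copy of itself shifted by k steps (a standard definition, added here for clarity rather than taken from the lesson):

r_k = \frac{\sum_{t=1}^{n-k} (x_t - \bar{x})(x_{t+k} - \bar{x})}{\sum_{t=1}^{n} (x_t - \bar{x})^2}

where \bar{x} is the mean of the series and n is its length. Values of r_k close to 1 or -1 indicate a strong linear relationship between observations k steps apart, while values near 0 indicate little or none.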

The graph above shows the popularity of the names "Maria" and "Olivia" over 140 years. Olivia's autocorrelation decays much faster than Maria's: the popularity of the name Olivia was very low until 1980 and then rose sharply, whereas the popularity of the name Maria had no such jumps and changed fairly evenly over time.

Let's visualize the autocorrelation:
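A minimal sketch of how such a plot can be produced with plot_acf from statsmodels is shown below. The file name names.csv and the "Maria" column are assumed here for illustration and are not part of the original lesson:

import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

# Hypothetical DataFrame with the yearly popularity of the name "Maria"
names = pd.read_csv("names.csv", index_col="year")

# Draw the autocorrelation plot: the vertical lines are the ACF values at each lag,
# and the shaded band marks the region of statistically insignificant correlation
plot_acf(names["Maria"], lags=22)
plt.show()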

Let's see how to interpret this chart. The plot shows the autocorrelation for the first 22 lags (drawn as vertical lines). If a line falls within the shaded blue area, the autocorrelation at that lag is not statistically significant.

As you can see on the graph, the autocorrelation is significant for the first 13 lags, while at larger lags it is not.

In summary, autocorrelation is useful for identifying statistically significant relationships between values in a time series.

Task

Visualize the autocorrelation of the dataset air_quality_no2_long.csv for 30 records (one possible approach is sketched after the steps below).

  1. Import the plot_acf function from statsmodels.graphics.tsaplots.
  2. Visualize the autocorrelation for 30 "value" records of the data DataFrame.
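A minimal sketch of this task, assuming the CSV file is available at the path shown and interpreting "30 records" as plotting 30 lags (the exact loading step may differ in the course environment):

import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

# Load the air quality data (file path assumed; adjust to the course environment)
data = pd.read_csv("air_quality_no2_long.csv")

# Visualize the autocorrelation of the "value" column for 30 lags
plot_acf(data["value"], lags=30)
plt.show()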

