Recursive Forecasting

Recursive forecasting, also known as the autoregressive strategy, is a common approach for multi-step time series prediction. In this method, you train a one-step-ahead forecasting model, such as a regression model that predicts the next value based on previous values. To generate forecasts several steps into the future, you use the model's own predictions as inputs for subsequent predictions. This means that after predicting the first future value, you append that predicted value to your input sequence and use it to predict the next step, and so on, until you reach the desired forecast horizon.
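To make the mechanics concrete, here is a minimal sketch of that loop as a standalone helper. The function name forecast_recursive and the argument history are illustrative choices, not part of the lesson's example; the sketch assumes a fitted scikit-learn-style one-step model whose features are ordered lag_1 (most recent) to lag_k (oldest), matching the full example that follows.

import numpy as np

# Sketch only: roll a fitted one-step model forward using its own predictions
def forecast_recursive(model, history, n_steps, lags=3):
    window = list(history[-lags:])                 # last observed values, oldest first
    forecasts = []
    for _ in range(n_steps):
        x = np.array(window[::-1]).reshape(1, -1)  # reorder to [lag_1, ..., lag_k], most recent first
        pred = model.predict(x)[0]
        forecasts.append(pred)
        window = window[1:] + [pred]               # drop the oldest value, append the new prediction
    return forecasts

The full example below applies the same idea step by step with a RandomForestRegressor trained on lagged features of a synthetic series.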

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Create a synthetic time series
np.random.seed(42)
n = 200
time = np.arange(n)
series = 0.5 * np.sin(0.2 * time) + np.random.normal(scale=0.3, size=n)

# Prepare lagged features
def make_lagged_features(series, lags=3):
    df = pd.DataFrame({'y': series})
    for lag in range(1, lags + 1):
        df[f'lag_{lag}'] = df['y'].shift(lag)
    df = df.dropna()
    return df

lags = 3
df = make_lagged_features(series, lags=lags)

# Train/test split
train_size = 150
train_df = df.iloc[:train_size]
test_df = df.iloc[train_size:]

X_train = train_df.drop('y', axis=1).values
y_train = train_df['y'].values

# Fit a one-step model
model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Recursive forecasting: predict the next 10 steps
last_known = df.iloc[train_size].drop('y').values  # lag values for the first step after training
n_steps = 10
predictions = []
current_input = last_known.copy()
for _ in range(n_steps):
    pred = model.predict(current_input.reshape(1, -1))[0]
    predictions.append(pred)
    # Update the lags: the new prediction becomes lag_1, older lags shift back, the oldest is dropped
    current_input = np.roll(current_input, 1)
    current_input[0] = pred

print("Recursive predictions for next 10 steps:")
print(np.round(predictions, 3))

Recursive forecasting is straightforward to implement and leverages existing one-step models, but it comes with notable trade-offs. On the plus side, this approach is simple and flexible, allowing you to use any regression model trained for one-step prediction. However, a key disadvantage is error accumulation: because each new prediction depends on previous predictions rather than true observed values, small errors can compound rapidly as you forecast further into the future. This can lead to deteriorating forecast accuracy, especially for long horizons. Nevertheless, recursive forecasting remains useful when you have strong one-step models or limited data, and when the forecast horizon is not too long.
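To observe this error accumulation in the example above, one option (a sketch that reuses the test_df, predictions, and n_steps variables already defined there) is to compare the recursive forecasts against the corresponding held-out values:

from sklearn.metrics import mean_absolute_error

# Compare the recursive forecasts with the first n_steps held-out observations
actual = test_df['y'].values[:n_steps]
print("MAE over the 10-step horizon:", round(mean_absolute_error(actual, predictions), 3))

# Per-step absolute error, which often grows with the horizon
for step, (a, p) in enumerate(zip(actual, predictions), start=1):
    print(f"step {step}: |error| = {abs(a - p):.3f}")

Inspecting how these per-step errors grow, or comparing them with a direct multi-step strategy, is a common way to judge how far ahead a recursive model can be trusted.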

1. What is a main drawback of recursive forecasting for multi-step prediction?

2. When might recursive forecasting be preferred over direct forecasting?


