Forecasting

Topics:
- Forecasting Terminology
- Simple Moving Average
- Weighted Moving Average
- Exponential Smoothing
- Simple Linear Regression Model
- Holt's Trend Model
- Seasonal Model (No Trend)
- Winters' Model for Data with Trend and Seasonal Components

Evaluating Forecasts:
- Visual Review
- Error Measures: Bias, MAD, MPE, and MAPE
- Tracking Signal

Forecasting Terminology

[Chart: a historical data series, with the time axis divided into the Initialization region and the ExPost Forecast region.]

"We are now looking at a future from here, and the future we were looking at in February now includes some of our past, and we can incorporate the past into our forecast. 1993, the first half, which is now the past and was the future when we issued our first forecast, is now over."
Laura D'Andrea Tyson, Head of the President's Council of Economic Advisors, quoted in November of 1993 in the Chicago Tribune, explaining why the Administration reduced its projections of economic growth to 2 percent from the 3.1 percent it predicted in February.

Forecasting Problem

Suppose your fraternity/sorority house consumed the following number of cases of beer over the last 6 weekends: 8, 5, 7, 3, 6, 9. How many cases do you think your fraternity/sorority will consume this weekend?
[Chart: weekly beer consumption, cases vs. week.]

Forecasting: Simple Moving Average Method

Using a three-period moving average, we would get the following forecast: F(7) = (3 + 6 + 9)/3 = 6 cases.

What if we used a two-period moving average? Then F(7) = (6 + 9)/2 = 7.5 cases.

The number of periods used in the moving average forecast affects the responsiveness of the forecasting method: fewer periods make the forecast react more quickly to recent changes, while more periods make it smoother and more stable.

[Chart: 1-, 2-, and 3-period moving average forecasts plotted against the data.]

Forecasting Terminology

Applying this terminology to our problem using the Moving Average forecast: Initialization, ExPost Forecast, Model Evaluation.
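The moving-average arithmetic can be sketched in a few lines of Python. This is an illustration using the slide's beer data, not code from the slides:

```python
# Simple moving average: the forecast for the next period is the
# average of the last n actual values.
def moving_average_forecast(data, n):
    """Forecast the next period as the mean of the last n observations."""
    if len(data) < n:
        raise ValueError("need at least n observations")
    return sum(data[-n:]) / n

cases = [8, 5, 7, 3, 6, 9]            # weekend beer consumption

# Three-period moving average for week 7: (3 + 6 + 9) / 3
print(moving_average_forecast(cases, 3))   # 6.0
# Two-period moving average: (6 + 9) / 2
print(moving_average_forecast(cases, 2))   # 7.5
```

Re-running with different values of n makes the responsiveness trade-off easy to see on your own data.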
Forecasting: Weighted Moving Average Method

Rather than equal weights, it might make sense to use weights which favor more recent consumption values. With the Weighted Moving Average, we have to select weights that are individually greater than zero and less than 1, and as a group sum to 1.

Valid weights: (.5, .3, .2), (.6, .3, .1), (1/2, 1/3, 1/6)
Invalid weights: (.5, .2, .1) (sums to less than 1), (.6, -.1, .5) (contains a negative weight), (.5, .4, .3, .2) (sums to more than 1)

A Weighted Moving Average forecast with weights of (1/6, 1/3, 1/2), applied oldest-to-newest over the last three values, is performed as follows: F(7) = (1/6)(3) + (1/3)(6) + (1/2)(9) = 7 cases.

How do you make the Weighted Moving Average forecast more responsive? Shift more of the weight toward the most recent observations.
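A minimal sketch of the weighted version, including the validity check on the weights described above (illustrative code, not from the slides):

```python
def weighted_moving_average(data, weights):
    """Weights are listed oldest-to-newest, must be positive, and sum to 1."""
    if abs(sum(weights) - 1.0) > 1e-9 or any(w <= 0 for w in weights):
        raise ValueError("weights must be positive and sum to 1")
    recent = data[-len(weights):]          # the last len(weights) actuals
    return sum(w * a for w, a in zip(weights, recent))

cases = [8, 5, 7, 3, 6, 9]
# Weights (1/6, 1/3, 1/2): half the weight on the most recent weekend.
f7 = weighted_moving_average(cases, [1/6, 1/3, 1/2])
print(round(f7, 2))   # 7.0
```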
Forecasting: Exponential Smoothing

Exponential Smoothing is designed to give the benefits of the Weighted Moving Average forecast without the cumbersome problem of specifying weights. In Exponential Smoothing, there is only one parameter:

α = smoothing constant (between 0 and 1)

F(t+1) = αA(t) + (1 - α)F(t)

Initialization: F(2) = 6.5, the average of the first two data values.

Using α = 0.4:

t     A(t)   F(t)
1     8      -
2     5      6.50
3     7      5.90
4     3      6.34
5     6      5.00
6     9      5.40
7     -      6.84
8     -      6.84
9     -      6.84
10    -      6.84

F(2) is the Initialization; F(3) through F(7) form the ExPost Forecast. Beyond t = 7 the forecast stays at 6.84, since no new actuals arrive to update it.
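The table above can be reproduced with a short loop; the initialization of 6.5 (the average of the first two weekends) is taken from the table. Illustrative code, not from the slides:

```python
# Exponential smoothing: F(t+1) = alpha*A(t) + (1 - alpha)*F(t)
cases = [8, 5, 7, 3, 6, 9]                    # A(1)..A(6)
alpha = 0.4
f = 6.5                                       # initialization: F(2)
forecasts = {2: f}
for t, a in enumerate(cases[1:], start=2):    # update with A(2)..A(6)
    f = alpha * a + (1 - alpha) * f
    forecasts[t + 1] = f
print({t: round(v, 2) for t, v in forecasts.items()})
# F(3)=5.9, F(4)=6.34, F(5)=5.0, F(6)=5.4, F(7)=6.84
```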
Forecasting: Exponential Smoothing

[Chart: the weight each past period receives under exponential smoothing, for α = 0.1, 0.3, 0.5, 0.7, and 0.9. Larger α concentrates the weight on the most recent periods; the weights decline geometrically with age.]

Outliers

[Chart: a data series containing one outlying point.]

Data with Trends

[Chart: exponential smoothing forecasts with α = 0.3, 0.5, 0.7, and 0.9 applied to trending data; the forecast lags behind the trend for every α.]

Forecasting: Simple Linear Regression Model
Simple linear regression can be used to forecast data with trends:

D = a + bI

where D is the regressed forecast value or dependent variable in the model, I is the independent variable (the time period), a is the intercept value of the regression line, and b is the slope of the regression line.

[Chart: a regression line with intercept a and slope b fitted through the data points.]

In linear regression, the coefficients are chosen so that the sum of the squared errors, the vertical distances between the data points and the regression line, is minimized.
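The least-squares fit can be computed directly from the closed-form slope and intercept formulas. The data here is illustrative, not the slides' data set:

```python
# Least-squares fit of D = a + b*I: minimizes the sum of squared errors.
def linear_regression(ts, ys):
    n = len(ts)
    t_bar = sum(ts) / n
    y_bar = sum(ys) / n
    # slope: covariance of (t, y) over variance of t
    b = sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys)) / \
        sum((t - t_bar) ** 2 for t in ts)
    a = y_bar - b * t_bar                  # line passes through the means
    return a, b

ts = [1, 2, 3, 4, 5]                       # periods (illustrative data)
ys = [12, 19, 33, 38, 52]                  # observations with a trend
a, b = linear_regression(ts, ys)
print(round(a, 1), round(b, 1))            # intercept 1.1, slope 9.9
```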
Forecasting: Simple Linear Regression Model

[Chart: a regression line fitted to roughly 15 periods of trending data.]

Limitations of the Linear Regression Model

As with the simple moving average model, all data points count equally with simple linear regression: old observations influence the fitted line just as strongly as recent ones.

Forecasting: Holt's Trend Model
To forecast data with trends, we can use an exponential smoothing model with trend, frequently known as Holt's model:

L(t) = αA(t) + (1 - α)F(t)
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1)
F(t+1) = L(t) + T(t)

where L(t) is the level, T(t) is the trend, and α and β are smoothing constants. We could use linear regression to initialize the model.

Holt's Trend Model: Initialization

First, we'll initialize the model from the regression line (intercept 20.5, slope 9.9):

L(4) = 20.5 + 4(9.9) = 60.1
T(4) = 9.9
Holt's Trend Model: Updating

With α = 0.3 and β = 0.4, when A(5) = 52 is observed:

L(5) = αA(5) + (1 - α)F(5) = 0.3(52) + 0.7(70) = 64.6
T(5) = β[L(5) - L(4)] + (1 - β)T(4) = 0.4[64.6 - 60.1] + 0.6(9.9) = 7.74
F(6) = L(5) + T(5) = 64.6 + 7.74 = 72.34

When A(6) = 63 is observed:

L(6) = 0.3(63) + 0.7(72.34) = 69.54
T(6) = 0.4[69.54 - 64.60] + 0.6(7.74) = 6.62
F(7) = 69.54 + 6.62 = 76.16
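The two updates above can be sketched as a loop, starting from the regression initialization L(4) = 60.1, T(4) = 9.9 (illustrative code, not from the slides):

```python
# Holt's trend model updates with alpha = 0.3, beta = 0.4.
alpha, beta = 0.3, 0.4
level, trend = 60.1, 9.9                  # initialization at t = 4
for a in [52, 63]:                        # A(5), A(6)
    forecast = level + trend              # F(t), made at the end of t-1
    new_level = alpha * a + (1 - alpha) * forecast
    trend = beta * (new_level - level) + (1 - beta) * trend
    level = new_level
# level, trend, and next forecast after absorbing A(6)
print(round(level, 2), round(trend, 2), round(level + trend, 2))
# 69.54 6.62 76.16
```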
Holt's Model Results

[Chart: the data with the regression-based Initialization region, the ExPost Forecast region, and the Forecast region for Holt's model.]

Forecasting: Seasonal Model (No Trend)
Seasonal Model Formulas

L(t) = αA(t)/S(t-p) + (1 - α)L(t-1)
S(t) = γ[A(t)/L(t)] + (1 - γ)S(t-p)
F(t+1) = L(t) · S(t+1-p)

p is the number of periods in a season:
Quarterly data: p = 4
Monthly data: p = 12

Seasonal Model Initialization

A(t):
2003: Spring 16, Summer 27, Fall 39, Winter 22
2004: Spring 16, Summer 26, Fall 43, Winter 23

Quarter   Average   Seasonal Factor S(t)
Spring    16.0      0.60
Summer    26.5      1.00
Fall      41.0      1.55
Winter    22.5      0.85

Average sales per quarter = 26.5, so:

S(5) = 0.60, S(6) = 1.00, S(7) = 1.55, S(8) = 0.85
L(8) = 26.5

Seasonal Model
Forecasting

Continuing through 2005, using α = 0.4 and γ = 0.3 (the smoothing constants are not shown on the slide; these values reproduce its computations):

Year/Quarter   A(t)   L(t)    S(t)   F(t)
2004 Spring    16             0.60
2004 Summer    26             1.00
2004 Fall      43             1.55
2004 Winter    23     26.50   0.85
2005 Spring    14     25.18   0.59   16.00
2005 Summer    29     26.71   1.03   25.18
2005 Fall      41     26.62   1.55   41.32
2005 Winter    22     26.34   0.84   22.60
2006 Spring                          15.53
2006 Summer                          27.02
2006 Fall                            40.69
2006 Winter                          22.25
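The seasonal updates can be sketched as follows. The smoothing constants α = 0.4 and γ = 0.3 are inferred, not stated on the slides; with the unrounded seasonal factors they reproduce the 2006 forecasts in the table:

```python
# Seasonal (no-trend) smoothing, quarterly data (p = 4).
alpha, gamma = 0.4, 0.3                    # assumed smoothing constants
level = 26.5                               # L(8) from initialization
# S(5)..S(8), unrounded: quarter average / overall average
seasonal = [16/26.5, 26.5/26.5, 41/26.5, 22.5/26.5]
for i, a in enumerate([14, 29, 41, 22]):   # 2005 actuals, periods 9-12
    forecast = level * seasonal[i]         # F(t+1) = L(t) * S(t+1-p)
    new_level = alpha * a / seasonal[i] + (1 - alpha) * level
    seasonal[i] = gamma * a / new_level + (1 - gamma) * seasonal[i]
    level = new_level
# 2006 forecasts: F = L(12) * S for each quarter
print([round(level * s, 2) for s in seasonal])
# [15.53, 27.02, 40.69, 22.25]
```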
Seasonal Model Forecasting

[Chart: actual and forecast values for the seasonal model over 16 quarters.]

Forecasting: Winters' Model for Data with Trend and Seasonal Components

L(t) = αA(t)/S(t-p) + (1 - α)[L(t-1) + T(t-1)]
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1)
S(t) = γ[A(t)/L(t)] + (1 - γ)S(t-p)
F(t+1) = [L(t) + T(t)] · S(t+1-p)

Seasonal-Trend Model Decomposition

To initialize Winters' Model, we will use Decomposition Forecasting, which itself can be used to make forecasts.
Decomposition Forecasting

There are two ways to decompose forecast data with trend and seasonal components:
1. Use regression to get the trend, then use the trend line to get seasonal factors.
2. Use averaging to get seasonal factors, de-seasonalize the data, then use regression to get the trend.

The following data contains trend and seasonal components. The seasonal factors are obtained by the same method used for the Seasonal Model forecast:

Period   Quarter   Sales   Quarter Average   Seasonal Factor
1        Spring    90      109.0             0.80
2        Summer    157     184.0             1.35
3        Fall      123     143.0             1.05
4        Winter    93      107.5             0.79
5        Spring    128
6        Summer    211
7        Fall      163
8        Winter    122

Overall average = 135.9; the seasonal factors average to 1.00.
Decomposition Forecasting

With the seasonal factors, the data can be de-seasonalized by dividing the data by the seasonal factors. Regression on the de-seasonalized data will give the trend.

Regression on the de-seasonalized data produces the following results:

Slope (m) = 7.71
Intercept (b) = 101.2

Forecasts can be performed using the following equation:

F = [mx + b] · (seasonal factor)

[Chart: decomposition forecast plotted over 12 periods of data.]
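The whole decomposition pipeline (average-based seasonal factors, de-seasonalize, regress) fits in a short script; with unrounded factors it reproduces the slide's slope and intercept:

```python
# Decomposition forecasting: seasonal factors by averaging, then a
# least-squares trend on the de-seasonalized series.
sales = [90, 157, 123, 93, 128, 211, 163, 122]   # periods 1-8, quarterly
p = 4
overall = sum(sales) / len(sales)                 # 135.875
# each quarter's two-year average relative to the overall average
factors = [(sales[q] + sales[q + p]) / 2 / overall for q in range(p)]
deseason = [s / factors[i % p] for i, s in enumerate(sales)]

# least-squares trend on the de-seasonalized data
ts = list(range(1, 9))
t_bar = sum(ts) / len(ts)
y_bar = sum(deseason) / len(deseason)
m = sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, deseason)) / \
    sum((t - t_bar) ** 2 for t in ts)
b = y_bar - m * t_bar
print(round(m, 2), round(b, 1))        # 7.71 and 101.2, as on the slides

# forecast for period 9 (a Spring quarter): [m*9 + b] * Spring factor
f9 = (m * 9 + b) * factors[0]
print(round(f9, 1))
```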
Winters' Model Initialization

We can use the decomposition forecast to define the following Winters' Model parameters:

L(n) = b + m(n)
T(n) = m
S(j) = S(j-p)

So from our previous model, we have:

L(8) = 101.2 + 8(7.71) = 162.88
T(8) = 7.71
S(5) = 0.80, S(6) = 1.35, S(7) = 1.05, S(8) = 0.79

Winters' Model Example
Using α = 0.3, β = 0.4, γ = 0.2:

Period   Quarter   Sales   L(t)     T(t)    S(t)   F(t)
1        Spring    90
2        Summer    157
3        Fall      123
4        Winter    93
5        Spring    128                      0.80
6        Summer    211                      1.35
7        Fall      163                      1.05
8        Winter    122     162.88   7.71    0.79
9        Spring    152     176.41   10.04   0.81   136.47
10       Summer    303     197.85   14.60   1.39   251.71
11       Fall      232     215.00   15.62   1.06   223.07
12       Winter    171     226.37   13.92   0.78   182.19
13       Spring                                    195.19
14       Summer                                    352.41
15       Fall                                      283.09
16       Winter                                    220.87

[Chart: actual and forecast values for Winters' Model over 16 periods.]
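The Winters' updates for periods 9-12 and the multi-step forecasts for 13-16 can be sketched as follows, starting from the decomposition initialization (illustrative code; it matches the slide's table to within rounding):

```python
# Winters' model with alpha = 0.3, beta = 0.4, gamma = 0.2, initialized
# from the decomposition forecast: L(8) = 162.88, T(8) = 7.71.
alpha, beta, gamma = 0.3, 0.4, 0.2
level, trend = 162.88, 7.71
seasonal = [0.80, 1.35, 1.05, 0.79]            # S(5)..S(8)
for i, a in enumerate([152, 303, 232, 171]):   # sales, periods 9-12
    s = seasonal[i]
    forecast = (level + trend) * s             # F(t) = [L + T] * S(t-p)
    new_level = alpha * a / s + (1 - alpha) * (level + trend)
    trend = beta * (new_level - level) + (1 - beta) * trend
    seasonal[i] = gamma * a / new_level + (1 - gamma) * s
    level = new_level
# multi-step forecasts for periods 13-16: F(12+k) = [L(12) + k*T(12)] * S
print([round((level + (k + 1) * trend) * seasonal[k], 2) for k in range(4)])
# close to the slide's 195.19, 352.41, 283.09, 220.87
```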
Evaluating Forecasts

"Trust, but verify." (Ronald W. Reagan)

Computer software gives us the ability to mess up more data on a greater scale more efficiently. While software like SAP can automatically select models and model parameters for a set of data, and usually does so correctly, when the data is important a human should review the model results. One of the best tools is the human eye.

Visual Review

[Chart: a forecast plotted against 15 periods of actual data.]

How would you evaluate this forecast?
Forecast Evaluation

[Chart: the data series divided into the Initialization region and the ExPost Forecast region; the forecast is evaluated only over the ExPost Forecast region.]

Do not include initialization data in the evaluation.

Errors

All error measures compare the forecast model to the actual data for the ExPost Forecast region; do not include data from initialization.

Bias and MAD
Bias tells us whether we have a tendency to over- or under-forecast. If our forecasts are in the middle of the data, the errors should be equally positive and negative, and should sum to 0. MAD (Mean Absolute Deviation) is the average error magnitude, ignoring whether the error is positive or negative. With the error measured as e(t) = A(t) - F(t):

Bias = (1/n) Σ e(t)
MAD = (1/n) Σ |e(t)|

Errors are bad, and the closer to zero an error is, the better the forecast is likely to be. Error measures tell how well the method worked in the ExPost Forecast region; how well the forecast will work in the future is uncertain.

Absolute vs. Relative Measures

Forecasts were made for two sets of data. Which forecast was better?

Data Set 1: Bias = 18.72, MAD = 43.99
Data Set 2: Bias = 182, MAD = 912.5

MPE and MAPE

When
the numbers in a data set are larger in magnitude, the error measures are likely to be large as well, even though the fit might not be as good. Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE) are relative forms of the Bias and MAD, respectively. MPE and MAPE can be used to compare forecasts for different sets of data.

MPE = (100%/n) Σ [A(t) - F(t)] / A(t)
MAPE = (100%/n) Σ |A(t) - F(t)| / A(t)

[Worked MPE and MAPE calculations for Data Set 1 and Data Set 2.]
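All four error measures share one loop over the ExPost Forecast region. The data below is illustrative, not the slides' data sets, and the error convention is e(t) = A(t) - F(t):

```python
# Bias, MAD, MPE, and MAPE over the ExPost Forecast region.
def error_measures(actual, forecast):
    errors = [a - f for a, f in zip(actual, forecast)]   # e(t) = A - F
    n = len(errors)
    bias = sum(errors) / n                               # signed average
    mad = sum(abs(e) for e in errors) / n                # average magnitude
    mpe = 100 * sum(e / a for e, a in zip(errors, actual)) / n
    mape = 100 * sum(abs(e) / a for e, a in zip(errors, actual)) / n
    return bias, mad, mpe, mape

actual = [100, 110, 120, 130]          # illustrative data
forecast = [90, 115, 125, 120]
bias, mad, mpe, mape = error_measures(actual, forecast)
print(round(bias, 2), round(mad, 2))   # 2.5 7.5
print(round(mpe, 2), round(mape, 2))
```

Because MPE and MAPE are scaled by the actuals, they stay comparable across data sets of different magnitude, which is exactly the point made above.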
Tracking Signal

[Chart: 15 periods of actual data with a forecast that drifts away from the data.]

What's happened in this situation? How could we detect this in an automatic forecasting environment?

The tracking signal can be calculated after each actual sales value is recorded. The tracking signal is calculated as:

TS = RSFE / MAD = Σ [A(t) - F(t)] / MAD

where RSFE is the running sum of forecast errors. The tracking signal is a relative measure, like MPE and MAPE, so it can be compared to a set value (typically 4 or 5) to identify when forecasting parameters and/or models need to be changed.

[Chart: the drifting forecast from before; its tracking signal reaches TS = -5.78, beyond the typical control limit.]
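The period-by-period recalculation can be sketched as below. The series is illustrative (a forecast that runs consistently high, like the slide's drifting example), not the data behind TS = -5.78:

```python
# Tracking signal after each period: RSFE / MAD over the errors so far.
def tracking_signals(actual, forecast):
    errors = [a - f for a, f in zip(actual, forecast)]   # e(t) = A - F
    signals = []
    for t in range(1, len(errors) + 1):
        rsfe = sum(errors[:t])                           # running sum
        mad = sum(abs(e) for e in errors[:t]) / t        # MAD so far
        signals.append(rsfe / mad)
    return signals

actual = [50, 48, 46, 44, 42]          # demand is falling...
forecast = [52, 51, 50, 49, 48]        # ...but the forecast stays high
print([round(ts, 2) for ts in tracking_signals(actual, forecast)])
# [-1.0, -2.0, -3.0, -4.0, -5.0]
```

By the fifth period the signal has passed the typical 4-to-5 control limit, flagging that the model or its parameters need to be changed.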