14.2 Measures of Forecast Accuracy
LEARNING OBJECTIVES
- Calculate and interpret forecast errors, including mean absolute deviation, mean square error, and mean absolute percent error.
Using time series data, we want to build a model in order to make forecasts and predictions about the future. There are many different time series models, each giving a different forecast for the same data. But how accurate is the forecast obtained from a time series model, and how can we tell which model will give us the most accurate forecast?
Forecast accuracy is measured through the errors in the forecast. We will look at three different error analysis techniques—mean absolute deviation ([latex]MAD[/latex]), mean square error ([latex]MSE[/latex]), and mean absolute percent error ([latex]MAPE[/latex]). Each of these techniques examines the errors in the forecast in slightly different ways, but they all involve working with the forecast error, which is the difference between the actual value in the data and the value forecasted by the model.
[latex]\text{Forecast Error}=\text{Actual Value}-\text{Forecast Value}[/latex]
Mean Absolute Deviation
The mean absolute deviation, denoted [latex]MAD[/latex], is the average of the absolute value of the forecasting errors.
[latex]\displaystyle{MAD=\frac{\sum|\text{forecast error}|}{\text{number of forecast errors}}}[/latex]
To calculate the mean absolute deviation:
- Calculate the forecast errors.
- Calculate the absolute value of each forecast error.
- Sum up the absolute values of the errors found in Step 2.
- Divide the sum in Step 3 by the number of errors.
The mean absolute deviation tells us the average difference between the actual values and the forecast values. In general, the smaller the mean absolute deviation, the better the model is at forecasting. A small [latex]MAD[/latex] indicates that the average error is small, which means that, on average, the actual values and the forecast values are close in value.
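For those who prefer to automate the calculation, the four steps above translate directly into a few lines of code. The following is a minimal Python sketch; the function name and the data values are hypothetical and chosen only for illustration.

```python
def mean_absolute_deviation(actual, forecast):
    """MAD: the average of the absolute forecast errors."""
    # Forecast error = actual value - forecast value (Step 1)
    errors = [a - f for a, f in zip(actual, forecast)]
    # Absolute values (Step 2), summed (Step 3), divided by the number of errors (Step 4)
    return sum(abs(e) for e in errors) / len(errors)

# Hypothetical data: four periods with both an actual and a forecast value.
actual = [100, 110, 105, 120]
forecast = [98, 115, 104, 118]
print(mean_absolute_deviation(actual, forecast))  # (2 + 5 + 1 + 2) / 4 = 2.5
```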
NOTES
- There will be an error for each time period that has both an actual value and a forecast value. If a time period has only an actual value and no forecast value, then there is no error for that time period. Similarly, if a time period has only a forecast value and no actual value, then there is no error for that time period.
- The mean absolute deviation for a time series is similar to the standard deviation for a set of data. The standard deviation for a set of data tells us the average deviation of the data from the mean. The mean absolute deviation tells us the average deviation of the time series from the forecast.
- Why are we calculating the average of the absolute value of the errors and not the average of the errors themselves? Some of the forecast errors will be positive (when the actual value is greater than the forecast value), and some forecast errors will be negative (when the actual value is less than the forecast value). Consequently, the positive and negative errors have a tendency to cancel each other out, which means the average of the errors will be small regardless of how good or bad the model is at forecasting the time series. By taking the absolute value of the errors, the differences between the positive and negative errors are eliminated because the absolute values of the errors are all positive.
EXAMPLE
A local company produces and sells a certain product. The number of units sold each month for a year is recorded in the table below. The company created a forecast to predict the number of units sold each month.
Month | Actual Number of Units Sold | Forecasted Number of Units Sold |
---|---|---|
January | 83 | 82 |
February | 102 | 105 |
March | 75 | 82 |
April | 125 | 117 |
May | 100 | 95 |
June | 103 | 111 |
July | 118 | 115 |
August | 101 | 97 |
September | 70 | 78 |
October | 99 | 90 |
November | 120 | 125 |
December | 97 | 103 |
- Calculate the mean absolute deviation for the forecast.
- Interpret the mean absolute deviation.
Solution
-
Month | Actual Number of Units Sold | Forecasted Number of Units Sold | Forecast Error | Absolute Forecast Error |
---|---|---|---|---|
January | [latex]83[/latex] | [latex]82[/latex] | [latex]1[/latex] | [latex]1[/latex] |
February | [latex]102[/latex] | [latex]105[/latex] | [latex]-3[/latex] | [latex]3[/latex] |
March | [latex]75[/latex] | [latex]82[/latex] | [latex]-7[/latex] | [latex]7[/latex] |
April | [latex]125[/latex] | [latex]117[/latex] | [latex]8[/latex] | [latex]8[/latex] |
May | [latex]100[/latex] | [latex]95[/latex] | [latex]5[/latex] | [latex]5[/latex] |
June | [latex]103[/latex] | [latex]111[/latex] | [latex]-8[/latex] | [latex]8[/latex] |
July | [latex]118[/latex] | [latex]115[/latex] | [latex]3[/latex] | [latex]3[/latex] |
August | [latex]101[/latex] | [latex]97[/latex] | [latex]4[/latex] | [latex]4[/latex] |
September | [latex]70[/latex] | [latex]78[/latex] | [latex]-8[/latex] | [latex]8[/latex] |
October | [latex]99[/latex] | [latex]90[/latex] | [latex]9[/latex] | [latex]9[/latex] |
November | [latex]120[/latex] | [latex]125[/latex] | [latex]-5[/latex] | [latex]5[/latex] |
December | [latex]97[/latex] | [latex]103[/latex] | [latex]-6[/latex] | [latex]6[/latex] |
Sum | | | | [latex]67[/latex] |
[latex]\begin{eqnarray*}MAD&=&\frac{\sum|\text{forecast error}|}{\text{number of forecast errors}}\\&=&\frac{67}{12}\\&=&5.583\end{eqnarray*}[/latex]
- On average, the forecasted values differ by [latex]5.583[/latex] units sold from the actual values.
NOTES
- To calculate the [latex]MAD[/latex], add up the absolute value of the errors column and divide the sum by the number of errors. In this example, there are [latex]12[/latex] errors, so we divide the column sum by [latex]12[/latex].
- The mean absolute deviation is just the average (or mean) of the absolute values of the errors. After calculating the absolute value of the errors, calculate the mean of the absolute value of the errors. In the above example, just calculate the mean of the absolute value of the errors column to get the [latex]MAD[/latex].
- The units of the [latex]MAD[/latex] are the same units as the time series variable. In this case, the time series variable is measuring the number of units sold, so the units of the [latex]MAD[/latex] are also the number of units sold.
- We interpret the [latex]MAD[/latex] similar to how we interpret the standard deviation. The [latex]MAD[/latex] tells us the average difference between the forecasted values and the actual values. When interpreting the [latex]MAD[/latex], be specific to the context of the question and include units with the [latex]MAD[/latex].
Mean Square Error
The mean square error, denoted [latex]MSE[/latex], is the average of the squared forecasting errors.
[latex]\displaystyle{MSE=\frac{\sum(\text{forecast error})^2}{\text{number of forecast errors}}}[/latex]
To calculate the mean square error:
- Calculate the forecast errors.
- Calculate the square of each forecast error.
- Sum up the squared errors found in Step 2.
- Divide the sum in Step 3 by the number of errors.
In general, the smaller the mean square error, the better the model is at forecasting. A small [latex]MSE[/latex] indicates that the average squared error is small, which means that, on average, the actual values and the forecast values are close in value.
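As with the [latex]MAD[/latex], the calculation is easy to automate. The following minimal Python sketch mirrors the steps above; the function name and data values are again hypothetical.

```python
def mean_square_error(actual, forecast):
    """MSE: the average of the squared forecast errors."""
    errors = [a - f for a, f in zip(actual, forecast)]   # Step 1
    return sum(e ** 2 for e in errors) / len(errors)     # Steps 2-4

# Hypothetical data: the same four periods used in the MAD sketch.
actual = [100, 110, 105, 120]
forecast = [98, 115, 104, 118]
print(mean_square_error(actual, forecast))  # (4 + 25 + 1 + 4) / 4 = 8.5
```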
NOTES
- There will be an error for each time period that has both an actual value and a forecast value. If a time period has only an actual value and no forecast value, then there is no error for that time period. Similarly, if a time period has only a forecast value and no actual value, then there is no error for that time period.
- The mean square error for a time series is similar to the variance for a set of data.
- Why are we calculating the average of the squared errors? As noted in the discussion above about mean absolute deviation, some of the forecast errors will be positive and some forecast errors will be negative. Consequently, the positive and negative errors have a tendency to cancel each other out. By squaring the errors, the differences between the positive and negative errors are eliminated because all of the squared errors are positive.
EXAMPLE
A local company produces and sells a certain product. The number of units sold each month for a year is recorded in the table below. The company created a forecast to predict the number of units sold each month. Calculate the mean square error for the forecast.
Month | Actual Number of Units Sold | Forecasted Number of Units Sold |
---|---|---|
January | 83 | 82 |
February | 102 | 105 |
March | 75 | 82 |
April | 125 | 117 |
May | 100 | 95 |
June | 103 | 111 |
July | 118 | 115 |
August | 101 | 97 |
September | 70 | 78 |
October | 99 | 90 |
November | 120 | 125 |
December | 97 | 103 |
Solution
Month | Actual Number of Units Sold | Forecasted Number of Units Sold | Forecast Error | (Forecast Error)² |
---|---|---|---|---|
January | [latex]83[/latex] | [latex]82[/latex] | [latex]1[/latex] | [latex]1[/latex] |
February | [latex]102[/latex] | [latex]105[/latex] | [latex]-3[/latex] | [latex]9[/latex] |
March | [latex]75[/latex] | [latex]82[/latex] | [latex]-7[/latex] | [latex]49[/latex] |
April | [latex]125[/latex] | [latex]117[/latex] | [latex]8[/latex] | [latex]64[/latex] |
May | [latex]100[/latex] | [latex]95[/latex] | [latex]5[/latex] | [latex]25[/latex] |
June | [latex]103[/latex] | [latex]111[/latex] | [latex]-8[/latex] | [latex]64[/latex] |
July | [latex]118[/latex] | [latex]115[/latex] | [latex]3[/latex] | [latex]9[/latex] |
August | [latex]101[/latex] | [latex]97[/latex] | [latex]4[/latex] | [latex]16[/latex] |
September | [latex]70[/latex] | [latex]78[/latex] | [latex]-8[/latex] | [latex]64[/latex] |
October | [latex]99[/latex] | [latex]90[/latex] | [latex]9[/latex] | [latex]81[/latex] |
November | [latex]120[/latex] | [latex]125[/latex] | [latex]-5[/latex] | [latex]25[/latex] |
December | [latex]97[/latex] | [latex]103[/latex] | [latex]-6[/latex] | [latex]36[/latex] |
Sum | | | | [latex]443[/latex] |
[latex]\begin{eqnarray*}MSE&=&\frac{\sum(\text{forecast error})^2}{\text{number of forecast errors}}\\&=&\frac{443}{12}\\&=&36.917\end{eqnarray*}[/latex]
NOTES
- To calculate the [latex]MSE[/latex], add up the squared errors column and divide the sum by the number of errors. In this example, there are [latex]12[/latex] errors, so we divide the column sum by [latex]12[/latex].
- The mean square error is just the average (or mean) of the squared errors. After calculating the squared errors, calculate the mean of the squared errors. In the above example, just calculate the mean of the squared errors column to get the [latex]MSE[/latex].
- The units of the [latex]MSE[/latex] are the squared units of the time series variable. In this case, the time series variable is measuring the number of units sold, so the units of the [latex]MSE[/latex] are the number of units sold squared.
- Because the units of the [latex]MSE[/latex] are squared units, it can be difficult to intuitively interpret the meaning of the [latex]MSE[/latex].
Mean Absolute Percent Error
Both the mean absolute deviation and the mean square error depend on the scale of the data, which makes it difficult to compare forecasts for different time series or for data measured on different scales or time intervals. To make such comparisons, we need to compare the percent errors, which put the errors on the same relative scale. The percent error is the forecast error divided by the actual value corresponding to that error.
[latex]\displaystyle{\text{Percent Error}=\frac{\text{Forecast Error}}{\text{Actual Value}}}[/latex]
The mean absolute percent error, denoted [latex]MAPE[/latex], is the average of the absolute value of the percent errors.
[latex]\displaystyle{MAPE=\frac{\sum|\text{Percent Error}|}{\text{number of forecast errors}}\times100\%}[/latex]
To calculate the mean absolute percent error:
- Calculate the forecast errors.
- Divide each forecast error by its corresponding actual value.
- Calculate the absolute value of the percent errors found in Step 2.
- Sum up the absolute values of the percent errors found in Step 3.
- Divide the sum in Step 4 by the number of errors.
- Multiply by [latex]100\%[/latex] to convert the result in Step 5 to a percent.
The mean absolute percent error tells us the average percent difference between the actual values and the forecast values. In general, the smaller the mean absolute percent error, the better the model is at forecasting. A small [latex]MAPE[/latex] indicates that the average percent error is small, which means that, on average, the actual values and the forecast values are close in value.
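The [latex]MAPE[/latex] follows the same pattern in code, with each error divided by its corresponding actual value before taking the absolute value. Below is a minimal Python sketch, with a hypothetical function name and data.

```python
def mean_absolute_percent_error(actual, forecast):
    """MAPE: the average of the absolute percent errors, expressed as a percent."""
    # Percent error = forecast error / actual value (Steps 1-2)
    percent_errors = [(a - f) / a for a, f in zip(actual, forecast)]
    # Absolute values, averaged, then converted to a percent (Steps 3-6)
    return sum(abs(p) for p in percent_errors) / len(percent_errors) * 100

# Hypothetical data: the same four periods used in the earlier sketches.
actual = [100, 110, 105, 120]
forecast = [98, 115, 104, 118]
print(mean_absolute_percent_error(actual, forecast))  # approximately 2.29%
```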
NOTES
- There will be an error for each time period that has both an actual value and a forecast value. If a time period has only an actual value and no forecast value, then there is no error for that time period. Similarly, if a time period has only a forecast value and no actual value, then there is no error for that time period.
- The mean absolute percent error tells us the average percent deviation of the time series values from the forecast values.
- Why are we calculating the average of the absolute value of the percent errors and not the average of the percent errors themselves? As noted in the discussion above about mean absolute deviation, some of the forecast errors will be positive and some forecast errors will be negative. Consequently, the positive and negative errors have a tendency to cancel each other out. By taking the absolute value of the percent errors, the differences between the positive and negative errors are eliminated because the absolute value ensures all the percent errors are positive.
EXAMPLE
A local company produces and sells a certain product. The number of units sold each month for a year is recorded in the table below. The company created a forecast to predict the number of units sold each month.
Month | Actual Number of Units Sold | Forecasted Number of Units Sold |
---|---|---|
January | 83 | 82 |
February | 102 | 105 |
March | 75 | 82 |
April | 125 | 117 |
May | 100 | 95 |
June | 103 | 111 |
July | 118 | 115 |
August | 101 | 97 |
September | 70 | 78 |
October | 99 | 90 |
November | 120 | 125 |
December | 97 | 103 |
- Calculate the mean absolute percent error for the forecast.
- Interpret the mean absolute percent error.
Solution
-
Month | Actual Number of Units Sold | Forecasted Number of Units Sold | Forecast Error | Absolute Percent Error |
---|---|---|---|---|
January | [latex]83[/latex] | [latex]82[/latex] | [latex]1[/latex] | [latex]0.0120\ldots[/latex] |
February | [latex]102[/latex] | [latex]105[/latex] | [latex]-3[/latex] | [latex]0.0294\ldots[/latex] |
March | [latex]75[/latex] | [latex]82[/latex] | [latex]-7[/latex] | [latex]0.0933\ldots[/latex] |
April | [latex]125[/latex] | [latex]117[/latex] | [latex]8[/latex] | [latex]0.064[/latex] |
May | [latex]100[/latex] | [latex]95[/latex] | [latex]5[/latex] | [latex]0.05[/latex] |
June | [latex]103[/latex] | [latex]111[/latex] | [latex]-8[/latex] | [latex]0.0776\ldots[/latex] |
July | [latex]118[/latex] | [latex]115[/latex] | [latex]3[/latex] | [latex]0.0254\ldots[/latex] |
August | [latex]101[/latex] | [latex]97[/latex] | [latex]4[/latex] | [latex]0.0396\ldots[/latex] |
September | [latex]70[/latex] | [latex]78[/latex] | [latex]-8[/latex] | [latex]0.1142\ldots[/latex] |
October | [latex]99[/latex] | [latex]90[/latex] | [latex]9[/latex] | [latex]0.0909\ldots[/latex] |
November | [latex]120[/latex] | [latex]125[/latex] | [latex]-5[/latex] | [latex]0.0416\ldots[/latex] |
December | [latex]97[/latex] | [latex]103[/latex] | [latex]-6[/latex] | [latex]0.0618\ldots[/latex] |
Sum | | | | [latex]0.7002\ldots[/latex] |
[latex]\begin{eqnarray*}MAPE&=&\frac{\sum|\text{percent error}|}{\text{number of forecast errors}}\times100\%\\&=&\frac{0.7002\ldots}{12}\times100\%\\&=&5.84\%\end{eqnarray*}[/latex]
- On average, the forecasted values differ by [latex]5.84\%[/latex] from the actual values.
NOTES
- To calculate the percent error, divide the forecast error by the actual value. In the above example, the percent error for January is [latex]\displaystyle{\frac{\text{forecast error}}{\text{actual value}}=\frac{1}{83}=0.0120\ldots}[/latex].
- To calculate the [latex]MAPE[/latex], add up the absolute value of the percent errors column, divide the sum by the number of errors and then multiply by [latex]100[/latex] to convert the result to a percent. In this example, there are [latex]12[/latex] errors, so we divide the column sum by [latex]12[/latex].
- When calculating the [latex]MAPE[/latex], keep all of the decimals throughout the calculation to prevent any round-off error.
- The mean absolute percent error is just the average (or mean) of the absolute values of the percent errors. After calculating the absolute value of the percent errors, calculate the mean of the absolute value of the percent errors. In the above example, just calculate the mean of the absolute value of the percent errors column to get the [latex]MAPE[/latex].
- The [latex]MAPE[/latex] tells us the average percent difference between the forecasted values and the actual values.
Assessing Forecasts Using the Measures of Forecast Accuracy
We can use the measures of forecast accuracy to assess how well a forecast models the known time series data. For all three measures, the smaller the value, the better the forecast is at modelling the data; conversely, the larger the value, the worse the forecast is at modelling the data. For example, if a forecast produces a large [latex]MAD[/latex], then the forecast is probably a poor fit for the data.
Each of the above measures of forecast accuracy measures, albeit in a different way, how well the forecast is able to predict the known values in the time series. But we want to use the forecast to predict the value of a future time period, where the actual value is unknown. How can we tell if this estimated value is accurate? In general, if a forecast works well on the known time series values and we expect the pattern present in the time series to continue, we expect the forecasted values for future time periods to be relatively accurate.
We can also use the measures of forecast accuracy to compare different forecasts for the same time series, which allows us to pick the best forecast for that time series. For example, we can compare the [latex]MAD[/latex]s for different forecasts and then pick the forecast with the best (i.e., smallest) [latex]MAD[/latex]. Similarly, we can compare the [latex]MSE[/latex]s or [latex]MAPE[/latex]s for different forecasts. When comparing measures of forecast accuracy for different forecasts, we have to compare the same measure for each forecast to get a meaningful comparison. We cannot compare the [latex]MAD[/latex] from one forecast with the [latex]MSE[/latex] from another forecast because the [latex]MAD[/latex] and [latex]MSE[/latex] analyze the errors in different ways.
EXAMPLE
A local company produces and sells a certain product. The number of units sold each month for a year is recorded in the table below. The company created two forecasts to predict the number of units sold each month. Which forecast should the company use?
Month | Actual Number of Units Sold | Forecast 1 | Forecast 2 |
---|---|---|---|
January | 83 | 82 | 75 |
February | 102 | 105 | 105 |
March | 75 | 82 | 83 |
April | 125 | 117 | 120 |
May | 100 | 95 | 107 |
June | 103 | 111 | 99 |
July | 118 | 115 | 113 |
August | 101 | 97 | 104 |
September | 70 | 78 | 75 |
October | 99 | 90 | 97 |
November | 120 | 125 | 123 |
December | 97 | 103 | 100 |
Solution
The [latex]MAD[/latex] for Forecast 1 is [latex]5.583[/latex]. The [latex]MAD[/latex] for Forecast 2 is [latex]4.667[/latex]. Because the [latex]MAD[/latex] for Forecast 2 is smaller, Forecast 2 appears to be more accurate. So the company should use Forecast 2.
NOTES
- We could also compare the [latex]MSE[/latex]s or [latex]MAPE[/latex]s for these forecasts. The [latex]MSE[/latex] for Forecast 1 is [latex]36.917[/latex] and the [latex]MSE[/latex] for Forecast 2 is [latex]25.667[/latex]. So, based on the [latex]MSE[/latex]s, the company should use Forecast 2. The [latex]MAPE[/latex] for Forecast 1 is [latex]5.84\%[/latex] and the [latex]MAPE[/latex] for Forecast 2 is [latex]5.01\%[/latex]. So, based on the [latex]MAPE[/latex]s, the company should use Forecast 2. (A short script reproducing these values is given after these notes.)
- In this example, the three measures of forecast accuracy all agree that Forecast 2 is more accurate. But it is possible for the measures to disagree: one measure might indicate that one forecast is more accurate, while another measure might indicate that a different forecast is more accurate. We do not need to compare all three measures of forecast accuracy. In general, pick one of the measures and use it to assess the different forecasts.
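The comparison above is straightforward to reproduce in software. The following minimal Python sketch computes all three measures for each forecast from the sales data in the table; the function and variable names are arbitrary choices for illustration.

```python
actual     = [83, 102, 75, 125, 100, 103, 118, 101, 70, 99, 120, 97]
forecast_1 = [82, 105, 82, 117, 95, 111, 115, 97, 78, 90, 125, 103]
forecast_2 = [75, 105, 83, 120, 107, 99, 113, 104, 75, 97, 123, 100]

def accuracy(actual, forecast):
    """Return (MAD, MSE, MAPE) for one forecast."""
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    mad = sum(abs(e) for e in errors) / n
    mse = sum(e ** 2 for e in errors) / n
    mape = sum(abs(e / a) for e, a in zip(errors, actual)) / n * 100
    return mad, mse, mape

for name, forecast in [("Forecast 1", forecast_1), ("Forecast 2", forecast_2)]:
    mad, mse, mape = accuracy(actual, forecast)
    print(f"{name}: MAD = {mad:.3f}, MSE = {mse:.3f}, MAPE = {mape:.2f}%")
# Forecast 1: MAD = 5.583, MSE = 36.917, MAPE = 5.84%
# Forecast 2: MAD = 4.667, MSE = 25.667, MAPE = 5.01%
```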
When assessing different forecasts for the same time series, comparing the measures of forecast accuracy is an important tool to assess how well the forecast fits the data. But the measures of forecast accuracy should not be the only factor we consider when assessing a forecast. We also need to rely on our own judgement and consider other factors, such as current business or economic conditions, that might affect the forecast.
TRY IT
Two different forecasts were created for the following time series.
Period | Actual Value | Forecast 1 | Forecast 2 |
---|---|---|---|
1 | 52 | 50 | |
2 | 48 | 55 | 47 |
3 | 51 | 58 | 44 |
4 | 51 | 46 | 54 |
5 | 60 | 55 | 55 |
6 | 59 | 50 | 53 |
7 | 40 | 47 | 49 |
8 | 45 | 48 | 52 |
- Calculate the [latex]MAD[/latex] for each forecast.
- Based on the [latex]MAD[/latex]s, which forecast is more accurate?
- Calculate the [latex]MSE[/latex] for each forecast.
- Based on the [latex]MSE[/latex]s, which forecast is more accurate?
- Calculate the [latex]MAPE[/latex] for each forecast.
- Based on the [latex]MAPE[/latex]s, which forecast is more accurate?
Click to see Solution
-
Period | Actual Value | Forecast 1 | Absolute Error (Forecast 1) | Forecast 2 | Absolute Error (Forecast 2) |
---|---|---|---|---|---|
1 | [latex]52[/latex] | [latex]50[/latex] | [latex]2[/latex] | | |
2 | [latex]48[/latex] | [latex]55[/latex] | [latex]7[/latex] | [latex]47[/latex] | [latex]1[/latex] |
3 | [latex]51[/latex] | [latex]58[/latex] | [latex]7[/latex] | [latex]44[/latex] | [latex]7[/latex] |
4 | [latex]51[/latex] | [latex]46[/latex] | [latex]5[/latex] | [latex]54[/latex] | [latex]3[/latex] |
5 | [latex]60[/latex] | [latex]55[/latex] | [latex]5[/latex] | [latex]55[/latex] | [latex]5[/latex] |
6 | [latex]59[/latex] | [latex]50[/latex] | [latex]9[/latex] | [latex]53[/latex] | [latex]6[/latex] |
7 | [latex]40[/latex] | [latex]47[/latex] | [latex]7[/latex] | [latex]49[/latex] | [latex]9[/latex] |
8 | [latex]45[/latex] | [latex]48[/latex] | [latex]3[/latex] | [latex]52[/latex] | [latex]7[/latex] |
Sum | | | [latex]45[/latex] | | [latex]38[/latex] |
[latex]\begin{eqnarray*}MAD \text{ for Forecast 1}&=&\frac{\sum|\text{forecast error}|}{\text{number of forecast errors}}\\&=&\frac{45}{8}\\&=&5.625\\\\MAD \text{ for Forecast 2}&=&\frac{\sum|\text{forecast error}|}{\text{number of forecast errors}}\\&=&\frac{38}{7}\\&=&5.429\end{eqnarray*}[/latex]
- Forecast 2 because it has the smaller [latex]MAD[/latex].
-
Period | Actual Value | Forecast 1 | (Error)² (Forecast 1) | Forecast 2 | (Error)² (Forecast 2) |
---|---|---|---|---|---|
1 | [latex]52[/latex] | [latex]50[/latex] | [latex]4[/latex] | | |
2 | [latex]48[/latex] | [latex]55[/latex] | [latex]49[/latex] | [latex]47[/latex] | [latex]1[/latex] |
3 | [latex]51[/latex] | [latex]58[/latex] | [latex]49[/latex] | [latex]44[/latex] | [latex]49[/latex] |
4 | [latex]51[/latex] | [latex]46[/latex] | [latex]25[/latex] | [latex]54[/latex] | [latex]9[/latex] |
5 | [latex]60[/latex] | [latex]55[/latex] | [latex]25[/latex] | [latex]55[/latex] | [latex]25[/latex] |
6 | [latex]59[/latex] | [latex]50[/latex] | [latex]81[/latex] | [latex]53[/latex] | [latex]36[/latex] |
7 | [latex]40[/latex] | [latex]47[/latex] | [latex]49[/latex] | [latex]49[/latex] | [latex]81[/latex] |
8 | [latex]45[/latex] | [latex]48[/latex] | [latex]9[/latex] | [latex]52[/latex] | [latex]49[/latex] |
Sum | | | [latex]291[/latex] | | [latex]250[/latex] |
[latex]\begin{eqnarray*}MSE \text{ for Forecast 1}&=&\frac{\sum(\text{forecast error})^2}{\text{number of forecast errors}}\\&=&\frac{291}{8}\\&=&36.375\\\\MSE \text{ for Forecast 2}&=&\frac{\sum(\text{forecast error})^2}{\text{number of forecast errors}}\\&=&\frac{250}{7}\\&=&35.714\end{eqnarray*}[/latex]
- Forecast 2 because it has the smaller [latex]MSE[/latex].
-
Period | Actual Value | Forecast 1 | Absolute Percent Error (Forecast 1) | Forecast 2 | Absolute Percent Error (Forecast 2) |
---|---|---|---|---|---|
1 | [latex]52[/latex] | [latex]50[/latex] | [latex]0.0384\ldots[/latex] | | |
2 | [latex]48[/latex] | [latex]55[/latex] | [latex]0.1458\ldots[/latex] | [latex]47[/latex] | [latex]0.0208\ldots[/latex] |
3 | [latex]51[/latex] | [latex]58[/latex] | [latex]0.1372\ldots[/latex] | [latex]44[/latex] | [latex]0.1372\ldots[/latex] |
4 | [latex]51[/latex] | [latex]46[/latex] | [latex]0.0980\ldots[/latex] | [latex]54[/latex] | [latex]0.0588\ldots[/latex] |
5 | [latex]60[/latex] | [latex]55[/latex] | [latex]0.0833\ldots[/latex] | [latex]55[/latex] | [latex]0.0833\ldots[/latex] |
6 | [latex]59[/latex] | [latex]50[/latex] | [latex]0.1525\ldots[/latex] | [latex]53[/latex] | [latex]0.1016\ldots[/latex] |
7 | [latex]40[/latex] | [latex]47[/latex] | [latex]0.175[/latex] | [latex]49[/latex] | [latex]0.225[/latex] |
8 | [latex]45[/latex] | [latex]48[/latex] | [latex]0.0666\ldots[/latex] | [latex]52[/latex] | [latex]0.1555\ldots[/latex] |
Sum | | | [latex]0.8971\ldots[/latex] | | [latex]0.7824\ldots[/latex] |
[latex]\begin{eqnarray*}MAPE \text{ for Forecast 1}&=&\frac{\sum|\text{percent error}|}{\text{number of forecast errors}}\times100\%\\&=&\frac{0.8971\ldots}{8}\times100\%\\&=&11.21\%\\\\MAPE\text{ for Forecast 2}&=&\frac{\sum|\text{percent error}|}{\text{number of forecast errors}}\times100\%\\&=&\frac{0.7824\ldots}{7}\times100\%\\&=&11.18\%\end{eqnarray*}[/latex]
- Forecast 2 because it has the smaller [latex]MAPE[/latex].
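As a quick check of the [latex]MAD[/latex] values in part 1, the same calculation can be carried out in Python. Note how period 1, which has no value for Forecast 2, contributes no error, so the sum for Forecast 2 is divided by [latex]7[/latex] rather than [latex]8[/latex]. The variable and function names below are arbitrary.

```python
actual     = [52, 48, 51, 51, 60, 59, 40, 45]
forecast_1 = [50, 55, 58, 46, 55, 50, 47, 48]
forecast_2 = [None, 47, 44, 54, 55, 53, 49, 52]  # no forecast for period 1

def mad(actual, forecast):
    # Only periods with both an actual value and a forecast value produce an error.
    errors = [abs(a - f) for a, f in zip(actual, forecast) if f is not None]
    return sum(errors) / len(errors)

print(round(mad(actual, forecast_1), 3))  # 5.625  (45 / 8)
print(round(mad(actual, forecast_2), 3))  # 5.429  (38 / 7)
```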
Exercises
- The number of pizzas sold each week at a local pizza restaurant, along with two different forecasts, is given in the table below.
Week | Number of Pizzas Sold | Forecast 1 | Forecast 2 |
---|---|---|---|
1 | 120 | 123 | |
2 | 135 | 126 | |
3 | 140 | 129 | 128 |
4 | 115 | 132 | 138 |
5 | 130 | 135 | 128 |
6 | 145 | 139 | 123 |
7 | 160 | 142 | 138 |
8 | 170 | 145 | 153 |
9 | 125 | 148 | 165 |
10 | 130 | 151 | 148 |
11 | 150 | 155 | 128 |
12 | 140 | 158 | 140 |
13 | 155 | 161 | 144 |
14 | 180 | 164 | 148 |
15 | 165 | 167 | 168 |
16 | 170 | 171 | 173 |
17 | 190 | 174 | 168 |
- Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 1.
- Interpret the [latex]MAD[/latex] for Forecast 1.
- Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 2.
- Interpret the [latex]MAPE[/latex] for Forecast 2.
- Based on the [latex]MAD[/latex], which forecast is more accurate? Why?
Click to see Answer
- [latex]MAD=11.88[/latex], [latex]MSE=198.94[/latex], [latex]MAPE=8.20\%[/latex]
- On average, the forecasted values differ by [latex]11.88[/latex] pizzas sold from the actual values.
- [latex]MAD=16.53[/latex], [latex]MSE=397.87[/latex], [latex]MAPE=11.26\%[/latex]
- On average, the forecasted values differ by [latex]11.26\%[/latex] from the actual values.
- Forecast 1 because it has the smaller [latex]MAD[/latex].
- The number of coffees sold each day at a local coffee shop, along with two different forecasts, is given in the table below.
Day | Number of Coffees Sold | Forecast 1 | Forecast 2 |
---|---|---|---|
1 | 120 | | |
2 | 100 | | |
3 | 115 | | |
4 | 130 | | 111 |
5 | 110 | 116 | 123 |
6 | 150 | 114 | 117 |
7 | 165 | 126 | 136 |
8 | 110 | 139 | 155 |
9 | 120 | 134 | 131 |
10 | 125 | 136 | 122 |
11 | 135 | 130 | 122 |
12 | 130 | 123 | 131 |
13 | 152 | 128 | 131 |
14 | 160 | 136 | 144 |
15 | 110 | 144 | 155 |
16 | 120 | 138 | 129 |
17 | 125 | 136 | 121 |
18 | 127 | 129 | 122 |
19 | 130 | 121 | 126 |
20 | 155 | 126 | 129 |
21 | 162 | 134 | 145 |
22 | 120 | 144 | 157 |
23 | 125 | 142 | 136 |
24 | 130 | 141 | 127 |
25 | 135 | 134 | 128 |
26 | 137 | 128 | 133 |
27 | 160 | 132 | 136 |
28 | 155 | 141 | 151 |
29 | 130 | 147 | 155 |
30 | 140 | 146 | 141 |
31 | 145 | 146 | 139 |
32 | 139 | 143 | 142 |
33 | 142 | 139 | 141 |
34 | 165 | 142 | 141 |
35 | 163 | 148 | 156 |
36 | 130 | 152 | 162 |
37 | 120 | 150 | 143 |
38 | 125 | 145 | 127 |
39 | 135 | 135 | 124 |
40 | 145 | 128 | 131 |
41 | 160 | 131 | 140 |
42 | 170 | 141 | 153 |
43 | 130 | 153 | 165 |
44 | 137 | 151 | 145 |
45 | 130 | 149 | 138 |
46 | 135 | 142 | 132 |
47 | 132 | 133 | 134 |
48 | 163 | 134 | 133 |
49 | 168 | 140 | 151 |
50 | 125 | 150 | 163 |
51 | 130 | 147 | 142 |
52 | 120 | 147 | 132 |
53 | 115 | 136 | 124 |
54 | 125 | 123 | 118 |
55 | 156 | 123 | 122 |
56 | 161 | 129 | 143 |
- Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 1.
Day Number of Coffees Sold Forecast 1 Forecast 2 1 120 2 100 3 115 4 130 111 5 110 116 123 6 150 114 117 7 165 126 136 8 110 139 155 9 120 134 131 10 125 136 122 11 135 130 122 12 130 123 131 13 152 128 131 14 160 136 144 15 110 144 155 16 120 138 129 17 125 136 121 18 127 129 122 19 130 121 126 20 155 126 129 21 162 134 145 22 120 144 157 23 125 142 136 24 130 141 127 25 135 134 128 26 137 128 133 27 160 132 136 28 155 141 151 29 130 147 155 30 140 146 141 31 145 146 139 32 139 143 142 33 142 139 141 34 165 142 141 35 163 148 156 36 130 152 162 37 120 150 143 38 125 145 127 39 135 135 124 40 145 128 131 41 160 131 140 42 170 141 153 43 130 153 165 44 137 151 145 45 130 149 138 46 135 142 132 47 132 133 134 48 163 134 133 49 168 140 151 50 125 150 163 51 130 147 142 52 120 147 132 53 115 136 124 54 125 123 118 55 156 123 122 56 161 129 143 - Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 1.
- Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 2.
- Based on the [latex]MSE[/latex], which forecast is more accurate? Why?
Click to see Answer
- [latex]MAD=17.77[/latex], [latex]MSE=431.12[/latex], [latex]MAPE=12.79\%[/latex]
- [latex]MAD=15.53[/latex], [latex]MSE=382.23[/latex], [latex]MAPE=11.41\%[/latex]
- Forecast 2 because it has the smaller [latex]MSE[/latex].
- The quarterly demand, in [latex]1000[/latex]s, for a particular product, along with two different forecasts, is recorded in the table below.
Quarter | Demand (in [latex]1000[/latex]s) | Forecast 1 | Forecast 2 |
---|---|---|---|
1 | 50 | 32 | |
2 | 75 | 80 | |
3 | 85 | 88 | |
4 | 70 | 72 | 70 |
5 | 60 | 51 | 77 |
6 | 95 | 99 | 72 |
7 | 100 | 107 | 75 |
8 | 85 | 91 | 85 |
9 | 70 | 70 | 93 |
10 | 115 | 118 | 85 |
11 | 125 | 126 | 90 |
12 | 110 | 110 | 103 |
13 | 80 | 89 | 117 |
14 | 140 | 137 | 105 |
15 | 145 | 145 | 110 |
16 | 130 | 129 | 122 |
17 | 90 | 108 | 138 |
18 | 165 | 156 | 122 |
19 | 175 | 164 | 128 |
20 | 155 | 148 | 143 |
- Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 1.
Quarter Demand (in [latex]1000[/latex]s) Forecast 1 Forecast 2 1 50 32 2 75 80 3 85 88 4 70 72 70 5 60 51 77 6 95 99 72 7 100 107 75 8 85 91 85 9 70 70 93 10 115 118 85 11 125 126 90 12 110 110 103 13 80 89 117 14 140 137 105 15 145 145 110 16 130 129 122 17 90 108 138 18 165 156 122 19 175 164 128 20 155 148 143 - Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 1.
- Interpret the [latex]MAPE[/latex] for Forecast 1.
- Calculate the [latex]MAD[/latex], [latex]MSE[/latex], and [latex]MAPE[/latex] for Forecast 2.
- Interpret the [latex]MAD[/latex] for Forecast 2.
- Based on the [latex]MAPE[/latex], which forecast is more accurate? Why?
Click to see Answer
- [latex]MAD=5.8[/latex], [latex]MSE=61[/latex], [latex]MAPE=6.81\%[/latex]
- On average, the forecasted values differ by [latex]6.81\%[/latex] from the actual values.
- [latex]MAD=25[/latex], [latex]MSE=855[/latex], [latex]MAPE=22.73\%[/latex]
- On average, the forecasted values differ by [latex]25,000[/latex] units from the actual values.
- Forecast 1 because it has the smaller [latex]MAPE[/latex].