ces() - Complex Exponential Smoothing

This vignette covers the ces() and auto.ces() functions, which are part of the smooth package.

Let’s load the necessary packages:

require(smooth)

The ces() function allows constructing Complex Exponential Smoothing either with no seasonality or with simple, partial, or full seasonality. A simple call to ces() results in the estimation of the non-seasonal model.
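
For seasonal data the type of seasonality can be requested explicitly. A minimal sketch, assuming that the seasonality argument of ces() accepts the options "none", "simple", "partial" and "full" (see ?ces in your version of smooth):

# Illustrative call on a monthly seasonal series from base R
ces(AirPassengers, seasonality="full", h=12, holdout=TRUE)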

For example, for the BJsales series the model can be constructed using:

ces(BJsales, h=12, holdout=TRUE, silent=FALSE)
## Time elapsed: 0.04 seconds
## Model estimated: CES(n)
## a0 + ia1: 1.9981+1.0034i
## Initial values were produced using backcasting.
## 
## Loss function type: likelihood; Loss function value: 249.4613
## Error standard deviation: 1.4914
## Sample size: 138
## Number of estimated parameters: 3
## Number of degrees of freedom: 135
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 504.9227 505.1018 513.7045 514.1457 
## 
## Forecast errors:
## MPE: 0%; sCE: 0.7%; Asymmetry: -5%; MAPE: 0.4%
## MASE: 0.857; sMAE: 0.4%; sMSE: 0%; rMAE: 0.329; rRMSE: 0.338

This output is very similar to the one printed by the es() function. The only difference is that the complex smoothing parameter values are printed instead of the persistence vector of es().
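
As with es(), the model can be saved in an object and used for producing forecasts later on. A minimal sketch, assuming the forecast() and plot() methods provided for smooth models (the object name is arbitrary):

ourModel <- ces(BJsales, h=12, holdout=TRUE)
# Produce and plot forecasts from the estimated CES model
plot(forecast(ourModel, h=12))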

If we want automatic model selection, then we use the auto.ces() function:

auto.ces(BJsales, h=12, holdout=TRUE, interval="p", silent=FALSE)
## Time elapsed: 0.05 seconds
## Model estimated: CES(n)
## a0 + ia1: 1.9981+1.0034i
## Initial values were produced using backcasting.
## 
## Loss function type: likelihood; Loss function value: 249.4613
## Error standard deviation: 1.4914
## Sample size: 138
## Number of estimated parameters: 3
## Number of degrees of freedom: 135
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 504.9227 505.1018 513.7045 514.1457 
## 
## 95% parametric prediction interval was constructed
## 100% of values are in the prediction interval
## Forecast errors:
## MPE: 0%; sCE: 0.7%; Asymmetry: -5%; MAPE: 0.4%
## MASE: 0.857; sMAE: 0.4%; sMSE: 0%; rMAE: 0.329; rRMSE: 0.338

In this example the prediction interval covers all of the holdout values. Keep in mind, however, that CES is a pure additive model, so it cannot take possible heteroscedasticity into account, and its intervals may turn out to be too narrow on some series.
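
If the parametric interval does not look trustworthy for a particular series, other interval types can be tried instead. A minimal sketch, assuming that the interval argument accepts the same options as in es(), e.g. "np" for the nonparametric one:

# Nonparametric prediction interval as an alternative
auto.ces(BJsales, h=12, holdout=TRUE, interval="np")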

If for some reason we want to optimise the initial values, then we call:

auto.ces(BJsales, h=12, holdout=TRUE, initial="o", interval="sp")
## Time elapsed: 0.09 seconds
## Model estimated: CES(n)
## a0 + ia1: 2.007+1.0035i
## Initial values were optimised.
## 
## Loss function type: likelihood; Loss function value: 249.3039
## Error standard deviation: 1.5009
## Sample size: 138
## Number of estimated parameters: 5
## Number of degrees of freedom: 133
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 508.6077 509.0622 523.2440 524.3638 
## 
## 95% semiparametric prediction interval was constructed
## 100% of values are in the prediction interval
## Forecast errors:
## MPE: 0%; sCE: 0.7%; Asymmetry: -5.7%; MAPE: 0.4%
## MASE: 0.857; sMAE: 0.4%; sMSE: 0%; rMAE: 0.329; rRMSE: 0.337
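
Judging by the information criteria above (AICc of 505.10 with backcasting against 509.06 with optimisation), backcasting works better for this series. The two options can also be compared directly. A minimal sketch, assuming that the AICc() method from the greybox package (loaded together with smooth) applies to these models:

cesBackcast <- auto.ces(BJsales, h=12, holdout=TRUE)
cesOptimal <- auto.ces(BJsales, h=12, holdout=TRUE, initial="o")
# Compare information criteria of the two initialisation strategies
c(backcasting=AICc(cesBackcast), optimal=AICc(cesOptimal))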

Now let’s introduce some artificial exogenous variables:

x <- cbind(rnorm(length(BJsales), 50, 3), rnorm(length(BJsales), 100, 7))

The ces() function allows using exogenous variables and constructing different types of prediction intervals in exactly the same manner as es():

auto.ces(BJsales, h=12, holdout=TRUE, xreg=x, regressors="select", interval="p")
## Time elapsed: 0.15 seconds
## Model estimated: CES(n)
## a0 + ia1: 1.9981+1.0034i
## Initial values were produced using backcasting.
## 
## Loss function type: likelihood; Loss function value: 249.4613
## Error standard deviation: 1.4914
## Sample size: 138
## Number of estimated parameters: 3
## Number of degrees of freedom: 135
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 504.9227 505.1018 513.7045 514.1457 
## 
## 95% parametric prediction interval was constructed
## 100% of values are in the prediction interval
## Forecast errors:
## MPE: 0%; sCE: 0.7%; Asymmetry: -5.3%; MAPE: 0.4%
## MASE: 0.857; sMAE: 0.4%; sMSE: 0%; rMAE: 0.329; rRMSE: 0.338
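
In this case the variable selection mechanism drops both artificial regressors, and the model reduces to CES(n) without explanatory variables. If we want to keep all the provided variables, the selection can be switched off. A minimal sketch, assuming that regressors="use" forces their inclusion, as it does in es():

# Force the model to include both artificial variables
auto.ces(BJsales, h=12, holdout=TRUE, xreg=x, regressors="use", interval="p")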