2.6 Time Series

Question 1

For a certain time series, you have produced a correlogram with an autocorrelation function that includes twenty-four monthly observations; $m = \text{degrees of freedom} = 24$. Your calculated Box-Pierce Q-statistic is $19.50$ and your calculated Ljung-Box Q-statistic is $27.90$. You want to determine if the series is white noise. Which is your best conclusion (given $\text{CHISQ.INV}(0.95, 24) = 36.41$)?

A. With $95.0\%$ confidence, you accept the series as white noise (more accurately, you fail to reject the null)
B. With $95.0\%$ confidence, you accept the series as partial white noise (due to Box-Pierce) but reject the null (due to Ljung-Box)
C. With $95.0\%$ confidence, you reject both null hypotheses and conclude the series is not white noise
D. With $95.0\%$ confidence, you reject both null hypotheses but conclude the series is white noise because the sum of the statistics is greater than the critical value

Answer: A
Testing autocorrelation in the residuals is a standard specification check applied after fitting a model.
Null hypothesis: all autocorrelations are jointly zero (i.e., $H_0 : \rho_1 = \rho_2 = \cdots = \rho_h = 0$).
Alternative hypothesis: at least one is non-zero (i.e., $H_1 : \rho_j \neq 0$ for some $j$).

The Box-Pierce test statistic is scaled by the sample size $T$: $Q_{BP} = T \sum_{i=1}^{h} \hat{\rho}_i^2 \sim \chi_h^2$
The Ljung-Box test statistic works better in smaller samples: $Q_{LB} = T \sum_{i=1}^{h} \left( \frac{T+2}{T-i} \right) \hat{\rho}_i^2 \sim \chi_h^2$

Values of the test statistic larger than the critical value indicate that the autocorrelations are not zero. Here both statistics, $19.50$ and $27.90$, are below the critical value of $36.41$, so we fail to reject the null hypothesis that the series is white noise.
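As a quick numerical check, the decision rule can be reproduced in a few lines of Python. This is a minimal sketch using scipy.stats.chi2, whose ppf method is the analogue of Excel's CHISQ.INV; the inputs are the question's values:

```python
# Minimal sketch of the Question 1 decision rule, using the question's values.
from scipy.stats import chi2

h = 24                 # number of autocorrelation lags (degrees of freedom)
q_box_pierce = 19.50   # calculated Box-Pierce Q-statistic
q_ljung_box = 27.90    # calculated Ljung-Box Q-statistic

# Critical value at 95% confidence; equivalent to Excel's CHISQ.INV(0.95, 24)
critical_value = chi2.ppf(0.95, df=h)
print(f"critical value = {critical_value:.2f}")  # ~36.42

for name, q in [("Box-Pierce", q_box_pierce), ("Ljung-Box", q_ljung_box)]:
    decision = "reject H0" if q > critical_value else "fail to reject H0"
    print(f"{name}: Q = {q:.2f} -> {decision}")
```

Both comparisons print "fail to reject H0", matching answer A.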


Question 2

All of the following traits characterize a covariance-stationary time series process, except:

A. Stability of the mean.
B. Stability of the covariance structure.
C. A non-constant variance in the time series.
D. Stability of the autocorrelation.

Answer: C
A time series is covariance stationary if its first two moments satisfy three key properties:

  • The mean is constant and does not change over time (i.e., $E[Y_t] = \mu$ for all $t$).
  • The variance is finite and does not change over time (i.e., $V[Y_t] = \gamma_0 < \infty$).
  • The autocovariance is finite, does not change over time, and depends only on the distance between observations, $h$ (i.e., $Cov[Y_t, Y_{t-h}] = \gamma_h$ for all $t$).
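These three properties can be checked informally by simulation. Below is a minimal sketch, assuming numpy and illustrative parameter values, that simulates a stationary $\text{AR}(1)$ and compares the sample mean and variance across the two halves of the series:

```python
# Minimal sketch: simulate a covariance-stationary AR(1) (illustrative
# parameters) and verify that its sample mean and variance are stable
# across subsamples.
import numpy as np

rng = np.random.default_rng(42)
T, delta, phi, sigma = 10_000, 1.0, 0.5, 1.0

y = np.zeros(T)
for t in range(1, T):
    y[t] = delta + phi * y[t - 1] + rng.normal(0.0, sigma)

first, second = y[: T // 2], y[T // 2:]
print(f"mean:     {first.mean():.3f} vs {second.mean():.3f}")  # both near delta/(1-phi) = 2.0
print(f"variance: {first.var():.3f} vs {second.var():.3f}")    # both near sigma^2/(1-phi^2) ~ 1.33
```

The two halves report similar moments; a trending or explosive series would not.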

Question 3

The following are statements about a moving average ($\text{MA}$) representation and an autoregressive ($\text{AR}$) process. Which one describes the main difference between an $\text{MA}$ representation and an $\text{AR}$ process?

A. A moving average ($\text{MA}$) representation shows evidence of an autocorrelation cutoff.
B. The autoregressive ($\text{AR}$) process will never be covariance stationary.
C. The autoregressive ($\text{AR}$) process shows evidence of an autocorrelation cutoff.
D. An unadjusted moving average ($\text{MA}$) process shows clear evidence of a gradual autocorrelation decay.

Answer: A
Autocorrelation function (ACF): $\frac{\gamma_h}{\gamma_0}$; it measures the degree of correlation and interdependence between data points in a time series.

An autoregressive ($\text{AR}$) model is one in which the current value of a series is linearly related to its past values, plus an additive stochastic shock.
A first-order $\text{AR}$ is denoted $\text{AR}(1)$: $Y_t = \delta + \phi Y_{t-1} + \epsilon_t$, where $\epsilon_t \sim WN(0, \sigma^2)$.

  • $\text{AR}(1)$ is covariance stationary when $|\phi| < 1$.
  • Expectation: $E[Y_t] = E[Y_{t-h}] = \mu = \frac{\delta}{1-\phi}$
  • Variance: $V[Y_t] = V[Y_{t-h}] = \gamma_0 = \frac{\sigma^2}{1-\phi^2}$
  • Autocovariance: $Cov[Y_t, Y_{t-h}] = \gamma(h) = \phi^{|h|}\gamma_0$
  • Autocorrelation (ACF): $\rho(h) = \phi^{|h|}$; the ACF geometrically decays to zero as $h$ increases, and it oscillates between negative and positive values if $-1 < \phi < 0$.
  • Partial autocorrelation (PACF): $\alpha(h) = \phi^{|h|}$ for $h \in \{0, \pm 1\}$ and $\alpha(h) = 0$ for $h \geq 2$; the PACF is non-zero only for the first lag.
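These $\text{AR}(1)$ patterns (geometric ACF decay, PACF cutoff after lag 1) are easy to see by simulation. A minimal sketch, assuming statsmodels and an illustrative $\phi = 0.8$:

```python
# Minimal sketch: an AR(1) with phi = 0.8 (illustrative) has an ACF that
# decays geometrically and a PACF that cuts off after lag 1.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

phi = 0.8
ar1 = ArmaProcess(ar=[1, -phi], ma=[1])  # lag-polynomial convention: (1 - phi*L)
y = ar1.generate_sample(nsample=50_000)

print("ACF :", np.round(acf(y, nlags=4)[1:], 3))   # ~ [0.8, 0.64, 0.512, 0.410]
print("PACF:", np.round(pacf(y, nlags=4)[1:], 3))  # ~ [0.8, 0, 0, 0]
```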

The moving average ($\text{MA}$) model is one in which the observed value $Y_t$ depends on both the contemporaneous shock $\epsilon_t$ and the previous shock.
A first-order $\text{MA}$ is denoted $\text{MA}(1)$: $Y_t = \mu + \theta \epsilon_{t-1} + \epsilon_t$, where $\epsilon_t \sim WN(0, \sigma^2)$.

  • An $\text{MA}$ process is always covariance stationary.
  • Expectation: $E[Y_t] = E[Y_{t-h}] = \mu$
  • Variance: $V[Y_t] = V[Y_{t-h}] = \gamma_0 = (1+\theta^2)\sigma^2$
  • Autocovariance: $Cov[Y_t, Y_{t-h}] = \gamma(h) = 0$ for $h > 1$
  • Autocorrelation: $\rho(h) = 1$ when $h = 0$; $\rho(h) = \frac{\theta}{1+\theta^2}$ when $h = 1$; $\rho(h) = 0$ when $h \geq 2$
  • Partial autocorrelation: the PACF has non-zero values at all lags and decays towards zero.
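The mirror-image pattern for $\text{MA}(1)$ (ACF cutoff after lag 1, PACF decay) can be verified the same way, here with an illustrative $\theta = 0.6$ (again a sketch assuming statsmodels):

```python
# Minimal sketch: an MA(1) with theta = 0.6 (illustrative) has an ACF that
# cuts off after lag 1 at theta/(1 + theta^2), while its PACF decays.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

theta = 0.6
ma1 = ArmaProcess(ar=[1], ma=[1, theta])
y = ma1.generate_sample(nsample=50_000)

print("analytic rho(1) =", round(theta / (1 + theta**2), 3))  # 0.441
print("ACF :", np.round(acf(y, nlags=4)[1:], 3))   # ~ [0.44, 0, 0, 0]
print("PACF:", np.round(pacf(y, nlags=4)[1:], 3))  # decays toward zero, alternating sign
```

The $\text{MA}(1)$ ACF cutoff is exactly the behavior answer A describes.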

Question 4

PE2018Q21 / PE2019Q21 / PE2020Q21 / PE2021Q21 / PE2022PSQ14 / PE2022Q21
A risk manager at a major global bank is conducting a time series analysis of equity returns. The manager wants to know whether the time series is covariance stationary. Which of the following statements describes one of the requirements for a time series to be covariance stationary?

A. The distribution of a time series should have a kurtosis value near $3$, ensuring no fat tails will distort stationarity.
B. The distribution of a time series should have a skewness value near $0$, so that its mean will fall in the center of the distribution.
C. The autocovariance of a covariance stationary time series depends only on the displacement term, $\tau$, not on time.
D. When the autocovariance function is asymmetric with respect to displacement, $\tau$, forward-looking stationarity can be achieved.

Answer: C
Learning Objective: Describe the requirements for a series to be covariance stationary.

C is correct. One requirement for a series to be covariance stationary is that its covariance structure be stable over time. If the covariance structure is stable, then the autocovariances depend only on displacement, $\tau$, not on time, $t$. Also, covariance stationarity does not place restrictions on other aspects of the distribution of the series, such as kurtosis and skewness.

A and B are incorrect. Covariance stationarity does not place restrictions on other aspects of the distribution of the series, such as kurtosis and skewness.

D is incorrect. Covariance stationarity does not depend on the symmetry of the autocovariance function.
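To make requirement C concrete, here is a minimal sketch, assuming numpy and illustrative parameters: for a stationary $\text{AR}(1)$, the sample autocovariance at a fixed displacement $\tau$ is roughly the same whether it is estimated from the first or the second half of the sample, i.e., it depends on $\tau$ but not on $t$:

```python
# Minimal sketch: the sample autocovariance at fixed displacement tau is
# stable across time windows for a stationary AR(1) (illustrative phi).
import numpy as np

rng = np.random.default_rng(3)
T, phi = 50_000, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.normal()

def autocov(x, tau):
    """Sample autocovariance of x at displacement tau."""
    x = x - x.mean()
    return np.mean(x[:-tau] * x[tau:])

tau = 2
print([round(autocov(half, tau), 3) for half in (y[: T // 2], y[T // 2:])])
# two similar values, both near the analytic gamma(2) = phi^2 / (1 - phi^2) ~ 0.961
```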


Question 5

PE2020Q70 / PE2021Q70 / PE2022Q70
A market risk manager would like to analyze and forecast a security's performance and has obtained the historical time series for that security. The manager consults a colleague from the quantitative analytics team, who provides the following Partial Autocorrelation Function ($\text{PACF}$) plot:

[Figure: PACF plot for the security's time series]

Based on the plot above, which of the following is the best regression approach for the security?

A. $\text{AR}(1)$
B. $\text{MA}(1)$
C. $\text{AR}(2)$
D. $\text{MA}(2)$

Answer: C
Learning Objective: Define and describe the properties of autoregressive ($\text{AR}$) processes.

The $\text{PACF}$ cuts off after the second lag. This behavior indicates an $\text{AR}(2)$ process.

| Model | ACF | PACF |
| --- | --- | --- |
| $\text{MA}(1)$ | Cutoff | Decay |
| $\text{AR}(1)$ | Decay | Cutoff |
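The same identification logic can be demonstrated in code. A minimal sketch, assuming statsmodels and illustrative $\text{AR}(2)$ coefficients, reproduces the pattern in the question's plot: significant PACF values at lags 1 and 2, then a cutoff:

```python
# Minimal sketch: the PACF of a simulated AR(2) (illustrative, stationary
# coefficients) is significant at lags 1-2 and near zero afterwards.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import pacf

phi1, phi2 = 0.5, 0.3
ar2 = ArmaProcess(ar=[1, -phi1, -phi2], ma=[1])
y = ar2.generate_sample(nsample=50_000)

print(np.round(pacf(y, nlags=5)[1:], 3))  # ~ [0.71, 0.30, 0, 0, 0]
```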

Question 6

In regard to white noise, each of the following statements is true except one. Which statement is false?

A. If a process is zero-mean white noise, then it must be Gaussian white noise.
B. If a process is Gaussian (aka, normal) white noise, then it must be (zero-mean) white noise.
C. If a process is Gaussian (aka, normal) white noise, then it must be independent white noise.
D. If a process is stationary, has zero mean, has constant variance, and is serially uncorrelated, then the process is white noise.

Answer: A
White noise is denoted $\epsilon_t \sim \text{WN}(0, \sigma^2)$ and satisfies:

  • Mean zero (i.e., $E[\epsilon_t] = 0$)
  • Constant and finite variance (i.e., $V[\epsilon_t] = \sigma^2 < \infty$)
  • No autocorrelation or autocovariance (i.e., $\text{Cov}[\epsilon_t, \epsilon_{t-h}] = 0$ for all $h \neq 0$)

Even though a white noise process is serially uncorrelated, it need not be serially independent or normally distributed.
If $\epsilon_t$ is serially independent, then we say $\epsilon_t$ is independent white noise.
If $\epsilon_t$ is serially uncorrelated and normally distributed, then we say $\epsilon_t$ is normal or Gaussian white noise.
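This is why A is the false statement. A classic counterexample is an ARCH-type process: it is zero-mean white noise, yet it is neither Gaussian nor independent, because its squared values are autocorrelated. A minimal sketch, assuming numpy and statsmodels with illustrative ARCH(1) parameters:

```python
# Minimal sketch: an ARCH(1) process (illustrative parameters) is zero-mean
# white noise (serially uncorrelated) but not independent, since its
# conditional variance depends on the previous shock.
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(11)
T, omega, alpha = 100_000, 1.0, 0.5
eps = np.zeros(T)
for t in range(1, T):
    sigma_t = np.sqrt(omega + alpha * eps[t - 1] ** 2)  # conditional volatility
    eps[t] = sigma_t * rng.normal()

print("ACF of eps   :", np.round(acf(eps, nlags=3)[1:], 3))     # ~ 0: white noise
print("ACF of eps^2 :", np.round(acf(eps**2, nlags=3)[1:], 3))  # clearly non-zero: dependent
```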

