
Testing the independence of violations, Section 8.3.2 p. 155-6

December 27, 2017

My FM320 students Hongshen Chen, Yida Li and Yanfei Zhou pointed out that the discussion in Section 8.3.2 could be clearer, so I repeat the relevant parts of the section here with additional clarification.

We need to calculate the probability of two consecutive violations, \(p_{11}\), as well as the probability of a violation if there was no violation on the previous day, i.e. \(p_{01}\). More generally, where \(i\) and \(j\) are either 0 or 1: \begin{equation*} p_{ij}=\Pr \left( \eta_{t}=j|\eta_{t-1}=i\right). \end{equation*} The violation process can be represented as a Markov chain with two states, so the first-order transition probability matrix is defined as: \begin{equation*} \Pi_1=\left( \begin{array}{cc} 1-p_{01} & p_{01} \\ 1-p_{11} & p_{11} \end{array} \right) . \end{equation*} The likelihood function is: \begin{equation} L_1(\Pi_1) =\left( 1-p_{01}\right) ^{v_{00}}p_{01}^{v_{01}}\left( 1-p_{11}\right) ^{v_{10}}p_{11}^{v_{11}} \tag{8.5}\label{eq:risk2:lik:bt:int} \end{equation} where \(v_{ij}\) is the number of observations where \(j\) follows \(i\).
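To make the counting concrete, the following is a minimal sketch in Python (not code from the book) that tabulates the counts \(v_{ij}\) from a 0/1 violation series; the function name transition_counts and the argument eta are illustrative assumptions.

```python
import numpy as np

def transition_counts(eta):
    """Count v[i, j] = number of times a day in state j follows a day in state i."""
    eta = np.asarray(eta, dtype=int)   # 0/1 violation indicator series
    v = np.zeros((2, 2), dtype=int)
    for i in range(2):
        for j in range(2):
            v[i, j] = np.sum((eta[:-1] == i) & (eta[1:] == j))
    return v
```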

The maximum likelihood (ML) estimates are obtained by maximizing the likelihood function, which is simple here since the parameters are just the ratios of the counts of the outcomes: \begin{gather*} \hat{\Pi}_{1}= \begin{pmatrix} \frac{v_{00}}{v_{00}+v_{01}} & \frac{v_{01}}{v_{00}+v_{01}} \\ \frac{v_{10}}{v_{10}+v_{11}} & \frac{v_{11}}{v_{10}+v_{11}} \end{pmatrix} . \end{gather*} Under the null hypothesis of no clustering, the probability of a violation tomorrow does not depend on whether there is a violation today, so \(p_{01}=p_{11}=p\) and the transition matrix is simply: \begin{align*} \Pi_{2} & =\left( \begin{array}{cc} 1-p & p \\ 1-p & p \end{array} \right) \end{align*} and the ML estimate is: \[ \hat{p} =\frac{v_{01}+v_{11}}{v_{00}+v_{10}+v_{01}+v_{11}}, \] so \begin{align*} \hat{\Pi}_{2} & =\left( \begin{array}{cc} 1-\hat{p} & \hat{p} \\ 1-\hat{p} & \hat{p} \end{array} \right) . \end{align*}
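Continuing the sketch under the same assumptions, the ML estimates follow directly from the counts: row-normalising gives \(\hat{\Pi}_1\) and pooling the counts gives \(\hat{p}\).

```python
def ml_estimates(v):
    """ML estimates: rows of the count matrix normalised, and the pooled violation rate."""
    pi1_hat = v / v.sum(axis=1, keepdims=True)   # each row divided by its row total
    p_hat = (v[0, 1] + v[1, 1]) / v.sum()        # (v01 + v11) / (v00 + v10 + v01 + v11)
    return pi1_hat, p_hat
```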

The likelihood function then is \begin{equation} L_2(\Pi_2) =\left( 1-p\right) ^{v_{00}+v_{10}} p^{v_{01}+v_{11}} .\tag{8.6} \label{eq:risk2:lik:bt:int2} \end{equation}

Note that in \eqref{eq:risk2:lik:bt:int2} we impose independence, while in \eqref{eq:risk2:lik:bt:int} we do not. Replacing the \(\Pi\) by the estimated values, \(\hat{\Pi}\), the LR test is then: \begin{equation*} LR=2\left( \log L_1\left( \hat{\Pi}_{1}\right) -\log L_2\left( \hat{\Pi}_{2}\right) \right) \overset{\rm asymptotic}{\sim}\chi _{\left( 1\right) }^{2}. \end{equation*}
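Putting the pieces together, a hedged sketch of the whole test evaluates the log-likelihoods of (8.5) and (8.6) at the ML estimates and compares the LR statistic with the \(\chi^2_{(1)}\) distribution. It assumes all four counts \(v_{ij}\) are positive so the logarithms are defined; the function name independence_lr_test is again just an illustrative choice.

```python
from scipy import stats

def independence_lr_test(eta):
    """LR test of independence of violations; returns the statistic and its p-value."""
    v = transition_counts(eta)
    pi1_hat, p_hat = ml_estimates(v)
    p01, p11 = pi1_hat[0, 1], pi1_hat[1, 1]
    # log-likelihood of (8.5), unrestricted
    logL1 = (v[0, 0] * np.log(1 - p01) + v[0, 1] * np.log(p01)
             + v[1, 0] * np.log(1 - p11) + v[1, 1] * np.log(p11))
    # log-likelihood of (8.6), independence imposed
    logL2 = (v[0, 0] + v[1, 0]) * np.log(1 - p_hat) + (v[0, 1] + v[1, 1]) * np.log(p_hat)
    LR = 2 * (logL1 - logL2)
    return LR, stats.chi2.sf(LR, df=1)
```

For a vector of VaR violations eta, calling independence_lr_test(eta) returns the test statistic and its asymptotic p-value; a small p-value indicates violation clustering.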



