I'm still a student of this, but here's my understanding.
With bivariate GARCH you would forecast covariance directly, based on past covariances,
covar(x,y)_t = (1 - alpha - beta) * (long-run level) + alpha * (innovation term) + beta * covar(x,y)_t-1
There is no need for a correlation coefficient, because you wouldn't be using the formula covar = sigma_x * sigma_y * correlation to forecast the covariance, with the individual variances coming from univariate GARCH models.
You would just forecast next period's covariance directly from past covariances, with each period's cross-product return(X) * return(Y) playing the role that squared returns play in the univariate case. The innovation (error) term in the covariance equation is the cross-product of the innovations from the two univariate variance processes. For GARCH(1,1):
h11,t = c11 + a11 * e1,t-1^2 + b11 * h11,t-1
h22,t = c22 + a22 * e2,t-1^2 + b22 * h22,t-1
h12,t = c12 + a12 * e1,t-1 * e2,t-1 + b12 * h12,t-1
where each cij is a constant that sets the long-run level and the e's are the lagged return innovations.
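To make the recursion concrete, here's a minimal Python sketch of the three updates above; the function name, parameter values, and starting levels are all made up for illustration, not estimates:

```python
import numpy as np

# Minimal sketch of the GARCH(1,1) variance/covariance recursion above.
# All numbers below are illustrative placeholders, not estimates.

def garch_cov_step(h11, h22, h12, e1, e2, params):
    """One forecast step: update the two variances and the covariance
    from last period's values (h) and return innovations (e)."""
    (c11, a11, b11), (c22, a22, b22), (c12, a12, b12) = params
    h11_next = c11 + a11 * e1**2 + b11 * h11    # variance of asset 1
    h22_next = c22 + a22 * e2**2 + b22 * h22    # variance of asset 2
    h12_next = c12 + a12 * e1 * e2 + b12 * h12  # covariance: cross-product term
    return h11_next, h22_next, h12_next

# Illustrative (constant, alpha, beta) triples for each equation
params = [(1e-5, 0.05, 0.90),
          (2e-5, 0.06, 0.89),
          (1e-6, 0.04, 0.91)]

h11, h22, h12 = 1e-4, 2e-4, 5e-5   # starting (roughly unconditional) levels
e1, e2 = 0.01, -0.015              # yesterday's return innovations

h11, h22, h12 = garch_cov_step(h11, h22, h12, e1, e2, params)
print(h11, h22, h12)
```

Iterating that one step per period gives you the covariance forecast directly, without ever touching a correlation coefficient.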
And to find the coefficients in the above formulas, you use maximum likelihood (with an assumption about the functional form of the distribution, rightly or wrongly). This gets rid of the need to guess at the size of the moving-average window you would otherwise need to calculate a correlation (e.g., 22 days, 44 days, 1,000 days). The coefficients can be constrained to sum to 1. For an entire covariance matrix, I believe you can estimate all of them together using matrix methods, but I will admit that my grasp of this is quite uncertain, as I have never done this myself professionally. I honestly regret I cannot help more.
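For what it's worth, here is a hedged sketch of what that maximum likelihood step might look like in the bivariate case, assuming (rightly or wrongly) normal innovations. The synthetic data, starting values, and the crude positive-definiteness penalty are my own placeholders, not a standard recipe:

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of (quasi-)maximum likelihood for the bivariate model above,
# assuming bivariate normal innovations.
# Parameter vector: (c11, a11, b11, c22, a22, b22, c12, a12, b12).

def neg_log_likelihood(theta, e1, e2):
    c11, a11, b11, c22, a22, b22, c12, a12, b12 = theta
    # start the recursion at the sample moments
    h11, h22, h12 = np.var(e1), np.var(e2), np.cov(e1, e2)[0, 1]
    nll = 0.0
    for t in range(len(e1)):
        det = h11 * h22 - h12**2
        if det <= 0 or h11 <= 0 or h22 <= 0:
            return 1e10  # crude penalty when H_t is not positive definite
        # bivariate normal log density of (e1[t], e2[t]) given H_t,
        # using the closed-form 2x2 inverse for the quadratic form
        quad = (e1[t]**2 * h22 - 2 * e1[t] * e2[t] * h12
                + e2[t]**2 * h11) / det
        nll += 0.5 * (2 * np.log(2 * np.pi) + np.log(det) + quad)
        # update variances and covariance for next period
        h11 = c11 + a11 * e1[t]**2 + b11 * h11
        h22 = c22 + a22 * e2[t]**2 + b22 * h22
        h12 = c12 + a12 * e1[t] * e2[t] + b12 * h12
    return nll

# Synthetic returns just so the script runs end to end
rng = np.random.default_rng(0)
e1 = 0.01 * rng.standard_normal(500)
e2 = 0.6 * e1 + 0.008 * rng.standard_normal(500)

theta0 = [1e-5, 0.05, 0.90, 1e-5, 0.05, 0.90, 1e-6, 0.04, 0.90]  # guesses
result = minimize(neg_log_likelihood, theta0, args=(e1, e2),
                  method="Nelder-Mead", options={"maxiter": 5000})
print(result.x)
```

A proper implementation would impose the stationarity/sum constraints on the a's and b's directly rather than lean on that penalty.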
I believe you would only back into a correlation forecast after the fact, by taking your covariance forecast and dividing it by the product of the two univariate volatility forecasts (the square roots of the variance forecasts). But like you said, I am also uncertain whether the resulting correlation forecast is guaranteed to be bounded by [-1, 1], or whether the forecast covariance matrix is guaranteed to be positive definite.
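A tiny sketch of that back-out, with made-up forecast numbers, just to show the division and the bounds check:

```python
import numpy as np

# Made-up forecast values for h11, h22, h12
h11_f, h22_f, h12_f = 0.00012, 0.00021, 0.00009

# implied correlation = covariance / (sigma_x * sigma_y)
rho_f = h12_f / np.sqrt(h11_f * h22_f)
print(rho_f, -1.0 <= rho_f <= 1.0)
```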
This is an interesting question and I am trying to learn more about it myself. If you or anyone finds a very clear web resource for estimating an entire covariance matrix with GARCH, please post it here. Apologies that this was not a better post.