Comparing Correlation Matrix Estimators Via Kullback-Leibler Divergence

Giulia Iori, Vanessa Mattiussi, Rosario N. Mantegna, Michele Tumminello

subject

Minimum-variance unbiased estimator, Efficient estimator, Kullback–Leibler divergence, Consistent estimator, Statistics, Estimator, Multivariate normal distribution, Trimmed estimator, Invariant estimator, Mathematics

description

We use the Kullback-Leibler divergence, a self-averaging measure, to evaluate the performance of four correlation estimators: Fourier, Pearson, Maximum Likelihood, and Hayashi-Yoshida. The study uses simulated transaction prices for a large number of stocks and several data-generating mechanisms, including synchronous and non-synchronous transactions and homogeneous and heterogeneous inter-transaction times. Different distributions of stock returns, i.e. the multivariate Normal and the multivariate Student's t-distribution, are also considered. We show that the Fourier and Pearson estimators are equivalent proxies of the 'true' correlation matrix in all the settings under analysis, and that both methods are outperformed by the Maximum Likelihood estimator when prices are synchronously sampled and price fluctuations follow a multivariate Student's t-distribution. Finally, we suggest solving the singularity problem affecting the Hayashi-Yoshida estimator by shrinking its correlation matrix towards either the Pearson or the Fourier matrix, and provide evidence that the resulting combination yields an estimator that improves on each of its individual components.
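For concreteness, a minimal sketch of the evaluation criterion, assuming the standard closed form for zero-mean multivariate Normals: the Kullback-Leibler divergence between the 'true' correlation matrix and an estimate is KL = (1/2) [tr(C_est^-1 C_true) - p + ln(det C_est / det C_true)], where p is the number of stocks. The names gaussian_kl, c_true, and c_est below are illustrative, not taken from the paper.

import numpy as np

def gaussian_kl(c_true, c_est):
    # KL( N(0, c_true) || N(0, c_est) ) for two zero-mean
    # multivariate Normals parameterized by correlation matrices.
    p = c_true.shape[0]
    # Solve c_est @ m = c_true rather than inverting explicitly.
    m = np.linalg.solve(c_est, c_true)
    # slogdet avoids overflow/underflow of det() for large p.
    _, logdet_true = np.linalg.slogdet(c_true)
    _, logdet_est = np.linalg.slogdet(c_est)
    return 0.5 * (np.trace(m) - p + logdet_est - logdet_true)

# Example: score the Pearson estimate against a known target matrix.
rng = np.random.default_rng(0)
p, t = 10, 500
c_true = 0.3 * np.ones((p, p)) + 0.7 * np.eye(p)  # hypothetical target
x = rng.multivariate_normal(np.zeros(p), c_true, size=t)
c_pearson = np.corrcoef(x, rowvar=False)
print(gaussian_kl(c_true, c_pearson))  # smaller is better

A smaller divergence means the estimated matrix is closer to the target, which is the sense in which the estimators are compared here.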
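The shrinkage fix suggested for the Hayashi-Yoshida estimator can be sketched as a linear combination with a positive-definite target; the helper name shrink_towards and the fixed weight alpha are hypothetical, since the abstract does not specify the exact scheme or how the weight is chosen.

import numpy as np

def shrink_towards(c_hy, c_target, alpha=0.5):
    # Pull a possibly singular Hayashi-Yoshida correlation estimate
    # towards a positive-definite target (e.g. Pearson or Fourier).
    c = (1.0 - alpha) * c_hy + alpha * c_target
    np.fill_diagonal(c, 1.0)  # restore the unit diagonal
    return c

For any alpha > 0, a convex combination of a positive semi-definite matrix with a positive-definite one is positive definite, so the shrunk matrix is invertible and the Kullback-Leibler criterion above remains well defined.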

https://doi.org/10.2139/ssrn.1966714