The importance of independence arises in diverse applications, for inference and whenever it is essential to measure complicated dependence structures in bivariate or multivariate data. This paper focuses on a new dependence coefficient that measures all types of dependence between random vectors X and Y in arbitrary dimension.
For example, the distance covariance (dCov) statistic, derived in the next section, is the square root of
$$\mathcal{V}_n^2 = S_1 + S_2 - 2S_3,$$
where $S_1 = n^{-2}\sum_{k,l=1}^n |X_k - X_l|_p\,|Y_k - Y_l|_q$, $S_2 = n^{-2}\sum_{k,l=1}^n |X_k - X_l|_p \cdot n^{-2}\sum_{k,l=1}^n |Y_k - Y_l|_q$, and $S_3 = n^{-3}\sum_{k=1}^n\sum_{l,m=1}^n |X_k - X_l|_p\,|Y_k - Y_m|_q$. It will be shown that the definitions of the new dependence coefficients have theoretical foundations based on characteristic functions and on the new concept of covariance with respect to Brownian motion.
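The dCov statistic can be computed directly from the pairwise Euclidean distances among the sample elements. A minimal sketch (Euclidean norms assumed; the function name `dcov_sq` is illustrative, not from the paper):

```python
import numpy as np

def dcov_sq(X, Y):
    """Squared sample distance covariance V_n^2 = S1 + S2 - 2*S3,
    computed from pairwise Euclidean distance matrices."""
    X = np.asarray(X, float).reshape(len(X), -1)
    Y = np.asarray(Y, float).reshape(len(Y), -1)
    n = len(X)
    a = np.linalg.norm(X[:, None] - X[None, :], axis=2)  # a_kl = |X_k - X_l|
    b = np.linalg.norm(Y[:, None] - Y[None, :], axis=2)  # b_kl = |Y_k - Y_l|
    S1 = (a * b).sum() / n**2
    S2 = a.sum() * b.sum() / n**4
    S3 = (a @ b).sum() / n**3  # sum over k of (row sum of a)_k * (row sum of b)_k
    return S1 + S2 - 2 * S3
```

Note the statistic vanishes when either sample is degenerate (all distances zero) and is nonnegative in general, which is what makes it usable as a test statistic.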
Our independence test statistics are consistent against all types of dependent alternatives with finite second moments; the dependence may be nonlinear or nonmonotone.
Although it does not characterize independence, classical correlation is widely applied in time series, clinical trials, longitudinal studies, modeling financial data, meta-analysis, model selection in parametric and nonparametric models, classification and pattern recognition, etc.
Ratios and other methods of combining and applying correlation coefficients have also been proposed. Although methods based on ranks can be applied in some problems, many classical methods are effective only for testing linear or monotone types of dependence.
There is much literature on testing or measuring independence. See, for example, Blomqvist [blom50], Blum, Kiefer, and Rosenblatt [bkr61], or methods outlined in Hollander and Wolfe [hw99] and Anderson [anderson]. Multivariate nonparametric approaches to this problem can be found in Taskinen, Oja, and Randles [tor05], and the references therein.
Our proposed distance correlation represents an entirely new approach. The distance correlation $\mathcal{R}(X, Y)$ is defined for $X$ and $Y$ in arbitrary dimension. The notion of covariance of random vectors $(X, Y)$ with respect to a stochastic process $U$ is introduced in this paper.
A surprising result develops: the Brownian covariance is equal to the distance covariance. This equivalence is not only surprising, it also shows that distance covariance is a natural counterpart of product-moment covariance. Distance correlation and distance covariance are presented in Section 2. Brownian covariance is introduced in Section 3.
Extensions and applications are discussed in Sections 4 and 5. Let $X \in \mathbb{R}^p$ and $Y \in \mathbb{R}^q$ be random vectors, where $p$ and $q$ are positive integers. The lower case $f_X$ and $f_Y$ will be used to denote the characteristic functions of $X$ and $Y$, respectively, and their joint characteristic function is denoted $f_{X,Y}$.
Thus, a natural approach to measuring the dependence between $X$ and $Y$ is to find a suitable norm to measure the distance between $f_{X,Y}$ and $f_X f_Y$; the distance covariance $\mathcal{V}(X, Y)$ is defined through such a norm. Then an empirical version of $\mathcal{V}$ is developed and applied to test the hypothesis of independence.
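The empirical version admits a simple form in terms of double-centered distance matrices, and the corresponding distance correlation is the standardized quantity. A sketch, assuming Euclidean distances (`distance_correlation` is an illustrative name):

```python
import numpy as np

def _double_center(D):
    """A_kl = d_kl - row mean_k - column mean_l + grand mean."""
    return D - D.mean(axis=0) - D.mean(axis=1)[:, None] + D.mean()

def distance_correlation(X, Y):
    """Sample distance correlation R_n = V_n(X,Y) / sqrt(V_n(X,X) V_n(Y,Y))."""
    X = np.asarray(X, float).reshape(len(X), -1)
    Y = np.asarray(Y, float).reshape(len(Y), -1)
    A = _double_center(np.linalg.norm(X[:, None] - X[None, :], axis=2))
    B = _double_center(np.linalg.norm(Y[:, None] - Y[None, :], axis=2))
    dcov2 = (A * B).mean()          # squared sample distance covariance
    dvar_x = (A * A).mean()         # squared sample distance variance of X
    dvar_y = (B * B).mean()
    denom = np.sqrt(dvar_x * dvar_y)
    return np.sqrt(dcov2 / denom) if denom > 0 else 0.0
```

By the Cauchy–Schwarz inequality applied entrywise to $A$ and $B$, the ratio lies in $[0, 1]$, with the value $1$ attained for exact linear relationships.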
In Monte Carlo studies, the distance covariance test exhibited superior power relative to parametric or rank-based likelihood ratio tests against nonmonotone types of dependence. It was also demonstrated that the tests were quite competitive with the parametric likelihood ratio test when applied to multivariate normal data. The practical message is that distance covariance tests are powerful tests for all types of dependence.
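In practice the null distribution of the test statistic can be approximated by permutation, since independence is exchangeable under relabeling of one sample. A self-contained sketch of such a test (the name `dcov_perm_test` and the default permutation count are illustrative choices, not from the paper):

```python
import numpy as np

def _centered_dists(Z):
    D = np.linalg.norm(Z[:, None] - Z[None, :], axis=2)
    return D - D.mean(axis=0) - D.mean(axis=1)[:, None] + D.mean()

def dcov_perm_test(X, Y, n_perm=499, seed=0):
    """Permutation test of independence based on the statistic n * V_n^2."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, float).reshape(len(X), -1)
    Y = np.asarray(Y, float).reshape(len(Y), -1)
    A, B = _centered_dists(X), _centered_dists(Y)
    n = len(X)
    stat = n * (A * B).mean()
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(n)
        # permuting rows and columns of B together is equivalent
        # to permuting the Y sample before centering
        count += n * (A * B[np.ix_(perm, perm)]).mean() >= stat
    p_value = (1 + count) / (1 + n_perm)
    return stat, p_value
```

Permuting the centered matrix `B` rather than recomputing distances works because double-centering commutes with a simultaneous row/column permutation.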