Abstract
Taking the first-order partial derivatives of a vector-valued function yields the Jacobian matrix, which collects the partial derivatives of each entry. The matrix collecting all possible second-order partials of each entry, the Hessian, is also well studied. But while the first-order partials of a vector-valued function form a matrix, third- and higher-order partials constitute objects with increasingly complicated structure. Among the various ways of handling this problem, some methods use tensor products of possibly matrix-valued functions and partials. Here we follow a very simple version in that line: we collect the partial-derivative operators into a column vector and apply it through consecutive tensor products to a vector-valued function. In this way the results remain vectors, and although the tensor product is not commutative, linear operators let us reach all permutations of the factors involved in the process. The main objective of this chapter is to show how simple and clear formulae for higher-order partial derivatives of vector-valued functions can be derived with this tensor-product method. Faà di Bruno's formula will play an important role later, when we study the connections between moments and cumulants.
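The two ingredients of the abstract can be illustrated numerically: stacking all partials of a vector-valued function into a single vector via the Kronecker (tensor) product, and using a permutation (commutation) matrix to swap the factors of a non-commutative tensor product. The following is a minimal sketch with NumPy; the function `f`, the finite-difference scheme, and the particular stacking order (partial index varying fastest) are illustrative assumptions, not necessarily the book's exact convention.

```python
import numpy as np

def tensor_derivative(f, x, h=1e-5):
    """First tensor derivative of f: R^n -> R^m at x, kept as a single
    vector of length m*n. Entry i*n + j holds the partial of f_i w.r.t. x_j
    (one illustrative arrangement; computed by central differences)."""
    n = x.size
    cols = []
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        cols.append((f(x + e) - f(x - e)) / (2 * h))  # column of partials d f / d x_j
    return np.stack(cols, axis=1).reshape(-1)

def commutation_matrix(p, q):
    """Permutation matrix K with K @ kron(a, b) == kron(b, a)
    for a in R^p, b in R^q; this is how permutations of tensor
    factors are reached by a linear operator."""
    K = np.zeros((p * q, p * q))
    for i in range(p):
        for j in range(q):
            K[j * p + i, i * q + j] = 1.0
    return K

# hypothetical example: f(x) = (x0*x1, x0**2), whose partials are known in closed form
f = lambda x: np.array([x[0] * x[1], x[0] ** 2])
x = np.array([2.0, 3.0])
Df = tensor_derivative(f, x)  # approx [x1, x0, 2*x0, 0] = [3, 2, 4, 0]

# the tensor product is not commutative, but a commutation matrix permutes its factors:
a, b = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0])
K = commutation_matrix(3, 2)
swapped = K @ np.kron(a, b)   # equals np.kron(b, a)
```

Iterating `tensor_derivative` on the stacked output is what produces the higher-order tensor derivatives as vectors of length m·n², m·n³, and so on, with commutation matrices accounting for the symmetry of mixed partials.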
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Terdik, G. (2021). The Tensor Derivative of Vector Functions. In: Multivariate Statistical Methods. Frontiers in Probability and the Statistical Sciences. Springer, Cham. https://doi.org/10.1007/978-3-030-81392-5_2
Print ISBN: 978-3-030-81391-8
Online ISBN: 978-3-030-81392-5