Abstract: A tensor is a multidimensional array (for example, a matrix is a tensor of order 2). Tensors often arise from the discretization of multidimensional functions in the numerical treatment of complex problems in many areas of the natural, financial, and social sciences. The direct numerical treatment of these arrays leads to serious difficulties: both the memory requirements and the complexity of the basic operations grow exponentially in the order d. In the last decade the approximation of multidimensional arrays has become a central issue in approximation theory and numerical analysis. The main idea of tensor approximation is to decompose the given tensor into a sum of outer products of vectors; in the language of functions, this amounts to approximating a multivariate function by sums of products of univariate functions. Tensor decompositions have many applications in image processing, quantum chemistry, data mining, machine learning, stochastic partial differential equations, etc. In the matrix case (i.e., a tensor of order 2), the singular value decomposition (SVD) represents a matrix as a sum of outer products of vectors. The SVD algorithm requires O(n^3) arithmetic operations for an n × n matrix, so it is very expensive when the matrix dimensions are large. Various inexpensive low-rank approximation techniques based on skeleton/cross approximation are available in the literature. The talk will discuss the SVD and its applications, as well as other low-rank approximation techniques such as rank-revealing QR (RRQR), the interpolative decomposition, randomized algorithms, and skeleton/cross approximation. The canonical, Tucker, tensor chain, and tensor train formats for higher-order tensors will also be introduced.
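The matrix case mentioned above can be illustrated with a minimal sketch: a rank-k approximation built as a sum of k outer products from the truncated SVD. The matrix and the rank below are illustrative choices, not taken from the talk; by the Eckart–Young theorem, the spectral-norm error of this truncation equals the (k+1)-th singular value.

```python
import numpy as np

# Illustrative example: a random 8 x 8 matrix of rank at most 6.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6)) @ rng.standard_normal((6, 8))

# SVD: A = sum_i s[i] * outer(U[:, i], Vt[i, :])
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 3  # target rank (an arbitrary choice for this sketch)
# Rank-k approximation as a sum of k rank-one outer products.
A_k = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))

# Eckart-Young: the spectral-norm error is the (k+1)-th singular value.
err = np.linalg.norm(A - A_k, 2)
```

Forming the full SVD costs O(n^3) operations, which is exactly the expense that motivates the cheaper RRQR, interpolative, randomized, and skeleton/cross techniques the abstract lists.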
Dr. Naraparaju Kishore Kumar
Birla Institute of Technology, Pilani