This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Is it possible to insert a reference in which one can find proofs for the listed properties of the Kronecker product? E.g. Abstract properties / 1. Spectrum needs in my opinion a citation. — Preceding unsigned comment added by 161.116.80.135 ( talk) 15:55, 10 September 2012 (UTC)
This edit request by an editor with a conflict of interest has now been answered.
I suggest we add some other applications of the Khatri-Rao product.
Please note that one of the articles cited is of my authorship. Nevertheless, it gives new applications of the Khatri-Rao product and, more importantly, presents a new relationship between the Khatri-Rao and the Kronecker product, which is in accordance with the contents of this Wikipage. If you wish to verify the contents of the paper I listed, you can find a copy of it on my web page http://www.decom.fee.unicamp.br/~masiero or directly at http://www.decom.fee.unicamp.br/~masiero/articles/Journal/LSP2674969.pdf
Suggested text: This column-wise version of the Khatri-Rao product is useful in linear algebra approaches to data analytical processing [1] and in optimizing the solution of inverse problems dealing with a diagonal matrix [2] [3].
Bmasiero ( talk) 17:29, 27 March 2017 (UTC)Dr. Bruno Masiero
References
Requesting addition/articles for Khatri-Rao and Tracy-Singh products. [1] Shyamal 04:39, 25 July 2006 (UTC)
The paragraph should note that a choice of bases is involved: If A and B represent homomorphisms given certain bases of the involved vector spaces, the Kronecker product of A and B represents the tensor product of these homomorphisms with respect to certain bases of the tensor products of the domain and codomain vector spaces, of the form a_1 ⊗ b_1, a_1 ⊗ b_2, ..., a_1 ⊗ b_n, a_2 ⊗ b_1, ... 84.190.181.201
An anonymous user edited (concerning the final related matrix operation):
The reason I got a Wikipedia account in the first place was that I needed the definition of the column-wise KR product for my master's thesis, and I was tired of always looking in the paper by Liu. Later I examined this paper, in which I saw (on p. 3 in the pdf) what I had by then found out, namely that the Khatri-Rao product is implied to operate on matrices with their columns as the partitions. I wasn't sure whether this was a mistake or a different (and confusing) convention or something; therefore, and also for my own reference, I added it to the article as a separate case. But maybe it needs some clarification. -- StevenDH ( talk) 20:23, 16 April 2008 (UTC)
If A is n-by-n, B is m-by-m and I_k denotes the k-by-k identity matrix, then we can define the Kronecker sum, A ⊕ B, by A ⊕ B = A ⊗ I_m + I_n ⊗ B.
(Note that this is different from the direct sum of two matrices.)
But the notation for the Kronecker sum and the direct sum is the same! So is it a mistake? Gvozdet ( talk) 13:29, 18 August 2009 (UTC)
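On that question: the two operations really are different, even though both are sometimes written with ⊕. A minimal pure-Python sketch (the helper names are my own, not from the article) contrasting the Kronecker sum with the direct sum:

```python
def kron(A, B):
    """Kronecker product of two matrices given as nested lists."""
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)]
            for i in range(len(A) * p)]

def identity(k):
    return [[1 if i == j else 0 for j in range(k)] for i in range(k)]

def mat_add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def kronecker_sum(A, B):
    # A (+) B = A (x) I_m + I_n (x) B  -- generally NOT block-diagonal
    n, m = len(A), len(B)
    return mat_add(kron(A, identity(m)), kron(identity(n), B))

def direct_sum(A, B):
    # block-diagonal diag(A, B) -- a different operation
    n, m = len(A), len(B)
    return [row + [0] * m for row in A] + [[0] * n + row for row in B]

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
print(kronecker_sum(A, B))  # 4x4, with off-diagonal blocks in general
print(direct_sum(A, B))     # 4x4 block-diagonal
```

Both results are 4-by-4, but they differ entrywise, which is exactly why reusing the ⊕ symbol is confusing.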
http://ru.wikipedia.org/wiki/%D0%A2%D0%B5%D0%BD%D0%B7%D0%BE%D1%80%D0%BD%D0%BE%D0%B5_%D0%BF%D1%80%D0%BE%D0%B8%D0%B7%D0%B2%D0%B5%D0%B4%D0%B5%D0%BD%D0%B8%D0%B5 (the Russian Wikipedia article on the tensor product) —Preceding unsigned comment added by 217.29.95.125 ( talk) 14:23, 23 July 2010 (UTC)
There is a confusing clash of nomenclature regarding the Kronecker product.
Despite the occasional use of the phrase "tensor product" to describe the Kronecker product, the Kronecker product doesn't coincide with the usual definition of the tensor product. The tensor product is an operation that produces a tensor of higher rank. That is, in coordinates the tensor product adds indices: for matrices, (A ⊗ B)_{ijkl} = A_{ij} B_{kl}.
(Refer, for example, to: John Lee, Introduction to Smooth Manifolds)
As an example, the tensor product of two vectors gives you a matrix (ignoring covariant and contravariant issues for the moment). But the tensor product of two matrices is a fourth-rank tensor, not another matrix. The Kronecker product between matrices simply gives you another matrix (although with higher dimension), and is then not the same thing as the tensor product.
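The relationship between the two notions can be checked directly: the Kronecker product is the fourth-rank tensor A_{ij} B_{kl} with the index pairs (i, k) and (j, l) flattened into single row and column indices. A small pure-Python check (helper names are mine):

```python
def kron(A, B):
    """Kronecker product of two matrices given as nested lists."""
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)]
            for i in range(len(A) * p)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
K = kron(A, B)
p, q = len(B), len(B[0])

# tensor[i][j][k][l] = A[i][j] * B[k][l] has four indices, not two;
# the Kronecker product stores it as a matrix via (i,k) -> i*p+k, (j,l) -> j*q+l.
for i in range(len(A)):
    for j in range(len(A[0])):
        for k in range(p):
            for l in range(q):
                assert K[i * p + k][j * q + l] == A[i][j] * B[k][l]
print("Kronecker product = flattened fourth-rank tensor")
```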
The distinction might be blurred in some literature that presumably calls the actual tensor product a Cartesian product, which is also bad nomenclature, since the Cartesian product is a product on sets, not tensors; better nomenclature would be direct product (and indeed MathWorld adopts an explicit hybrid nomenclature to be absolutely clear: [2]).
The relevant section of the Tensor Product page agrees with me except for the instances that mention the Kronecker product. In fact, this confusion arises in the talk page of the Tensor Product article as well, where Physdragon also agrees with me.
I think the confusion is that when working with only vectors and matrices it's easy to identify matrices with their vectorized counterparts (or indeed fourth-rank tensors with their matricized counterparts) -- something that can't be done with general tensors. The Kronecker product in this light is then the "matricized" version of the fourth-rank tensor which maps between second-rank tensors, sending X to B X A^T,
or, more explicitly, is the matrix which maps between vectorized tensors, sending vec(X) to (A ⊗ B) vec(X) = vec(B X A^T).
The Kronecker product should then be contrasted with the tensor product and should not be used to exemplify the tensor product, which is a different -- albeit related -- beast. 129.32.11.206 ( talk) 18:22, 15 October 2012 (UTC)
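The "matricized map" reading above can be verified numerically via the standard identity vec(A X B) = (B^T ⊗ A) vec(X), where vec stacks the columns. A pure-Python sketch (helper names are my own):

```python
def kron(A, B):
    """Kronecker product of two matrices given as nested lists."""
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)]
            for i in range(len(A) * p)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(M):
    return [list(col) for col in zip(*M)]

def vec(X):
    # stack the columns of X into one long vector (column-major)
    return [X[i][j] for j in range(len(X[0])) for i in range(len(X))]

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

A = [[1, 2], [3, 4]]
B = [[0, 1], [2, 3]]
X = [[5, 6], [7, 8]]

lhs = vec(matmul(matmul(A, X), B))           # vec(A X B)
rhs = matvec(kron(transpose(B), A), vec(X))  # (B^T kron A) vec(X)
assert lhs == rhs
print(lhs)
```

The Kronecker product here is precisely the matrix of the linear map X ↦ A X B once second-rank tensors are vectorized, which is the identification the comment describes.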
Maybe in order to clarify the relationship, we could put up something like this:
One problem with this is that the ⊗ between vector spaces is an abstract tensor product and not a Kronecker product, so to be consistent the notation will be funny-looking.
Another problem is the ambiguity of saying "viewed as a space of..." However I think this is sufficiently clear that one shouldn't have to obfuscate the notation to be precise: 129.32.11.206 ( talk) 17:18, 16 October 2012 (UTC)
Miller (1981) derives a formula for the inverse of sums of Kronecker products (p. 72) in the special case where the "last" matrix is of rank 1. Specifically, let W = A ⊗ G + B ⊗ E, where rank(E) = 1. Then W^{-1} = A^{-1} ⊗ G^{-1} − T ⊗ G^{-1} E G^{-1}, where T = (A + g B)^{-1} B A^{-1} and g = trace(E G^{-1}). Miller also works out examples where G is the identity matrix and B is also of rank 1. Are these too detailed to include in the page? If they are appropriate, I can add them. -- Nathanvan ( talk) 22:31, 1 January 2013 (UTC)
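If it helps editors judge the formula, here is a small numerical sanity check in pure Python (the example matrices and helper names are my own choices; E is rank one by construction):

```python
def kron(A, B):
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)]
            for i in range(len(A) * p)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def mat_add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def scal(c, X):
    return [[c * x for x in row] for row in X]

def inv2(M):  # closed-form inverse of a 2x2 matrix
    a, b = M[0]; c, d = M[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[2.0, 1.0], [0.0, 3.0]]
G = [[1.0, 0.0], [1.0, 2.0]]
B = [[1.0, 2.0], [2.0, 1.0]]
E = [[1.0, 2.0], [0.5, 1.0]]   # rank one: second row is 0.5 * first row

# Miller's rank-one formula as quoted above
Ginv = inv2(G)
g = trace(matmul(E, Ginv))
T = matmul(matmul(inv2(mat_add(A, scal(g, B))), B), inv2(A))
W = mat_add(kron(A, G), kron(B, E))
W_inv = mat_sub(kron(inv2(A), Ginv),
                kron(T, matmul(matmul(Ginv, E), Ginv)))

P = matmul(W, W_inv)  # should be the 4x4 identity up to round-off
ok = all(abs(P[i][j] - (1.0 if i == j else 0.0)) < 1e-9
         for i in range(4) for j in range(4))
print(ok)
```

The key algebraic step is that E G^{-1} has rank at most one, so (E G^{-1})^2 = g · E G^{-1}, which is what makes the correction term close the inverse exactly.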
The section on Properties says:
"The operation of transposition is distributive over the Kronecker product: (A ⊗ B)^T = A^T ⊗ B^T."
This can be made more general by also covering the complex case and considering the conjugate transpose instead: (A ⊗ B)^* = A^* ⊗ B^*.
I think this is worth noting. I believe it is easier to see from this that distributivity also holds for the real case with the transpose than the other way around. anoko_moonlight ( talk) 11:15, 12 August 2013 (UTC)
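Both versions of the property are easy to confirm numerically; a quick pure-Python check (helper names are mine) of the transpose and conjugate-transpose cases:

```python
def kron(A, B):
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)]
            for i in range(len(A) * p)]

def transpose(M):
    return [list(col) for col in zip(*M)]

def ctranspose(M):  # conjugate transpose for complex entries
    return [[z.conjugate() for z in row] for row in zip(*M)]

# real case: (A kron B)^T == A^T kron B^T
A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
assert transpose(kron(A, B)) == kron(transpose(A), transpose(B))

# complex case: (A kron B)^* == A^* kron B^*
Ac = [[1 + 2j, 3], [0, 1 - 1j]]
Bc = [[2j, 1], [1, -1j]]
assert ctranspose(kron(Ac, Bc)) == kron(ctranspose(Ac), ctranspose(Bc))
print("transpose and conjugate transpose distribute over kron")
```

Restricting the complex identity to real matrices recovers the transpose version, since conjugation is the identity on real entries.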
How does one pronounce A ⊗ B ? If there’s a standard way, I think it should be mentioned the first time the notation appears in the article. Loraof ( talk) 17:28, 5 July 2018 (UTC)
"Kronecker sums appear naturally in physics when considering ensembles of non-interacting systems.[citation needed] Let H_i be the Hamiltonian of the i-th such system. Then the total Hamiltonian of the ensemble is H = H_1 ⊕ H_2 ⊕ ... ⊕ H_n."
It's not quite true. I would say that "Direct sums appear naturally ..." not "Kronecker sums ..."
Property 2 of the Kronecker product uses the MATLAB notation without explaining it (I suspect it is just a copy/paste of the paper quoted [3]), which is unclear. Moreover, there is more to it, and there is even an article about this fact: /info/en/?search=Commutation_matrix. Perhaps the page should be edited to fill in the gaps and clarify this "shuffle matrix" thing? AnthonyStC ( talk) 21:33, 6 August 2020 (UTC)
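For reference, the commutation ("shuffle") matrix K_{m,n} mentioned there satisfies K vec(A) = vec(A^T) for an m-by-n matrix A, with vec stacking columns. A small pure-Python construction (my own sketch, not taken from the article or the quoted paper):

```python
def vec(X):
    # column-major vectorisation: stack the columns of X
    return [X[i][j] for j in range(len(X[0])) for i in range(len(X))]

def transpose(M):
    return [list(col) for col in zip(*M)]

def commutation(m, n):
    """Permutation matrix K with K @ vec(A) == vec(A^T) for m-by-n A."""
    K = [[0] * (m * n) for _ in range(m * n)]
    for i in range(m):
        for j in range(n):
            # entry A[i][j] sits at j*m+i in vec(A) and at i*n+j in vec(A^T)
            K[i * n + j][j * m + i] = 1
    return K

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

A = [[1, 2, 3], [4, 5, 6]]          # 2 x 3
K = commutation(2, 3)
assert matvec(K, vec(A)) == vec(transpose(A))
print("K_{2,3} vec(A) == vec(A^T)")
```

This is the same matrix that relates A ⊗ B to B ⊗ A by permuting rows and columns, which is presumably the "shuffle" the quoted paper has in mind.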