Let $\{A_i\}_{i=1}^N$ be a set of random matrices, where $A_i \in \mathbb{R}^{m \times n}$, and $U \in \mathbb{R}^{m \times k}$ and $V \in \mathbb{R}^{n \times k}$ are two matrices with orthonormal columns (i.e., $U^\top U = V^\top V = I_k$). I was wondering whether the following problem has an analytical solution:
$$\max_{U,\,V} \; \sum_{i=1}^N \operatorname{tr}\!\left(U^\top A_i V\right).$$
If not, how should I solve it? Alternating optimization?
(At first, I thought it might be related to the SVD of the sum of the matrices $\sum_{i=1}^N A_i$, but so far I have not managed to prove it.)
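For concreteness, here is a minimal numerical sketch (Python/NumPy) of both ideas, assuming the trace objective above; the helper name `procrustes` and the dimensions `m, n, k, N` are just illustrative choices. Each half-step of the alternating scheme is an orthogonal Procrustes problem, and the closed-form candidate comes from the SVD of $\sum_i A_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k, N = 8, 6, 3, 5  # arbitrary illustrative dimensions

# Random problem instance.
As = [rng.standard_normal((m, n)) for _ in range(N)]
M = sum(As)  # by linearity of the trace, the objective only depends on this sum

def objective(U, V):
    """sum_i tr(U^T A_i V), the objective to be maximized."""
    return sum(np.trace(U.T @ A @ V) for A in As)

def procrustes(B):
    """Argmax of tr(X^T B) over X with orthonormal columns:
    X = P Q^T, where B = P S Q^T is a thin SVD (orthogonal Procrustes)."""
    P, _, Qt = np.linalg.svd(B, full_matrices=False)
    return P @ Qt

# --- Alternating maximization ------------------------------------------
V = np.linalg.qr(rng.standard_normal((n, k)))[0]  # random feasible start
for _ in range(50):
    U = procrustes(M @ V)    # best U for fixed V
    V = procrustes(M.T @ U)  # best V for fixed U

# --- Closed-form candidate from the SVD of the sum ---------------------
P, s, Qt = np.linalg.svd(M, full_matrices=False)
U_svd, V_svd = P[:, :k], Qt[:k, :].T

print("alternating             :", objective(U, V))
print("SVD of the sum          :", objective(U_svd, V_svd))
print("top-k singular value sum:", s[:k].sum())
```

If the SVD conjecture is right, all three printed values should coincide up to numerical tolerance; on the random instances I can generate this way they do, but of course that is not a proof.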