Mohamed Mooney

2022-06-24

For a transformation

$F(X)={X}^{T}$

on $V={M}_{3,3}(\mathbb{R})$, the matrix of $F$ is apparently a $9\times 9$ matrix. How can this be possible? Isn't the definition that

$F(X)=AX,$

so if $A$ is the $9\times 9$ matrix, we can't multiply a $9\times 9$ matrix by a $3\times 3$ one, can we?


assumintdz

Beginner · 2022-06-25 · Added 22 answers

For any linear transformation $T:V\to W$ between finite-dimensional real vector spaces, if we fix ordered bases $B,{B}^{\prime}$ of $V$ and $W$ respectively, then we can express every vector as a coordinate vector with respect to these bases.

This follows directly from the definition of a basis. Given $X\in V$ and $B=\{{v}_{1},\dots ,{v}_{n}\}$, there is a unique way to write $X=\sum {\alpha}_{i}{v}_{i}$ in terms of the ${v}_{i}$; if this expression were not unique, the linear independence of $B$ would be contradicted. So we may associate $[X]={\left(\begin{array}{ccc}{\alpha}_{1}& \cdots & {\alpha}_{n}\end{array}\right)}^{T}$ with $X$ in a one-to-one way, and likewise coordinates can be associated with every vector in $V$. This association lets us identify the vector space $V$ with ${\mathbb{R}}^{\mathrm{dim}V}$. The same can be done in $W$ with respect to ${B}^{\prime}$.

Now if $A$ is the matrix of $T$ with respect to these bases, then a very beautiful result (which is also not difficult to prove) says that as $X\to T(X)$, the coordinates of $X$, say $[X]={\left(\begin{array}{ccc}{\alpha}_{1}& \cdots & {\alpha}_{n}\end{array}\right)}^{T}$, go to the coordinates of $T(X)$, which are precisely $A{\left(\begin{array}{ccc}{\alpha}_{1}& \cdots & {\alpha}_{n}\end{array}\right)}^{T}$. Essentially, in the coordinate world the role of $T$ is played by multiplication by $A$. So the correct interpretation is not $X\to AX$ but $[X]\to A[X]$. It is not that the matrix and the transformation are identical; rather, under the identification of vectors with their coordinates, the action of the transformation matches multiplication by the matrix. The same holds over any field in place of $\mathbb{R}$.

As an illustration in your case, let $V=W={M}_{3,3}(\mathbb{R})$ and let $B={B}^{\prime}=\{{e}_{11},{e}_{12},{e}_{13},{e}_{21},{e}_{22},{e}_{23},{e}_{31},{e}_{32},{e}_{33}\}$ be the fixed ordered basis of $V$, where ${e}_{ij}$ is the $3\times 3$ matrix with a $1$ in the $(i,j)$-th place and $0$'s elsewhere.

It is now clear that the matrix of $F$ with respect to this basis is

$A=\left(\begin{array}{ccccccccc}1& 0& 0& 0& 0& 0& 0& 0& 0\\ 0& 0& 0& 1& 0& 0& 0& 0& 0\\ 0& 0& 0& 0& 0& 0& 1& 0& 0\\ 0& 1& 0& 0& 0& 0& 0& 0& 0\\ 0& 0& 0& 0& 1& 0& 0& 0& 0\\ 0& 0& 0& 0& 0& 0& 0& 1& 0\\ 0& 0& 1& 0& 0& 0& 0& 0& 0\\ 0& 0& 0& 0& 0& 1& 0& 0& 0\\ 0& 0& 0& 0& 0& 0& 0& 0& 1\end{array}\right)$

If we have some $3\times 3$ matrix $X=\left(\begin{array}{ccc}1& 2& 3\\ 4& 5& 6\\ 7& 8& 9\end{array}\right)$ then its coordinates will be

${\left(\begin{array}{ccccccccc}1& 2& 3& 4& 5& 6& 7& 8& 9\end{array}\right)}^{T}$

Now $X=\left(\begin{array}{ccc}1& 2& 3\\ 4& 5& 6\\ 7& 8& 9\end{array}\right)$ is mapped to $F(X)=\left(\begin{array}{ccc}1& 4& 7\\ 2& 5& 8\\ 3& 6& 9\end{array}\right)$ under $F$, so the coordinates must transform in the same fashion under $A$.

So we must have

$[X]={\left(\begin{array}{ccccccccc}1& 2& 3& 4& 5& 6& 7& 8& 9\end{array}\right)}^{T}\to A[X]={\left(\begin{array}{ccccccccc}1& 4& 7& 2& 5& 8& 3& 6& 9\end{array}\right)}^{T}$

which can be verified by direct multiplication.
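This direct multiplication can also be checked numerically. Below is a minimal NumPy sketch (not part of the original answer) that builds $A$ column by column from the images of the basis matrices, using NumPy's default row-major flattening, which matches the ordering of the basis above:

```python
import numpy as np

# Matrix A of F(X) = X^T, built column by column: the k-th column
# is the coordinate vector of F(e_k), where e_k runs through the
# basis e_11, e_12, ..., e_33 in row-major order (NumPy's default
# flattening order, matching the ordering used above).
A = np.zeros((9, 9))
for k in range(9):
    e = np.zeros((3, 3))
    e[k // 3, k % 3] = 1.0       # basis matrix e_ij
    A[:, k] = e.T.flatten()      # coordinates of F(e_ij) = (e_ij)^T

X = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

# A acting on the coordinates of X gives the coordinates of X^T.
print(A @ X.flatten())           # [1. 4. 7. 2. 5. 8. 3. 6. 9.]
```

The matrix built this way agrees entry for entry with the $9\times 9$ matrix displayed above.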

Also, in the same vein, if you compose linear transformations ${T}_{1},{T}_{2}$, the effect is the same as multiplying the corresponding matrices. This is the real reason matrix multiplication is defined the way it is.
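This composition rule gives a quick sanity check here: since transposing twice returns the original matrix, $F\circ F$ is the identity transformation on $V$, so the $9\times 9$ matrix $A$ must satisfy $AA={I}_{9}$. A small NumPy sketch (assuming the same row-major basis ordering as above) verifies this:

```python
import numpy as np

# Rebuild the matrix A of the transpose map on 3x3 matrices,
# with the basis e_11, ..., e_33 in row-major order.
A = np.zeros((9, 9))
for k in range(9):
    e = np.zeros((3, 3))
    e[k // 3, k % 3] = 1.0
    A[:, k] = e.T.flatten()

# F∘F transposes twice, i.e. it is the identity transformation,
# so the matrix of the composition, A @ A, must be the 9x9 identity.
print(np.allclose(A @ A, np.eye(9)))   # True
```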
