Chelsea Lamb
2022-09-26
Answered

If the matrix of a linear transformation $T:{\mathbb{R}}^{n}\to {\mathbb{R}}^{n}$ with respect to some basis is symmetric, what does that say about the transformation? Is there a nice, simple way to interpret the transformation geometrically?


asijikisi67

Answered 2022-09-27
Author has **10** answers

If ${\mathbb{R}}^{n}$ is endowed with an inner product $\langle\,\cdot\,,\,\cdot\,\rangle$ and the matrix $A$ of $T$ is symmetric with respect to an orthonormal basis, then we have the important property that

$$\langle y,T(x)\rangle=\langle y,Ax\rangle={y}^{T}Ax={y}^{T}{A}^{T}x=(Ay{)}^{T}x=\langle Ay,x\rangle=\langle T(y),x\rangle;$$

in this case we say that $T$ itself is symmetric. There's too much to say about why these are important in a single post, but let me point out two useful facts:

(1) By the Spectral Theorem, $T$ is orthogonally diagonalizable, that is, $T$ is conjugate by an orthogonal transformation to a diagonal transformation.

(2) Suppose $x,y$ are eigenvectors of $T$. If they correspond respectively to distinct eigenvalues $\lambda ,\mu $, then we have

$$\lambda \langle x,y\rangle=\langle \lambda x,y\rangle=\langle T(x),y\rangle=\langle x,T(y)\rangle=\langle x,\mu y\rangle=\mu \langle x,y\rangle.$$

In particular, if $\lambda \ne \mu $ then $\langle x,y\rangle=0$; that is, the eigenspaces of $T$ are pairwise orthogonal.
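Both facts can be checked numerically. The sketch below (an illustration with an arbitrarily chosen symmetric matrix, using NumPy's `eigh`, which is designed for symmetric matrices) verifies that $A=QDQ^{T}$ with $Q$ orthogonal, and that eigenvectors for distinct eigenvalues are orthogonal:

```python
import numpy as np

# An arbitrarily chosen symmetric matrix with distinct eigenvalues.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 4.0, 9.0]])

# eigh is NumPy's eigensolver for symmetric matrices: it returns real
# eigenvalues and an orthogonal matrix Q whose columns are eigenvectors.
eigvals, Q = np.linalg.eigh(A)
D = np.diag(eigvals)

# Fact (1): A is orthogonally diagonalizable, A = Q D Q^T with Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(Q @ D @ Q.T, A)

# Fact (2): eigenvectors belonging to distinct eigenvalues are orthogonal.
for i in range(3):
    for j in range(i + 1, 3):
        assert abs(Q[:, i] @ Q[:, j]) < 1e-10
```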


asked 2021-09-18

I need to find an explicit description of Nul $A$ by listing vectors that span the null space.

$A=\left[\begin{array}{ccccc}1& 5& -4& -3& 1\\ 0& 1& -2& 1& 0\\ 0& 0& 0& 0& 0\end{array}\right]$
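As an illustrative sketch (assuming SymPy is available), `Matrix.nullspace()` returns exactly such a spanning list for the matrix above:

```python
from sympy import Matrix

# The matrix from the question.
A = Matrix([[1, 5, -4, -3, 1],
            [0, 1, -2, 1, 0],
            [0, 0, 0, 0, 0]])

# nullspace() returns a list of basis vectors that span Nul A.
basis = A.nullspace()

# rank A = 2 and A has 5 columns, so Nul A is 3-dimensional.
assert len(basis) == 3
# Every basis vector really is in the null space.
for v in basis:
    assert A * v == Matrix.zeros(3, 1)
```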

asked 2021-09-13

Suppose that A is row equivalent to B. Find bases for the null space of A and the column space of A.

$A=\left[\begin{array}{ccccc}1& 2& -5& 11& -3\\ 2& 4& -5& 15& 2\\ 1& 2& 0& 4& 5\\ 3& 6& -5& 19& -2\end{array}\right]$

$B=\left[\begin{array}{ccccc}1& 2& 0& 4& 5\\ 0& 0& 5& -7& 8\\ 0& 0& 0& 0& -9\\ 0& 0& 0& 0& 0\end{array}\right]$
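Since row operations change neither the null space nor the pivot positions, the pivot columns of $A$ (read off from the echelon form $B$) give a basis for Col $A$. A sketch with SymPy, assuming it is available, confirms the pivots:

```python
from sympy import Matrix

A = Matrix([[1, 2, -5, 11, -3],
            [2, 4, -5, 15, 2],
            [1, 2, 0, 4, 5],
            [3, 6, -5, 19, -2]])

# Row reduction finds the pivot columns; they match the echelon form B
# above, which has pivots in columns 0, 2 and 4 (0-indexed).
rref_form, pivots = A.rref()
assert pivots == (0, 2, 4)

# The corresponding columns of A itself form a basis for Col A, ...
col_basis = [A[:, j] for j in pivots]
# ... and nullspace() gives a basis for Nul A (5 columns - rank 3 = 2).
nul_basis = A.nullspace()
assert len(nul_basis) == 2
```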

asked 2021-06-13

For the matrix A below, find a nonzero vector in the null space of A and a nonzero vector in the column space of A

$A=\left[\begin{array}{cccc}2& 3& 5& -9\\ -8& -9& -11& 21\\ 4& -3& -17& 27\end{array}\right]$

Find a vector in the null space of A that is not the zero vector

$A=\left[\begin{array}{c}-3\\ 2\\ 0\\ 1\end{array}\right]$
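For the $3\times 4$ matrix at the top of this question, one way to produce such vectors (a sketch, assuming SymPy) is to take any null-space basis vector and any nonzero column:

```python
from sympy import Matrix

A = Matrix([[2, 3, 5, -9],
            [-8, -9, -11, 21],
            [4, -3, -17, 27]])

# Any basis vector of the null space is a nonzero vector in Nul A.
v_nul = A.nullspace()[0]
assert A * v_nul == Matrix.zeros(3, 1)
assert v_nul != Matrix.zeros(4, 1)

# Any nonzero column of A lies in Col A by definition.
v_col = A[:, 0]
assert v_col != Matrix.zeros(3, 1)
```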

asked 2022-07-29

Find the inverse of the following matrix $A$, if possible. Check that $A{A}^{-1}=I$ and ${A}^{-1}A=I$.

$A=\left[\begin{array}{cc}4& 8\\ -5& -10\end{array}\right]$

The inverse ${A}^{-1}=?$

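Before computing, it helps to check whether ${A}^{-1}$ exists at all; note that the second column of this $A$ is twice the first. A quick sketch, assuming SymPy:

```python
from sympy import Matrix

A = Matrix([[4, 8],
            [-5, -10]])

# A square matrix is invertible iff its determinant is nonzero.
# Here det A = 4*(-10) - 8*(-5) = -40 + 40 = 0, so A is singular
# and A^{-1} does not exist.
assert A.det() == 0
assert A.rank() == 1
```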

asked 2022-01-31

Can a matrix transformation ever make a linearly dependent matrix linearly independent?

For example, if $A$ is a matrix with linearly dependent columns, and $B$ is any matrix, could $BA$ ever have linearly independent columns?

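A rank argument suggests the answer is no: $\operatorname{rank}(BA)\le \operatorname{rank}(A)$, so multiplying on the left can never increase the number of independent columns. A small numerical sketch with arbitrarily chosen matrices:

```python
import numpy as np

# A has linearly dependent columns (second column = 2 * first).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
# An arbitrarily chosen B; every B obeys rank(BA) <= rank(A).
B = np.array([[3.0, 1.0],
              [0.0, 2.0]])

assert np.linalg.matrix_rank(A) == 1
assert np.linalg.matrix_rank(B @ A) <= np.linalg.matrix_rank(A)
```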

asked 2021-12-15

Is a matrix multiplied with its transpose something special?

In my math lectures, we talked about the Gram determinant, where a matrix is multiplied by its transpose.

Is $A{A}^{T}$ something special for any matrix $A$?

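Yes: $A{A}^{T}$ is always symmetric and positive semidefinite (it is the Gram matrix of the rows of $A$). A sketch with a randomly chosen rectangular $A$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))  # an arbitrary rectangular matrix

G = A @ A.T  # Gram matrix of the rows of A

# A A^T is symmetric: (A A^T)^T = A A^T ...
assert np.allclose(G, G.T)
# ... and positive semidefinite: x^T (A A^T) x = ||A^T x||^2 >= 0,
# so all eigenvalues are nonnegative.
assert np.all(np.linalg.eigvalsh(G) >= -1e-10)
```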

asked 2022-06-14

Transformation Matrix of a dot product transformation

Let $v$ be an arbitrary vector in ${\mathbb{R}}^{3}$ and let $T(x)=v\cdot x$. What is the matrix of the transformation $T$ in terms of the components of $v$? Trying to find the matrix from the equation $T(x)=Ax$ seems not to work, since the left side is a scalar while the right side is a matrix product. Any ideas?

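One resolution: the matrix of $T$ is the $1\times 3$ row vector $[v_{1}\;v_{2}\;v_{3}]$, so $Ax$ is a $1\times 1$ matrix whose single entry is the scalar $v\cdot x$. A sketch with an arbitrarily chosen $v$ and $x$:

```python
import numpy as np

# Illustrative choice of v; T(x) = v . x is linear from R^3 to R^1.
v = np.array([2.0, -1.0, 3.0])

# Its matrix w.r.t. the standard bases is the 1x3 row vector [v1 v2 v3].
A = v.reshape(1, 3)

x = np.array([1.0, 4.0, 2.0])
# Ax is a 1x1 "matrix" whose single entry equals the scalar v . x.
assert np.allclose(A @ x, [v @ x])
```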