
Question
Transformation properties
asked 2021-02-11
Let \(T : U \rightarrow U\) be a linear transformation and let \(\beta\) be a basis of U. Define the determinant det(T) of T as
\(det(T) = det([T]_{\beta}).\)
Show that det(T) is well-defined, i.e. that it does not depend on the choice of the basis \(\beta\).
Prove that T is invertible if and only if det\((T) \neq 0.\) If T is invertible, show that det\((T^{-1}) = \frac{1}{det(T)}.\)

Answers (1)

2021-02-12
Step 1
We establish some basic facts about the determinant of a linear transformation \(T : U \rightarrow U\) (U a vector space).
Step 2
A linear transformation \(T : U \rightarrow U\) is a map satisfying the properties below.
\(T : U \rightarrow U\) (U a vector space) is linear if
\(T (v + w) = T (v) + T (w),\) and
\(T (cv) = cT (v),\) for all vectors v, w in U and scalars c.
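As a quick numerical illustration (not part of the original answer; the matrix and vectors below are made-up examples), any map of the form \(T(v) = Av\) satisfies both axioms:

```python
import numpy as np

# A matrix map T(v) = Av on R^2 is linear; check the two axioms numerically.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda v: A @ v

v = np.array([1.0, -2.0])
w = np.array([0.5, 4.0])
c = 3.0

# Additivity: T(v + w) == T(v) + T(w)
assert np.allclose(T(v + w), T(v) + T(w))
# Homogeneity: T(cv) == c T(v)
assert np.allclose(T(c * v), c * T(v))
```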
Step 3
Let B be a basis of U. Then T is represented as a square matrix of size n (n = dimension of U).
Let \(B = \{v_{1}, v_{2}, ..., v_{n}\}\) be a basis for U.
(The vectors in B are linearly independent and they span U.)
Then any linear map \(T : U \rightarrow U\) can be represented as a (unique) square matrix \(T_{B} = [b_{ij}]_{1 \leq i, j \leq n},\) defined by
\(T (v_{i}) = \sum_{j = 1}^{n} b_{ji} v_{j}.\)
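To make the matrix representation concrete, here is a small sketch with NumPy (the matrices are hypothetical examples, not from the original answer): if T acts as \(T(x) = Ax\) in standard coordinates and P has the basis vectors of B as its columns, then \([T]_{B} = P^{-1}AP\), and column i of \([T]_{B}\) holds the B-coordinates of \(T(v_{i})\).

```python
import numpy as np

# T(x) = Ax in standard coordinates; B = {v1, v2} a non-standard basis.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, -2.0]])   # columns are the basis vectors v1, v2

# Matrix of T with respect to B: [T]_B = P^{-1} A P.
T_B = np.linalg.inv(P) @ A @ P

# Column i of T_B gives the B-coordinates of T(v_i):
# expanding them back in the basis reproduces T(v_i) = A v_i.
for i in range(2):
    assert np.allclose(A @ P[:, i], P @ T_B[:, i])
```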
Step 4
With the above notation, we define det(T) = determinant of the matrix
\(T_{B} = [b_{ij}].\)
In order that det(T) is well-defined, we need to show that this definition does not depend on the choice of the basis B. In other words, if C is some other basis for U, we need to prove that
\(det(T_{B}) = det([b_{ij}]) = det([c_{ij}]) = det(T_{C}).\)
Let B and C be two bases of U.
We need to show det\((T_{B}) = det(T_{C}),\)
which in turn proves that det(T) is well-defined.
(det(T) is independent of the basis.)
Step 5
Here, we use the standard properties of determinants of matrices (in particular, det(XY) = det(X) det(Y)).
Proof: Let B and C be two bases of U.
Then \(T_{B} = P^{-1} T_{C} P,\)
where P is the change of basis matrix.
Now
det\((T_{B}) = det(P^{-1} T_{C} P) = det(P)^{-1} det(T_{C}) det(P) = det(T_{C}).\)
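The similarity invariance above can be checked numerically; the matrices below are arbitrary examples (any invertible P works), not from the original answer:

```python
import numpy as np

# det(P^{-1} T_C P) = det(P)^{-1} det(T_C) det(P) = det(T_C):
# the determinant is the same in every basis.
T_C = np.array([[4.0, 1.0],
                [2.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, -2.0]])   # any invertible change-of-basis matrix

T_B = np.linalg.inv(P) @ T_C @ P
assert np.isclose(np.linalg.det(T_B), np.linalg.det(T_C))
```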
Step 6
Proof that T is invertible iff det(T) is not zero:
\(T : U \rightarrow U\) invertible
\(\Leftrightarrow \exists\ S\ (the\ inverse\ of\ T)\ with\ TS = I\) (identity)
\(\Rightarrow det (TS) = det (I) = 1\)
\(\Rightarrow det (T) det (S) = 1\)
\(\Rightarrow det (T) \neq 0\)
Conversely, if det\((T_{B}) \neq 0,\) then the matrix \(T_{B}\) is invertible, and the linear map represented by \(T_{B}^{-1}\) is an inverse of T.
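Both directions can be illustrated numerically (the matrices are made-up examples; NumPy is not part of the original answer):

```python
import numpy as np

# Invertible case: det(T) != 0, and an inverse S with TS = I exists.
T = np.array([[2.0, 1.0],
              [1.0, 1.0]])
assert not np.isclose(np.linalg.det(T), 0.0)
S = np.linalg.inv(T)
assert np.allclose(T @ S, np.eye(2))

# Singular case: det(M) == 0, so no inverse can exist.
M = np.array([[1.0, 2.0],
              [2.0, 4.0]])    # second row is twice the first
assert np.isclose(np.linalg.det(M), 0.0)
```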
Step 7
Now, \(S = T^{-1}\) and
det\((T) det (S) = 1\) (from the above), so
det\((T^{-1}) = det (S) = \frac{1}{det(T)} = det (T)^{-1},\)
as required.
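A quick numeric check of this last identity (the matrix is an arbitrary example with nonzero determinant):

```python
import numpy as np

# det(T) det(T^{-1}) = det(I) = 1, so det(T^{-1}) = 1 / det(T).
T = np.array([[3.0, 1.0],
              [1.0, 1.0]])    # det(T) = 2
d = np.linalg.det(T)
d_inv = np.linalg.det(np.linalg.inv(T))

assert np.isclose(d * d_inv, 1.0)
assert np.isclose(d_inv, 1.0 / d)
```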
