
ngombangouh

Answered question

2022-09-01

Showing that det A = det B · det C when B is the restriction of A to an invariant subspace and C is the quotient transformation
I am a bit unsure about one approach that is mentioned to prove this determinant result.
Here is the quote from Pages 100-101 of Finite-Dimensional Vector Spaces by Halmos:
Here is another useful fact about determinants. If M is a subspace invariant under A, if B is the transformation A considered on M only, and if C is the quotient transformation A / M , then
det A = det B det C
This multiplicative relation holds if, in particular, A is the direct sum of two transformations B and C. The proof can be based directly on the definition of determinants, or, alternatively, on the expansion obtained in the preceding paragraph.
What I am confused about is how you can use the definition of determinants to conclude this result.
In this book, the determinant of a linear transformation A on an n-dimensional vector space V is defined as the (unique) scalar δ such that w(Ax_1, …, Ax_n) = δ · w(x_1, …, x_n) for every alternating n-linear form w and all vectors x_1, …, x_n in V.
It is then shown that, by fixing a coordinate system (or basis) and letting α_{ij} be the entries of the matrix of the linear transformation in that coordinate system, the determinant of A is

det A = ∑_π (sgn π) α_{π(1),1} ⋯ α_{π(n),n},

where the sum ranges over all permutations π in S_n.
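As an aside, the permutation expansion is easy to sanity-check numerically. Below is a minimal Python sketch of it; the function name `leibniz_det` and the test matrices are my own illustration, not from the book:

```python
from itertools import permutations

def leibniz_det(a):
    """Determinant of a square matrix `a` (a list of rows), computed
    directly from the permutation expansion:
        det A = sum over pi in S_n of sgn(pi) * a[pi(1)][1] * ... * a[pi(n)][n]
    """
    n = len(a)
    total = 0
    for pi in permutations(range(n)):
        # sgn(pi) via the inversion count of the permutation
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if pi[i] > pi[j])
        term = -1 if inversions % 2 else 1
        for col in range(n):
            term *= a[pi[col]][col]
        total += term
    return total

print(leibniz_det([[1, 2], [3, 4]]))  # -2, matching the familiar 2x2 formula
```

Note that this sum has n! terms, so it is only practical for small n; it is meant as a check of the formula, not as an algorithm.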
I have been able to use the expression involving the coordinates to show this result, but I am not sure about how this would be done directly from the definition. I have tried looking at defining other alternating forms and using their product to show this, but I was not able to make much use of that approach.
Are there any suggestions for proving this result directly from the definition?
Edit: I would like to add that part of my confusion may be from the fact that A, B and C are all linear transformations on different vector spaces and I am not sure how the definition can be used in this situation.

Answer & Explanation

Branson Grimes

Beginner · 2022-09-02 · Added 9 answers

Let d = dim(M). Let v_1, …, v_n be a basis of V such that v_1, …, v_d is a basis of M, and let ω be a non-zero alternating n-linear form on V. Then

(w_1, …, w_d) ↦ ω(w_1, …, w_d, v_{d+1}, …, v_n)

is an alternating d-linear form on M, and

(w_{d+1}, …, w_n) ↦ ω(v_1, …, v_d, w_{d+1}, …, w_n)

is (or rather induces) an alternating (n − d)-linear form on V/M: changing any w_j by an element of M does not change the value, since M is spanned by v_1, …, v_d and ω vanishes whenever its arguments are linearly dependent. Therefore
det(A) ω(v_1, …, v_n) = ω(Av_1, …, Av_n)
  = det(B) ω(v_1, …, v_d, Av_{d+1}, …, Av_n)
  = det(B) det(A/M) ω(v_1, …, v_n).

Here the first equality is the definition of det(A); the second applies the definition of det(B) to an alternating d-linear form on M of the first displayed type (with Av_{d+1}, …, Av_n in place of v_{d+1}, …, v_n), using that Av_1, …, Av_d ∈ M; and the third applies the definition of det(A/M) to the induced form on V/M. Since v_1, …, v_n is a basis and ω is non-zero, ω(v_1, …, v_n) ≠ 0, and it follows that

det(A) = det(B) det(A/M).
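In coordinates, an invariant subspace M makes the matrix of A block upper triangular, with the matrix of B in the top-left block and that of A/M in the bottom-right, so the identity reduces to the block-triangular determinant formula. A small Python sanity check of that coordinate version (the matrices below are arbitrary examples of my own, and `det` is a brute-force permutation-expansion helper):

```python
from itertools import permutations

def det(a):
    """Determinant of a square matrix `a` (list of rows) via the
    permutation expansion, summing sgn(pi) * prod a[pi(j)][j]."""
    n = len(a)
    total = 0
    for pi in permutations(range(n)):
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if pi[i] > pi[j])
        term = -1 if inversions % 2 else 1
        for col in range(n):
            term *= a[pi[col]][col]
        total += term
    return total

# M = span(e1, e2) is invariant: the first two columns of A have
# zeros below the top-left block.
B = [[1, 2], [3, 4]]   # A restricted to M
C = [[5, 6], [7, 8]]   # the quotient transformation A/M
X = [[9, 1], [2, 3]]   # arbitrary coupling block (does not affect det)
A = [B[0] + X[0],
     B[1] + X[1],
     [0, 0] + C[0],
     [0, 0] + C[1]]

assert det(A) == det(B) * det(C)   # 4 == (-2) * (-2)
```

Changing the coupling block X to anything else leaves det(A) unchanged, which mirrors the fact that the result depends only on B and A/M.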
