Writing a composite transformation as a matrix multiplication

maliaseth0 2022-02-01 Answered
I have two matrices, P and Q as follows:
P=\begin{pmatrix} \frac{1}{2} & -\frac{\sqrt{3}}{2} \\ \frac{\sqrt{3}}{2} & \frac{1}{2} \end{pmatrix}
Q=\begin{pmatrix} -\frac{1}{2} & -\frac{\sqrt{3}}{2} \\ \frac{\sqrt{3}}{2} & -\frac{1}{2} \end{pmatrix}
Expert Answer

fumanchuecf
Answered 2022-02-02 Author has 21 answers
Step 1
Matrices work like functions. If you have functions f: A → B and g: B → C, and an element x ∈ A, we first apply f to get f(x), then g to get g(f(x)) = (g∘f)(x). Notice that the order is different: we applied f first, then g, but the result ended up being written g∘f. It is the same for matrices.
Extra info for you: the matrix of rotation by an angle θ is given by
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
This way, P is rotation by 60° and Q is rotation by 120°.
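
To make this concrete, here is a small NumPy sketch (my own addition, not part of the answer above): it builds P and Q from the rotation formula, checks that applying P first and then Q to a test point is the same as multiplying by the single matrix QP, and that QP is rotation by 180°. The helper name rotation and the test point are illustrative choices.

```python
import numpy as np

def rotation(theta):
    """Standard 2x2 rotation matrix for an angle theta, in radians."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

P = rotation(np.deg2rad(60))    # rotation by 60 degrees
Q = rotation(np.deg2rad(120))   # rotation by 120 degrees

p = np.array([1.0, 0.0])        # an arbitrary test point

# Apply P first, then Q, one step at a time ...
step_by_step = Q @ (P @ p)

# ... versus multiplying the matrices first: note the order is Q @ P, not P @ Q.
composite = (Q @ P) @ p

print(np.allclose(step_by_step, composite))           # True
print(np.allclose(Q @ P, rotation(np.deg2rad(180))))  # True: 60 + 120 = 180 degrees
```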
lilwhitelieyc
Answered 2022-02-03 Author has 10 answers
Step 1
Here's the deal: operating on a point p by a transformation T is written T(p). That is, the point operated on goes to the right of the function being applied. If you wanted to then take the result,
q=T(p)
and apply a transformation S, you'd write S(q). Substituting, you'd find that this was
S(T(p))
Many important transformations can be written in the form of multiplication by a matrix, and one might conjecture that the definition of matrix multiplication was made so that
T(p)=Mp
where M is a matrix, would work out nicely. Given that, if T(p)=Mp and S(q)=Kq, you get that
S(T(p))=K(Mp)=(KM)p
So yes, either in applying transformations OR in multiplying matrices for transformations (at least when points are represented by column vectors), you write the matrix for the first operation to the left of the point, the matrix for the next operation to the left of that, and so on.
Foley and van Dam, in their Computer Graphics text (first edition), used ROW vectors for point-coordinates; in that situation, the matrices go on the right, and you can read operations/matrices from right to left, which seems nice. Unfortunately, if you do that, then operations on covectors (like the normal vector to a surface) end up reading left-to-right, so you don't really win. You also, in using row vectors, go against the conventions widely used in math courses, which just messes up your students. That's why in later editions (I confess I'm a co-author) we went with column vectors.
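
Here is a short NumPy sketch (again my own addition) contrasting the two conventions described above; K and M are arbitrary example transformations. With column vectors the composite is (KM)p with the matrices stacking on the left; with row vectors the same composite reads pᵀMᵀKᵀ, left to right.

```python
import numpy as np

K = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # example: rotation by 90 degrees
M = np.array([[2.0, 0.0],
              [0.0, 3.0]])    # example: non-uniform scaling

p_col = np.array([[1.0],
                  [1.0]])     # the point as a 2x1 column vector

# Column-vector convention: matrices stack up on the LEFT, read right to left.
col_result = (K @ M) @ p_col            # same as K @ (M @ p_col)

# Row-vector convention (as in Foley & van Dam, 1st ed.): the point is a row
# vector, the matrices are transposed, and they go on the RIGHT, left to right.
p_row = p_col.T
row_result = (p_row @ M.T) @ K.T

print(np.allclose(col_result.T, row_result))   # True: same transformation
```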
