# Linear algebra questions and answers

Recent questions in Linear algebra
Liberty Mack 2022-05-20 Answered

### Does there exist a matrix $A$ for which $AM={M}^{T}$ for every $M$? The answer is obviously no, since I can vary the dimension of $M$. But this led me to think: suppose I restrict attention to $2×2$ matrices. Then for an invertible matrix $M$, $A={M}^{T}{M}^{-1}$, so $A$ is not fixed and depends on $M$; yet the operation satisfies all the conditions of a linear transformation, and I had read that any linear transformation can be represented as a matrix. So is that last statement wrong, or is my argument flawed?
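A quick numerical sketch of the point at issue (with a hypothetical example matrix): transposition *is* linear, but as a map on the 4-dimensional space of $2×2$ matrices, so its single fixed matrix is $4×4$ acting on $\operatorname{vec}(M)$, not a $2×2$ matrix acting by left multiplication.

```python
# vec() stacks a 2x2 matrix row by row into a length-4 vector.
def vec(M):
    return [M[0][0], M[0][1], M[1][0], M[1][1]]

def transpose(M):
    return [[M[0][0], M[1][0]], [M[0][1], M[1][1]]]

# Fixed permutation matrix K with K @ vec(M) == vec(M^T) for EVERY M:
# transposition swaps the two off-diagonal entries of vec(M).
K = [[1, 0, 0, 0],
     [0, 0, 1, 0],
     [0, 1, 0, 0],
     [0, 0, 0, 1]]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

M = [[1, 2], [3, 4]]
assert matvec(K, vec(M)) == vec(transpose(M))
```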

codosse2e 2022-05-19 Answered

### Finding an alternate transformation matrix for a similarity transformation. A pair of square matrices $X$ and $Y$ are called similar if there exists a nonsingular matrix $T$ such that ${T}^{-1}XT=Y$ holds. It is known that the transformation matrix $T$ is not unique for given $X$ and $Y$. I'm wondering whether those non-unique transformation matrices have any relation among themselves, such as having column vectors with the same directions. What I mean is: given a pair of similar matrices $X$ and $Y$, if $S$ and $T$ are two possible transformation matrices satisfying ${S}^{-1}XS={T}^{-1}XT=Y$, is there any generic relation (apart from scaling) between $T$ and $S$ (e.g., the directions of their column vectors)? For a specific example, consider $X=\left[\begin{array}{cc}A& BK\\ C& 0\end{array}\right]$ and $Y=\left[\begin{array}{cc}A+{A}^{-1}BKC& -{A}^{-1}BKC{A}^{-1}B\\ KC& -KC{A}^{-1}B\end{array}\right]$. Assuming $K$ to be invertible, it can be shown that $X$ and $Y$ are similar with transformation matrix $T=\left[\begin{array}{cc}I& -{A}^{-1}B\\ 0& {K}^{-1}\end{array}\right]$. Can there be any other matrix $S$, independent of $K$, that would result in ${S}^{-1}XS=Y$?
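A small numerical sketch of one known relation, using hypothetical matrices rather than the block matrices of the example: if ${S}^{-1}XS={T}^{-1}XT=Y$, then $C={T}^{-1}S$ satisfies ${C}^{-1}YC=Y$, i.e. $C$ commutes with $Y$; conversely $S=TC$ works for any invertible $C$ commuting with $Y$, so the valid transformation matrices form a coset of the commutant of $Y$.

```python
import numpy as np

# Hypothetical 2x2 example: X similar to Y via T.
X = np.array([[2.0, 1.0], [0.0, 3.0]])
T = np.array([[1.0, 1.0], [0.0, 1.0]])
Y = np.linalg.inv(T) @ X @ T

# Scalar matrices always commute with Y, so S = 2T also works.
S = T @ (2.0 * np.eye(2))
assert np.allclose(np.linalg.inv(S) @ X @ S, Y)

# A non-scalar choice: any (invertible) polynomial in Y commutes with Y.
C2 = Y + 3.0 * np.eye(2)
S2 = T @ C2
assert np.allclose(np.linalg.inv(S2) @ X @ S2, Y)
```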

Jazmine Bruce 2022-05-19 Answered

### What is the general transformation matrix for a rotation of angle $\theta$ about the origin? That is all the question says; can anyone who understands it better than me help me out?
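For reference, a minimal sketch of the standard counter-clockwise rotation matrix $\left(\begin{array}{cc}\mathrm{cos}\theta & -\mathrm{sin}\theta \\ \mathrm{sin}\theta & \mathrm{cos}\theta \end{array}\right)$ applied to a point:

```python
import math

# Counter-clockwise rotation by theta about the origin:
# (x', y') = (cos(t)*x - sin(t)*y, sin(t)*x + cos(t)*y)
def rotate(x, y, theta):
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)

# Rotating (1, 0) by 90 degrees should give (0, 1).
xp, yp = rotate(1.0, 0.0, math.pi / 2)
assert abs(xp) < 1e-12 and abs(yp - 1.0) < 1e-12
```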

Camille Flynn 2022-05-19 Answered

### The form $f\left(x\right){y}^{\left(n\right)}\left(x\right)+\dots +u\left(x\right)y\left(x\right)=h\left(x\right)$ is supposed to be the definition of linearity in differential equations. It excludes functions of $y$ on the right-hand side, but is multiplying $y$ by another function allowed on the right-hand side? It seems to be the case sometimes but not all of the time, as only composition of linear functions gives linear functions. Can't we just move it to the other side with the other $y$ terms and call that form linear as well?

Chaz Blair 2022-05-19 Answered

### Let $f:{\mathbb{R}}^{m}\to {\mathbb{R}}^{n}$ be a linear transformation. As is common knowledge, it can be expressed as an $n×m$ matrix.

Rachel Villa 2022-05-18 Answered

### I have a non-linear form of the Poisson equation (with a diffusion coefficient that is a function of the derivatives of the dependent variable) that I'm trying to solve numerically (using FEM, which requires the PDE to be posed in its weak form): $\mathrm{\nabla }\cdot \left(\eta \mathrm{\nabla }v\right)=G\left(x\right)$ where $\eta =\sqrt{1/2\left(\mathrm{\nabla }v:\mathrm{\nabla }v\right)}$ and $v=v\left(y,z\right)$. Multiplying by a test function $\theta$ and integrating over the domain $\mathrm{\Omega }$ gives the weak form ${\oint }_{\mathrm{\Gamma }}\theta \cdot \left(\eta \mathrm{\nabla }v\cdot \mathbf{n}\right)d\mathrm{\Gamma }-{\int }_{\mathrm{\Omega }}\left(\eta \mathrm{\nabla }v\cdot \mathrm{\nabla }\theta +G\right)d\mathrm{\Omega }=0$ However, none of my boundary conditions are of Dirichlet type, as I have ${\mathrm{\partial }}_{n}v=v$ on three of the boundaries of a rectangular domain and ${\mathrm{\partial }}_{n}v=0$ on the other ($\mathbf{n}$ is the normal vector to the surface). Is this problem actually solvable using FEM, or at all? I have the solution to a simpler problem which might be a reasonable initial guess at the solution here.

tinydancer27br 2022-05-18 Answered

### I am studying linear ODEs with non-constant coefficients. I know the first-order linear ODE with non-constant coefficient $\begin{array}{}\text{(1)}& {y}^{{}^{\prime }}\left(x\right)+f\left(x\right)y\left(x\right)=0\end{array}$ has a general solution of the form $\begin{array}{}\text{(2)}& y=C{e}^{-\int f\left(x\right)dx}\end{array}$ However, I am more interested in the case of linear second-order ODEs with non-constant coefficients $\begin{array}{}\text{(3)}& {y}^{{}^{″}}\left(x\right)+g\left(x\right){y}^{{}^{\prime }}\left(x\right)+f\left(x\right)y\left(x\right)=0\end{array}$ I know that this equation does not have a closed-form solution like (2). However, I am interested in special cases of it. Questions: 1. Consider (3) with $g\left(x\right)=0$, so that we have $\begin{array}{}\text{(4)}& {y}^{{}^{″}}\left(x\right)+f\left(x\right)y\left(x\right)=0\end{array}$ Is Eq. (4) a famous, well-known equation? If yes, what is its name? 2. Does (4) have a closed-form solution like (2)? 3. Can you name or give me a list of well-known linear second-order ODEs with non-constant coefficients which are not polynomial? For example, I know the Cauchy-Euler, Airy, Bessel, Chebyshev, Laguerre and Legendre equations, whose coefficients are polynomials. But I don't know any well-known equation with non-polynomial coefficients.

velitshh 2022-05-18 Answered

### Find the matrix of a linear transformation from the matrix of its composition. Let's say that $V$ is a vector space over the field $K$ and suppose we have a linear transformation $f\in \text{End}V$ whose matrix is known in some basis. How do we find the matrix of a linear transformation $g\in \text{End}V$ in the same basis such that $g\circ g=f$? For example, if $V$ is a vector space over the field ${\mathbb{Z}}_{11}$ and the matrix of the linear transformation $f\in \text{End}V$ in some basis is $\left(\begin{array}{cccc}\overline{7}& \overline{3}& \overline{4}& \overline{0}\\ \overline{0}& \overline{6}& \overline{9}& \overline{6}\\ \overline{1}& \overline{9}& \overline{3}& \overline{1}\\ \overline{0}& \overline{2}& \overline{8}& \overline{5}\end{array}\right)$ find the matrix of a linear transformation $g\in \text{End}V$ in the same basis such that $g\circ g=f$.

hushjelpw4 2022-05-18 Answered

### Find the covariance of a matrix transformation. $\operatorname{Var}\left(X\right)=\left[\begin{array}{cc}4& 4\\ 4& 16\end{array}\right]$ If ${Y}_{1}={X}_{1}+2$ and ${Y}_{2}={X}_{2}-{X}_{1}+1$, then how do I find the variance matrix for $Y$? I have tried the following, where I omitted the constants, as my guess is they don't affect the variance: $Cov\left({Y}_{1},{Y}_{1}\right)=Cov\left({X}_{1},{X}_{1}\right)=Var\left({X}_{1}\right)=4$ and $Cov\left({Y}_{2},{Y}_{2}\right)=Cov\left({X}_{2}-{X}_{1},{X}_{2}-{X}_{1}\right)=Cov\left({X}_{2},{X}_{2}\right)-Cov\left({X}_{2},{X}_{1}\right)-Cov\left({X}_{1},{X}_{2}\right)+Cov\left({X}_{1},{X}_{1}\right)=16-4-4+4=12$ giving Variance $=\left[\begin{array}{cc}4& 0\\ 0& 12\end{array}\right]$
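A quick numerical cross-check of this computation: writing $Y=AX+b$ with $A=\left[\begin{array}{cc}1& 0\\ -1& 1\end{array}\right]$ and $b=(2,1{)}^{\prime }$, the identity $\operatorname{Var}(Y)=A\operatorname{Var}(X){A}^{\prime }$ reproduces the diagonal entries above and confirms the constants drop out.

```python
import numpy as np

# Y = A X + b; the additive constant b does not affect the variance.
VarX = np.array([[4.0, 4.0], [4.0, 16.0]])
A = np.array([[1.0, 0.0], [-1.0, 1.0]])
VarY = A @ VarX @ A.T

# Matches the hand computation, including the zero off-diagonal terms.
assert np.allclose(VarY, [[4.0, 0.0], [0.0, 12.0]])
```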

lurtzslikgtgjd 2022-05-17 Answered

### I am trying to show that the functions ${t}^{3}$ and $|t{|}^{3}$ are linearly independent on the whole real line. To do this, I try to prove it by contradiction. So assume that they are dependent. Then there must exist constants $a$ and $b$, not both zero, such that $a{t}^{3}+b|t{|}^{3}=0$ for all $t\in \left(-\mathrm{\infty },\mathrm{\infty }\right)$. Now pick two points $x$ and $y$ in this interval and assume without loss of generality that $x<0$, $y\ge 0$. Now form the simultaneous linear equations $a{x}^{3}+b|x{|}^{3}=0$ and $a{y}^{3}+b|y{|}^{3}=0$, viz.$\left[\begin{array}{cc}{x}^{3}& |x{|}^{3}\\ {y}^{3}& |y{|}^{3}\end{array}\right]\left[\begin{array}{c}a\\ b\end{array}\right]=\left[\begin{array}{c}0\\ 0\end{array}\right]$Now here's my problem. If I look at the determinant of the coefficient matrix of this system, namely ${x}^{3}|y{|}^{3}-{y}^{3}|x{|}^{3}$, and note that $x<0$ and $y>0$, the determinant is nonzero, which implies that the only solution is $a=b=0$, i.e. the functions ${t}^{3}$ and $|t{|}^{3}$ are linearly independent. However, what happens if indeed $y=0$? Then the determinant of the matrix is $0$ and I have a problem. Is there something that I am not getting from the definition of linear independence? The definition (I hope I state this correctly) is: if $f$ and $g$ are two functions such that the only solution to $af+bg=0$ on an interval $I$ is $a=b=0$, then the two functions are linearly independent. But what happens if my functions pass through the origin, like the above? Then I've just shown that there exists a $t$ in an interval containing zero such that the two functions are zero, viz. I can plug in any $a$ and $b$ such that $af+bg=0$ at that point.
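A small numerical sketch of the determinant argument: linear dependence would require the relation to hold at *every* $t$, so it is enough to find *some* pair of points where the determinant is nonzero; a degenerate pair like $(x,0)$ does not spoil anything.

```python
# Determinant of the 2x2 system at sample points x and y.
def det(x, y):
    return x**3 * abs(y)**3 - y**3 * abs(x)**3

# For x < 0 < y the determinant is strictly negative, forcing a = b = 0.
assert det(-1.0, 2.0) < 0
assert det(-3.0, 0.5) < 0

# With y = 0 the second equation is trivial and the determinant vanishes,
# but that pair simply gives no information; another pair still works.
assert det(-1.0, 0.0) == 0
```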

Cesar Mcguire 2022-05-17 Answered

### I have to find the components of a matrix for a 3D transformation. I have a first system in which transformations are composed by multiplying: ${M}_{1}=\left[Translation\right]×\left[Rotation\right]×\left[Scale\right]$ I want to have the same transformations in an engine that composes them like this: ${M}_{2}=\left[Rotation\right]×\left[Translation\right]×\left[Scale\right]$ So when I enter the same values there's a problem, due to the swap of translation and rotation. How can I compute the values in the second matrix ${M}_{2}$ so that it produces the same transformation?
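A sketch of one way to do this, in 2D homogeneous coordinates with hypothetical values: requiring $R\,{T}_{2}\,S=T\,R\,S$ gives ${T}_{2}={R}^{-1}TR$, which is again a translation (by the original offset rotated by ${R}^{-1}$); the same algebra carries over to 3D $4×4$ matrices.

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])   # rotation
T = np.array([[1.0, 0.0, 3.0], [0.0, 1.0, -2.0], [0.0, 0.0, 1.0]])  # translation
S = np.diag([2.0, 0.5, 1.0])                                 # scale

# Solve R @ T2 = T @ R for the adjusted translation T2.
T2 = np.linalg.inv(R) @ T @ R

M1 = T @ R @ S
M2 = R @ T2 @ S
assert np.allclose(M1, M2)
```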

sg101cp6vv 2022-05-16 Answered

### I have a hard time finding the analytical solution to the following non-linear equation: $\left(1+x{\right)}^{1-p}+{p}^{\frac{p}{1-p}}x\left(p-1\right)-1=0$ where $p\in \left]0,1\right[$ and $x>1$. I would like to have a solution $x$ in terms of $p$ for each fixed $p$ in the specified interval. Of course it's possible that no analytical form of the solution exists. If so, I'd also be happy to hear an argument why that is the case. However, for some values of $p$ the solution is 'nice': for example, for $p=\frac{1}{2}$ it's easy to compute that $x=8$, and for $p=\frac{1}{3}$ I obtained $x=\frac{9}{16}\left(5\sqrt{3}+\sqrt{11+64\sqrt{3}}\right)$ Any kind of help is greatly appreciated!
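A numerical sanity check of the $p=\frac{1}{2}$ case, using plain bisection (no claim about closed forms for other $p$): for $p=\frac{1}{2}$ the equation reduces to $\sqrt{1+x}-x/4-1=0$, whose root with $x>1$ is indeed $x=8$.

```python
# Left-hand side of the equation for given x and p.
def f(x, p):
    return (1 + x) ** (1 - p) + p ** (p / (1 - p)) * x * (p - 1) - 1

# Plain bisection on [lo, hi]; assumes a sign change on the interval.
def bisect(p, lo, hi):
    flo = f(lo, p)
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(mid, p) * flo > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# f(1, 0.5) > 0 and f(20, 0.5) < 0, so the root lies in (1, 20).
assert abs(bisect(0.5, 1.0, 20.0) - 8.0) < 1e-9
```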

kazue72949lard 2022-05-15 Answered

### Which transformations are included in this matrix? Does this matrix transformation correspond to: 1) rotation and scaling, 2) translation and scaling, 3) scaling and rotation, or 4) none of 1, 2, 3? $\left[\begin{array}{ccc}1& 0& 1\\ 0& 0& 1\\ 0& 0& 1\end{array}\right]$
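One hedged observation that narrows the options: rotations, (nonzero) scalings, and translations in homogeneous form are all invertible, so any product of them has nonzero determinant, whereas this matrix is singular (its second and third rows coincide).

```python
# Determinant of a 3x3 matrix by cofactor expansion along the first row.
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

M = [[1, 0, 1],
     [0, 0, 1],
     [0, 0, 1]]

# Singular, so it cannot be a product of rotations, scalings, translations.
assert det3(M) == 0
```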

sg101cp6vv 2022-05-15 Answered

### Let $T:{\mathbb{R}}^{2}\to {\mathbb{R}}^{3}$ be a linear transformation with $T\left(-2,3\right)=\left(-1,0,1\right)$ and $T\left(1,2\right)=\left(0,-1,0\right)$. Obtain the canonical matrix of $T$ and the transformation $T\left(x,y\right)$.
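A sketch of the standard method: the canonical matrix $M$ satisfies $MB=V$, where $B$ has the given input vectors as columns and $V$ has their images as columns; $B$ is invertible, so $M=V{B}^{-1}$.

```python
import numpy as np

B = np.array([[-2.0, 1.0],
              [3.0, 2.0]])          # columns: (-2,3) and (1,2)
V = np.array([[-1.0, 0.0],
              [0.0, -1.0],
              [1.0, 0.0]])          # columns: their images under T

M = V @ np.linalg.inv(B)            # canonical 3x2 matrix of T

# M reproduces the two given values of T.
assert np.allclose(M @ np.array([-2.0, 3.0]), [-1.0, 0.0, 1.0])
assert np.allclose(M @ np.array([1.0, 2.0]), [0.0, -1.0, 0.0])
```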

Jaylene Duarte 2022-05-15 Answered

### Let $\mathrm{\Sigma }$ be a symmetric positive definite matrix with ones on the diagonal (= correlation matrix). Let $A$ be an invertible matrix. I'm pretty sure that if $\mathrm{\Omega }:=A\mathrm{\Sigma }{A}^{t}$ has ones on its diagonal, then … (which would correspond to $A=Id$ or $A=chol\left(\mathrm{\Sigma }\right)$). But I don't know how to prove it.

fetsBedscurce4why1 2022-05-14 Answered

### Let $B=\left\{{u}_{1},{u}_{2},{u}_{3}\right\}$ with ${u}_{1}=1$, ${u}_{2}=x$, ${u}_{3}={x}^{2}$ denote a basis for the space ${P}^{2}$ of second-order polynomials. Let $T\left({a}_{0}+{a}_{1}x+{a}_{2}{x}^{2}\right)={a}_{0}+{a}_{1}\left(x-1\right)+{a}_{2}\left(x-1{\right)}^{2}$ be a linear transformation from ${P}^{2}$ to itself. Construct the representation matrix $\left[T{\right]}_{BB}$.
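A sketch of the standard construction: the columns of $\left[T{\right]}_{BB}$ are the coordinates of $T\left({u}_{i}\right)$ in the basis $\left\{1,x,{x}^{2}\right\}$, and here $T\left(1\right)=1$, $T\left(x\right)=x-1$, $T\left({x}^{2}\right)=1-2x+{x}^{2}$.

```python
# Representation matrix: columns are coordinates of T(1), T(x), T(x^2).
T_BB = [[1, -1, 1],
        [0, 1, -2],
        [0, 0, 1]]

def apply(M, coeffs):
    return [sum(M[i][j] * coeffs[j] for j in range(3)) for i in range(3)]

# Check on p(x) = 2 + 3x + x^2:
# T(p) = 2 + 3(x-1) + (x-1)^2 = 0 + x + x^2.
assert apply(T_BB, [2, 3, 1]) == [0, 1, 1]
```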

ureji1c8r1 2022-05-13 Answered

### We know that if we want to reflect any point over the origin $O\left(0,0\right)$, we can use the matrix transformation $\left(\begin{array}{c}{x}^{\prime }\\ {y}^{\prime }\end{array}\right)=\left(\begin{array}{cc}-1& 0\\ 0& -1\end{array}\right)\left(\begin{array}{c}x\\ y\end{array}\right)=\left(\begin{array}{c}-x\\ -y\end{array}\right).$ But what if we reflect a point over another point $M\left(a,b\right)$ with $a,b\ne 0$?
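A sketch of the standard trick (with arbitrarily chosen $a,b$): reflection through $M\left(a,b\right)$ sends $\left(x,y\right)$ to $\left(2a-x,2b-y\right)$, which is an affine map; in homogeneous coordinates it becomes a single $3×3$ matrix.

```python
# Reflection through the point (a, b) in homogeneous coordinates.
a, b = 3.0, -1.0
R = [[-1.0, 0.0, 2 * a],
     [0.0, -1.0, 2 * b],
     [0.0, 0.0, 1.0]]

def apply(M, p):
    x, y = p
    v = [x, y, 1.0]
    out = [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]
    return (out[0], out[1])

# (x, y) maps to (2a - x, 2b - y): M is the midpoint of point and image.
assert apply(R, (5.0, 2.0)) == (2 * a - 5.0, 2 * b - 2.0)
```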

kwisangqaquqw3 2022-05-13 Answered

### How do I formulate a transformation matrix for the following operation? All the examples I found are different, and I can't understand how to solve this problem: $y=A\cdot x$ with $y=\left({y}_{1},{y}_{2}{\right)}^{\prime }$ and $x=\left({x}_{1},{x}_{2}{\right)}^{\prime }$, where the operation sends $\left({x}_{1},{x}_{2}{\right)}^{\prime }$ to $\left(-{x}_{1},-{x}_{2}{\right)}^{\prime }$.

Deshawn Cabrera 2022-05-13 Answered

2022-05-12

### $y\le 4$

Finding detailed linear algebra problems and solutions has always been difficult, because textbooks rarely provide enough worked examples. Since linear algebra is used not only by engineering students but by anyone who works with specific calculations, we have provided a plethora of questions and answers in their original form. They will help you see the logic as you work through complex problems and understand the basic concepts of linear algebra more clearly. If you need additional help, or would like to connect several solutions, compare more than one solution as you approach your task.