dokezwa17

Answered question

2022-05-28

A similarity/metric learning method that takes the form $x^T W y = z$, where $x$ and $y$ are real-valued vectors — for example, two images.
Breaking it into a more familiar form:
$x^T W y = \sum_{i,j} w_{ij} x_i y_j = z$
This looks very similar to polynomial regression with only interaction terms between features (without the higher-order polynomial terms), i.e.
$z = f_w(x) = \sum_i w_i x_i + \sum_i \sum_{j=i+1} w_{ij} x_i x_j$
I was curious whether optimizing the matrix $W$ is the same as optimizing a multivariate linear/polynomial regression, since $x$ and $y$ are fixed and the only variable is the weight matrix $W$.
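To make the expansion above concrete, here is a minimal numpy sketch (all values synthetic) verifying that the bilinear form $x^T W y$ equals the double sum of interaction terms $w_{ij} x_i y_j$:

```python
import numpy as np

# Synthetic example: the bilinear form x^T W y expands into
# a sum of pairwise interaction terms w_ij * x_i * y_j.
rng = np.random.default_rng(0)
p = 4
x = rng.standard_normal(p)
y = rng.standard_normal(p)
W = rng.standard_normal((p, p))

bilinear = x @ W @ y                      # x^T W y
expanded = sum(W[i, j] * x[i] * y[j]      # sum over i, j of w_ij x_i y_j
               for i in range(p) for j in range(p))

assert np.isclose(bilinear, expanded)
```

Each term $w_{ij} x_i y_j$ is an interaction between one coordinate of $x$ and one coordinate of $y$, which is what makes the connection to interaction-only regression plausible.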

Answer & Explanation

concludirgt

Beginner · 2022-05-29 · Added 11 answers

If you have only one vector $(z, x_1, \dots, x_p, y_1, \dots, y_p)$ and you want to find a matrix $W$ such that $z = x^T W y$, then this is a simple linear equation with an infinite set of solutions, i.e., infinitely many $W$s can solve it. If you have $p(p+1)/2$ distinct vectors of that kind, i.e.,
$\{(z_i, x_1^{(i)}, \dots, x_p^{(i)}, y_1^{(i)}, \dots, y_p^{(i)})\}_{i=1}^{p(p+1)/2},$
then you'll have a unique solution (no optimization is required, just inverting matrices by whatever method you like). And if you have more than $p(p+1)/2$ distinct such vectors, then this is a linear regression problem, i.e.,
$z_i = \sum_{j,k} \beta_{jk} \, x_j^{(i)} y_k^{(i)} + \epsilon_i,$
which you can solve using ordinary least squares, which is indeed a convex optimization problem.
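The reduction to ordinary least squares can be sketched in numpy: each sample's flattened outer product $x^{(i)} (y^{(i)})^T$ becomes one row of the design matrix, and $\mathrm{vec}(W)$ is the coefficient vector. This sketch treats $W$ as a general $p \times p$ matrix (so $p^2$ coefficients rather than $p(p+1)/2$, which applies when $W$ is assumed symmetric); all data is synthetic and noiseless.

```python
import numpy as np

# Sketch: recover a general p x p matrix W from n > p^2 samples
# by regressing z on the outer-product features x_j * y_k.
rng = np.random.default_rng(1)
p, n = 3, 50
W_true = rng.standard_normal((p, p))
X = rng.standard_normal((n, p))   # rows are samples x^(i)
Y = rng.standard_normal((n, p))   # rows are samples y^(i)
z = np.einsum('ni,ij,nj->n', X, W_true, Y)   # z_i = x^(i)^T W y^(i)

# Design matrix: row i is the flattened outer product x^(i) y^(i)^T.
A = np.einsum('ni,nj->nij', X, Y).reshape(n, p * p)
w_hat, *_ = np.linalg.lstsq(A, z, rcond=None)
W_hat = w_hat.reshape(p, p)

# Noiseless data, so recovery is exact up to floating point.
assert np.allclose(W_hat, W_true)
```

With noisy $z$, the same `lstsq` call gives the OLS estimate of $\mathrm{vec}(W)$, which matches the answer's point that this is an ordinary convex least-squares problem rather than anything specific to metric learning.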
