I have a set S of coordinates (x, y) and am estimating f(x) = ax + b where a > 0, given that ∀x,y ((x,y) ∈ S ⟹ y < f(x)).

pominjaneh6 2022-08-12 Answered
I have a set $S$ of coordinates $(x, y)$, and am estimating $f(x) = ax + b$ where $a > 0$. I also happen to know that $\forall x, y\,((x, y) \in S \implies y < f(x))$.
The question is: how can I utilize this knowledge of the upper bound on the values to improve the regression result?
My intuition is to run a "normal" linear regression on all coordinates in $S$, giving $g(x)$, and then construct $g'(x) = g(x) + c$, with $c$ being the lowest number such that $\forall x, y\,((x, y) \in S \implies y \le g'(x))$, i.e. such that $g'(x)$ lies as high as it can whilst still touching at least one point in $S$. I do, however, have absolutely no idea if this is the best way to do it, nor how to devise an algorithm that does this efficiently.

Answers (1)

Kody Larsen
Answered 2022-08-13 Author has 11 answers
Your suggestion to do an ordinary regression and then move it up is a fine way to go about it. This is pretty easy, especially if you have a decent statistics library:
1. Fit the regression.
2. Calculate the residuals: the difference between the $y$ coordinate of each point and the $y$ coordinate of the line at that point, given by $\hat\beta_0 + \hat\beta_1 x_i$, where $x_i$ is the $x$ coordinate of the point. There may be a built-in way to do this. There are matrix-algebra representations as well.
3. Find the largest residual. Not largest in absolute value, just straight-up largest.
4. Add the value of the largest residual to the intercept $\hat\beta_0$ of your regression model.
Your regression line now passes through the highest point and is above all the other points.
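In Python, a minimal sketch of these four steps might look like the following (assuming numpy is available; the function name upper_bound_line is just illustrative):

    import numpy as np

    def upper_bound_line(x, y):
        """Fit an OLS line, then shift it up so it bounds all points from above."""
        # 1. Fit the regression: least-squares slope and intercept.
        slope, intercept = np.polyfit(x, y, deg=1)
        # 2. Residuals: each point's y minus the line's y at that x.
        residuals = y - (intercept + slope * x)
        # 3. The largest signed residual (not largest in absolute value).
        c = residuals.max()
        # 4. Add it to the intercept; the shifted line touches the
        #    highest-lying point and is above all the others.
        return slope, intercept + c

For example, upper_bound_line(np.array([0., 1., 2.]), np.array([1., 0., 2.])) returns (0.5, 1.0), i.e. the line $y = 0.5x + 1$, which touches the points at $x = 0$ and $x = 2$ and lies above the one at $x = 1$.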

You might be interested in

asked 2022-06-15
I have a regression $x(t) = at + b$, the number of trials, and the $R^2$ of the regression. How do I find the value and the 95% confidence interval for the value of $V = x/t$?
asked 2022-05-28
A similarity/metric learning method takes the form $x^T W y = z$, where $x$ and $y$ are real-valued vectors (for example, two images).
Breaking it into a more familiar form:
$$x^T W y = \sum_i \sum_j w_{ij} x_i y_j = z$$
This looks very similar to polynomial regression with only interactions between features (without the polynomials), i.e.
$$z = f_w(x) = \sum_i w_i x_i + \sum_i \sum_{j=i+1} w_{ij} x_i x_j$$
I was curious to see whether the optimization for the matrix $W$ is the same as doing optimization for multivariate linear/polynomial regression, since $x$ and $y$ are fixed and the only variable is the weight matrix $W$?
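One way to make the connection precise (a sketch using the standard vectorization identity, not from the original post): the bilinear form is linear in the entries of $W$, so for fixed $x$ and $y$ fitting $W$ is an ordinary linear regression on outer-product features:
$$x^T W y = \sum_{i,j} w_{ij} x_i y_j = \operatorname{vec}(W)^T (y \otimes x),$$
so each training pair contributes one linear equation in the unknown coefficient vector $\operatorname{vec}(W)$.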
asked 2022-07-24

In a poker hand consisting of 5 cards, find the probability of holding (a) 3 aces (b) 4 hearts and 1 club (c) Cards of same suit (d) 2 aces and 3 jacks
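As a worked example for part (a), not from the original post: a hand with exactly 3 aces takes 3 of the 4 aces and 2 of the 48 non-aces, so
$$P(\text{3 aces}) = \frac{\binom{4}{3}\binom{48}{2}}{\binom{52}{5}} = \frac{4 \cdot 1128}{2598960} \approx 0.0017.$$
The other parts follow the same counting pattern.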

asked 2022-07-09
In logistic regression, the regression coefficients $(\hat\beta_0, \hat\beta_1)$ are calculated via the general method of maximum likelihood. For a simple logistic regression, the maximum likelihood function is given as
$$\ell(\beta_0, \beta_1) = \prod_{i: y_i = 1} p(x_i) \prod_{i: y_i = 0} (1 - p(x_i)).$$
What is the maximum likelihood function for 2 predictors? Or 3 predictors?
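For what it's worth, the textbook generalization (a sketch, not from the original post) keeps the same product form and only changes how $p(x_i)$ is parameterized; with two predictors $x_{i1}$ and $x_{i2}$:
$$\ell(\beta_0, \beta_1, \beta_2) = \prod_{i: y_i = 1} p(x_i) \prod_{i: y_i = 0} (1 - p(x_i)), \qquad p(x_i) = \frac{e^{\beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2}}}{1 + e^{\beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2}}}.$$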
asked 2022-06-15
Let's say we have two random variables $Y$ and $X$ used to form the regression model
$$Y = \alpha + \beta X + \mu$$
It also holds that $E(\mu) = 0$, $\mathrm{Var}(\mu) = \sigma_\mu^2$, $\mathrm{Var}(X) = \sigma_X^2$, $\mathrm{Var}(Y) = \sigma_Y^2$, $\mathrm{Corr}(X, Y) = r$ and $\mathrm{Corr}(X, \mu) = r_{X\mu}$. Find $\beta$. I tried to solve this as follows:
For simple linear regression $\beta = \frac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(X)}$ and $\mathrm{Corr}(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y} = r$, so that:
$$\beta = \frac{\mathrm{Corr}(X, Y)\,\sigma_X \sigma_Y}{\sigma_X^2} = r\,\frac{\sigma_Y}{\sigma_X}$$
Is it really as simple as this?
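One step worth spelling out (a sketch, since the problem explicitly provides $\mathrm{Corr}(X, \mu) = r_{X\mu}$): taking the covariance of both sides of the model with $X$ gives
$$\mathrm{Cov}(X, Y) = \beta\,\mathrm{Var}(X) + \mathrm{Cov}(X, \mu) = \beta\sigma_X^2 + r_{X\mu}\,\sigma_X \sigma_\mu,$$
so the simple formula $\beta = \mathrm{Cov}(X, Y)/\mathrm{Var}(X)$ holds exactly when $r_{X\mu} = 0$.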
asked 2022-06-24
Let a sample $(x, y) \in \mathbb{R}^{2n}$ be given, where $y$ only attains the values 0 and 1. We can try to model this data set by either linear regression
$$y_i = \alpha_0 + \beta_0 x_i$$
with the coefficients determined by the method of least squares, or by logistic regression
$$\pi_i = \frac{\exp(\alpha_1 + \beta_1 x_i)}{1 + \exp(\alpha_1 + \beta_1 x_i)},$$
where $\pi_i$ denotes the probability that $y_i = 1$ given the value $x_i$ and the coefficients are determined by the maximum-likelihood method. My question is whether the following statement holds true.
Claim: If $\beta_0 > 0$ ($\beta_0 < 0$), then $\beta_1 > 0$ ($\beta_1 < 0$).
I figure this could be due to the sign of the correlation coefficient.

New questions

Euclid's view and Klein's view of Geometry and Associativity in Group
One common object in the study of Euclidean geometry (Euclid's view) is the "congruence" relation, specifically "congruence of triangles". We know that this congruence relation is an equivalence relation:
Every triangle is congruent to itself.
If triangle $T_1$ is congruent to triangle $T_2$, then $T_2$ is congruent to $T_1$.
If $T_1$ is congruent to $T_2$ and $T_2$ is congruent to $T_3$, then $T_1$ is congruent to $T_3$.
This congruence relation (from Euclid's view) can be translated into a relation coming from "groups". Let $\mathrm{Iso}(\mathbb{R}^2)$ denote the set of all isometries of the Euclidean plane (= distance-preserving maps from the plane to itself). Then the above relations may be understood from Klein's view as:
$\exists$ an identity element in $\mathrm{Iso}(\mathbb{R}^2)$ which takes every triangle to itself.
If $g \in \mathrm{Iso}(\mathbb{R}^2)$ is an element taking triangle $T_1$ to $T_2$, then $g^{-1} \in \mathrm{Iso}(\mathbb{R}^2)$ takes $T_2$ to $T_1$.
If $g \in \mathrm{Iso}(\mathbb{R}^2)$ takes $T_1$ to $T_2$ and $h \in \mathrm{Iso}(\mathbb{R}^2)$ takes $T_2$ to $T_3$, then $hg \in \mathrm{Iso}(\mathbb{R}^2)$ takes $T_1$ to $T_3$.
One can see that in Klein's view, the three axioms in the definition of a group appear. But the definition of a "group" also includes "associativity", which is not needed in the above translation of Euclid's view into Klein's view of geometry.
Question: What is the reason for introducing associativity in the definition of a group? If we look at geometry from Klein's view, does "associativity" of the group put a restriction on the geometry?
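One observation that may be relevant here (a standard fact, not from the original question): for transformations, associativity comes for free, because composition of maps is always associative. For any maps $f, g, h$ and any point $x$,
$$((h \circ g) \circ f)(x) = h(g(f(x))) = (h \circ (g \circ f))(x),$$
so both sides are the same isometry.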