
+4

Least square method

In the regression line y = mx + b, why is

m = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)²

Where does this formula come from, and why is it equal to m?

9/19/2020 3:10:19 PM

Rajababu Shah

12 Answers


+4

You want to minimise the sum of squared errors. How do you do that? As usual, take the first derivative and set it to zero. Your error function is a function of two variables, m and b:

J(m, b) = Σᵢ (m·xᵢ + b − yᵢ)²

Take the partial derivatives with respect to m and b and set them to zero. That gives you two equations for two unknowns. With a fair amount of algebra, and remembering that n·x̄ = Σᵢ xᵢ (and similarly for ȳ), you arrive at expressions for both b and m.
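To make that concrete, here is a minimal Python sketch (the function name slope_intercept and the data are mine, not from any library) that evaluates the closed-form expressions the derivation produces and checks them against numpy.polyfit:

import numpy as np

def slope_intercept(x, y):
    # Closed-form least-squares solution obtained by setting
    # dJ/dm = 0 and dJ/db = 0 and solving the two equations.
    xbar, ybar = x.mean(), y.mean()
    m = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    b = ybar - m * xbar  # the b-equation reduces to this
    return m, b

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

print(slope_intercept(x, y))  # manual formulas
print(np.polyfit(x, y, 1))    # NumPy's fit returns [m, b]; should agree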

+6

edit: Rajababu it's nicely explained here: https://www.mathsisfun.com/data/least-squares-regression.html

+6

I don't know, but I think Google can help you: https://www.google.com/search?rlz=1C1AVNG_enBD832BD835&sxsrf=ALeKk03cpy3iZ6lCGOi3tVSSq0BsewtgcQ%3A1600528939006&ei=KiJmX4f6PMHEmAXApLygAw&q=how+does+least+squares+method+work&oq=how+does+least+squares+method+work&gs_lcp=CgZwc3ktYWIQAzIECAAQHjoECAAQRzoGCAAQBxAeOgYIABAIEB5QjEJY61RgkVloAHACeACAAb8DiAG1EJIBBzItNC4xLjKYAQCgAQGqAQdnd3Mtd2l6yAEIwAEB&sclient=psy-ab&ved=0ahUKEwiHt_L2wvXrAhVBIqYKHUASDzQQ4dUDCA0&uact=5

+4

Yes, m is the slope of the line, but why is it equal to this formula? RKK Mikhail BroFar Python [ Send Py Storms ] A J #Level 20 Atlas Hamington 🐉SHAROF🐉🇺🇿 sir, can anyone please help?

+4

Coder Kitten how do I take partial derivatives?

+4

Coder Kitten message me: why is df/dy = 2xy?

+3

Rajababu Shah https://www.mathsisfun.com/data/least-squares-regression.html

+3

Why is the formula for m this one? BroFar I have learned m = (y₂ − y₁)/(x₂ − x₁), but here

m = (N Σ(xy) − Σx Σy) / (N Σ(x²) − (Σx)²)

(N is the number of points.) Intercept b:

b = (Σy − m Σx) / N
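Those two least-squares formulas agree: multiply the deviation form out and the means fold into the Σ-form. A quick numerical check in Python (data made up for illustration):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.9])
N = len(x)

# Deviation form: m = Σ(x - x̄)(y - ȳ) / Σ(x - x̄)²
m1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Expanded form: m = (N Σxy - Σx Σy) / (N Σx² - (Σx)²)
m2 = (N * np.sum(x * y) - np.sum(x) * np.sum(y)) / (N * np.sum(x**2) - np.sum(x)**2)

b = (np.sum(y) - m2 * np.sum(x)) / N  # intercept from the same formula

print(m1, m2)  # identical up to floating-point rounding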

+3

You have some data points and want to find a regression curve that describes this set of points "adequately". So you need a method to get the parameters which summarise the data. The least squares method is one way of getting those parameters. As the name says, it aims at minimising the squared deviations. However, there are multiple least squares methods, which is why you might encounter different formulas.
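For instance, an ordinary linear least squares fit is one line in NumPy, and the same idea extends to other regression curves; a minimal sketch with made-up data:

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 2.9, 5.1, 6.8, 9.2])

m, b = np.polyfit(x, y, 1)        # straight line y ≈ m*x + b
a2, a1, a0 = np.polyfit(x, y, 2)  # same least-squares idea, parabola
print(m, b)
print(a2, a1, a0)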

+3

Pretend everything else is a constant. For example:

f(x, y) = x·y² + x

Then, for the partial derivative with respect to x, y is constant:

df/dx = y² + 1

For the partial derivative with respect to y, x is constant:

df/dy = 2xy
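If you want to verify partial derivatives mechanically, SymPy can do it; a small sketch for the example above:

import sympy as sp

x, y = sp.symbols('x y')
f = x * y**2 + x

print(sp.diff(f, x))  # y**2 + 1 (y treated as a constant)
print(sp.diff(f, y))  # 2*x*y   (x treated as a constant)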

+3

What do you know about (multivariate) differential calculus? If you have not yet studied it, then solving this might be a bit ahead of you for the time being. The derivative of y² w.r.t. y is 2y, right? Now, x is just a constant factor, unharmed by the derivative operator:

d(xy²)/dy = x · d(y²)/dy = x · 2y = 2xy

+3

And that is actually the easy part. The difficulty lies more in juggling the summands.
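For completeness, the summand-juggling goes roughly like this (the standard derivation, written out in LaTeX):

\begin{align*}
\frac{\partial J}{\partial b} &= 2\sum_i (m x_i + b - y_i) = 0
  \quad\Rightarrow\quad b = \bar{y} - m\bar{x} \\
\frac{\partial J}{\partial m} &= 2\sum_i x_i\,(m x_i + b - y_i) = 0
  \quad\Rightarrow\quad m\sum_i x_i^2 + b\sum_i x_i = \sum_i x_i y_i
\end{align*}

Substituting $b = \bar{y} - m\bar{x}$ and using $\sum_i x_i = n\bar{x}$:

\begin{align*}
m\left(\sum_i x_i^2 - n\bar{x}^2\right) &= \sum_i x_i y_i - n\bar{x}\bar{y} \\
m &= \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}
\end{align*}

since $\sum_i (x_i - \bar{x})^2 = \sum_i x_i^2 - n\bar{x}^2$ and $\sum_i (x_i - \bar{x})(y_i - \bar{y}) = \sum_i x_i y_i - n\bar{x}\bar{y}$. That is exactly the formula from the question.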