
I need to find the line with minimal distance to a set of points. I found linear regression and linear interpolation algorithms, but they measure distance only along the y-axis: $D = y - f(x)$.

Instead, I need to find $a,b,c$ for the line $ax + by + c = 0$, where the distance is the perpendicular distance: $D_i = \dfrac{|ax_i + by_i + c|}{\sqrt{a^2+b^2}}$

Is there any way or algorithm to solve this problem?

  • So all your points are in 2D? (2011-09-23)
  • Yes. They are in 2D. (2011-09-23)

1 Answer


What you want to do is called total least squares or orthogonal regression. Netlib has a bunch of routines for doing this, and a bit of searching turns up routines for other systems, e.g. MATLAB.
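For a concrete illustration, here is a minimal sketch of orthogonal regression in 2D using NumPy (the function name `fit_line_tls` is my own choice, not from any of the routines mentioned above). The idea: center the points at their centroid, then take the right singular vector with the smallest singular value, which is the unit normal $(a, b)$ of the best-fit line; $c$ follows from requiring the line to pass through the centroid.

```python
import numpy as np

def fit_line_tls(points):
    """Fit a line ax + by + c = 0 minimizing the sum of squared
    perpendicular distances (total least squares)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    # The right singular vector for the smallest singular value is the
    # direction of least variance, i.e. the line's unit normal (a, b).
    _, _, vt = np.linalg.svd(centered)
    a, b = vt[-1]
    # The TLS line passes through the centroid, which fixes c.
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

# Points lying exactly on y = 2x + 1; the recovered line should
# satisfy a*x + b*y + c = 0 for each of them.
pts = [(0, 1), (1, 3), (2, 5), (3, 7)]
a, b, c = fit_line_tls(pts)
```

Because $(a, b)$ comes out as a unit vector, $|ax_i + by_i + c|$ is directly the perpendicular distance $D_i$ from the question, with no division by $\sqrt{a^2+b^2}$ needed.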

  • Thanks :) This seems to be what I want to do. (2011-09-23)
  • Someone posted a comment about http://en.wikipedia.org/wiki/Principal_component_analysis. Is there any difference from your post? (2011-09-23)
  • Certainly related; in fact, one uses SVD (the machinery behind PCA) for total least squares. See [this](http://books.google.com/books?id=fhLmqG9BQOsC&pg=PA35) for instance. (2011-09-23)
  • @Miro: I posted that comment, but deleted it after I saw J.M.'s answer. J.M.: Boy, doesn't SVD on covariance-like matrices turn up everywhere? From PCA to principal axes of inertia to ellipsoids to this question. I posted the first related thing that came to my head, but it's nice to know it has a specific name in this particular application. (2011-09-23)
  • @Rahul: SVD is just too useful for data analysis, methinks. ;) (2011-09-23)