I am programming an application (with vectors) that requires the user to draw a line according to certain data. The user then clicks a check button, which grades the user's drawing against the actual data line. How would I go about grading the accuracy of the two lines?
So far, I have interpolated both the user's line and the actual line, so that the user's line data can be matched point for point with the actual line data.
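The question doesn't name a language, so here is a minimal sketch in Python/NumPy of the resampling step described above. The function name `resample` and the placeholder data are my own, and each line is assumed to be stored as a list of (x, y) vertices; the idea is to sample both lines at evenly spaced arc-length positions so that point i of one line corresponds to point i of the other.

```python
import numpy as np

def resample(points, n=100):
    """Resample a 2D polyline at n points spaced evenly along its
    arc length, using linear interpolation between vertices."""
    points = np.asarray(points, dtype=float)
    seg_lengths = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg_lengths)])  # cumulative arc length
    t = np.linspace(0.0, s[-1], n)                       # even arc-length steps
    x = np.interp(t, s, points[:, 0])
    y = np.interp(t, s, points[:, 1])
    return np.column_stack([x, y])

# Placeholder data: after resampling, point i of one line corresponds
# to point i of the other, regardless of how many raw vertices each had.
actual_line = [(0, 0), (5, 0), (10, 0)]
user_line = [(0, 0.2), (2, 0.5), (7, -0.3), (10, 0.1)]
actual_pts = resample(actual_line, 100)
user_pts = resample(user_line, 100)
```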
What is my next step in measuring the accuracy of the user's line against the actual line?
I can't use area, because the line the user draws is not linear; it's freeform.
Here's an image of what I mean: