What method should I use to calculate the error between a given perfect shape (e.g., a circle, triangle, or rectangle) and a freeform shape drawn by the user that more or less closely matches it?
The application context is a program that measures how precisely users can redraw shapes displayed on a touch screen. The user traces the shape shown on the screen with a finger or a stylus, and because users are not perfect, the drawn shape does not completely overlap the given one. I would like to quantify the difference, or error, between the perfect shape provided by the application and the imperfect shape drawn by the user.
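To illustrate the kind of measure I have in mind, here is a minimal sketch for the special case where the target is a circle (the function name, point format, and the RMS-distance choice are just placeholders, not a method I am committed to). I am looking for an approach that generalizes to arbitrary target shapes:

```python
import math

def circle_error(points, cx, cy, r):
    """RMS deviation of drawn points from a perfect circle.

    points   : list of (x, y) samples taken from the user's stroke
    (cx, cy) : center of the reference circle
    r        : radius of the reference circle
    """
    if not points:
        raise ValueError("need at least one sample point")
    sq_sum = 0.0
    for x, y in points:
        # distance from the sample to the nearest point on the circle
        dist_to_circle = abs(math.hypot(x - cx, y - cy) - r)
        sq_sum += dist_to_circle ** 2
    return math.sqrt(sq_sum / len(points))

# Example: a slightly wobbly trace around a circle of radius 100
drawn = [(100 * math.cos(t) + (3 if i % 2 else -3), 100 * math.sin(t))
         for i, t in enumerate(2 * math.pi * k / 64 for k in range(64))]
print(circle_error(drawn, 0.0, 0.0, 100.0))
```

This only works because the distance from a point to a circle has a closed form; for triangles, rectangles, or arbitrary outlines I would need something more general, which is what my question is about.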
Thanks for your help.