I am sure this is something completely dumb, but my mathematics is, well, awful, so be kind...
I know three coordinates (standard 2D screen coordinates, where the top left is (0, 0), x increases from left to right, and y increases from top to bottom). I can plot a line between two of them (A and B), and from the third point (C) I can draw a line that meets the first one at 90 degrees. What I want to calculate is the length of that line, i.e. the one that makes the right angle (I am sure that if I could explain myself properly I would have found an answer to this already).
Here is a picture of what I mean (with some example values for the three coordinates A, B and C that are known to me):
So, how do I calculate the length marked L in the above?
I thought, well, the line L is normal to the vector from A to B, so I could say...
The vector from A to B is (4, 7), and therefore the normal vectors would be (-7, 4) and (7, -4). But then I am stuck: where do I go from there? Am I even on the right track?
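To make it concrete, here is roughly what I have so far in Python. The coordinates for A, B and C are just placeholders standing in for the values in my picture (all I really know here is that A to B works out to (4, 7)):

```python
# Placeholder coordinates: A and B are chosen so that the vector A->B is (4, 7),
# matching my picture; C is just an example third point.
A = (1.0, 1.0)
B = (5.0, 8.0)
C = (6.0, 2.0)

# Vector from A to B
ab = (B[0] - A[0], B[1] - A[1])   # -> (4.0, 7.0)

# The two candidate normals (perpendiculars) to A->B
n1 = (-ab[1], ab[0])              # -> (-7.0, 4.0)
n2 = (ab[1], -ab[0])              # -> (7.0, -4.0)

print(ab, n1, n2)

# L should be the length of the perpendicular dropped from C onto the line
# through A and B - this is the bit I can't work out.
```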