Hey, so I'm writing a program that finds the angle of a line (between 0 and 180 degrees) from two points.
The equation to find the answer is Angle = asin(B / Hypotenuse),
where B is the vertical side of the right triangle the two points form, and Hypotenuse is the distance between the two points.
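For context, here's a minimal sketch of how I get B and the hypotenuse from the two points (Python just for illustration; line_angle is a made-up helper name, not my actual code):

```python
import math

def line_angle(p1, p2):
    # B = vertical side of the right triangle the two points form
    b = abs(p2[1] - p1[1])
    # Hypotenuse = straight-line distance between the points
    hyp = math.dist(p1, p2)
    # asin takes the ratio of the sides and only gives back radians
    return math.asin(b / hyp)
```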
However, the inverse sine function in my program only takes and outputs radians, so to get degrees I changed the equation to:

Angle = asin((B / Hypotenuse) * 3.14 / 180) * 180 / 3.14
This doesn't seem to be right, though: with Hypotenuse = 150 and B = 149.6, the original equation gives 85.8 degrees (correct), but the new equation gives 0.9973 degrees??
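Here's a minimal sketch (Python again, standard math module only) that reproduces both numbers:

```python
import math

hyp = 150.0
b = 149.6

# Original equation; asin returns radians here
original = math.asin(b / hyp)
print(original)  # ~1.4977 radians, i.e. the 85.8 degrees I expect

# My degree version: ratio converted before asin, result converted after
attempt = math.asin((b / hyp) * 3.14 / 180) * 180 / 3.14
print(attempt)   # ~0.9973 -- nowhere near 85.8
```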
Please help me fix this!