I'm trying to determine the gradient of a tangent to a curve defined by a radial function $r = f(\theta)$. It's for a programming application and the actual function is gigantic, but let's say that $r = \theta^2$.
My first attempt was to get the gradient of the tangent to the curve $y = x^2$ at $x = \theta$, calculate the angle of that line, then add it to the original angle plus $90^{\circ}$ — i.e. sum the angle of the tangent to a circle and the angle of the tangent to the curve.
But I think this is incorrect: the gradient of the curve in a polar coordinate system is not the same as the gradient of the curve in a Cartesian system, and summing it with the angle of the circle's tangent does not compensate for that difference. It certainly isn't working in practice.
So I'm stumped now; I don't know how to compensate the gradient for the turn of the circle. Can anyone help, or provide some hints on how to solve this problem? I don't know how clearly I'm articulating my problem or my attempted solution, so I'll happily clarify or add diagrams if necessary.
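For context, here is the approach I've been reading about but haven't verified: converting to Cartesian via $x = r\cos\theta$, $y = r\sin\theta$ and applying the chain rule, which gives $\frac{dy}{dx} = \frac{r'\sin\theta + r\cos\theta}{r'\cos\theta - r\sin\theta}$. A minimal sketch of that (function and variable names are my own, using the example $r = \theta^2$):

```python
import math

def polar_tangent_slope(f, df, theta):
    """Slope dy/dx of the tangent to r = f(theta) at a given angle,
    via x = r*cos(theta), y = r*sin(theta) and the chain rule:
    dy/dx = (dr/dtheta * sin(theta) + r * cos(theta))
          / (dr/dtheta * cos(theta) - r * sin(theta))."""
    r = f(theta)
    dr = df(theta)
    return (dr * math.sin(theta) + r * math.cos(theta)) / \
           (dr * math.cos(theta) - r * math.sin(theta))

# Example: r = theta^2, so dr/dtheta = 2*theta
slope = polar_tangent_slope(lambda t: t**2, lambda t: 2*t, math.pi / 4)
```

(Note the denominator can be zero where the tangent is vertical, so a real implementation would need to guard that case.) Is this the right way to "undo" the turn of the circle, or am I still missing something?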