This may be braindead, but I'm trying!
If I have a function $f$ that is not defined at some $x$, then asking for the derivative of the function at $x$ makes no sense, since $f(x)$ doesn't exist there.
But if I want to find a gradient for that function as close as possible to $x$, then how does that work? Isn't that the same as the derivative at $x$? It's like I can do the same calculation, but I have to disregard the result because I'm asking for something that doesn't exist.
For example, if $f(x)=\frac{1}{x-2}$, then $f(x)$ is not defined at $x=2$. So I can't find the derivative at that point, since it doesn't exist. But the limit is $2$. But the limit is the derivative, and the derivative doesn't exist! I'm confused.
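Just to see it concretely, here's a throwaway Python check I put together (the function name is my own): away from $x=2$, the quotient rule gives $f'(x)=-\frac{1}{(x-2)^2}$, and plugging in points closer and closer to $2$, the values blow up instead of settling on anything:

```python
# Throwaway check: f(x) = 1/(x - 2) has derivative f'(x) = -1/(x - 2)^2
# everywhere except x = 2. Evaluate it at points approaching 2.
def f_prime(x):
    return -1.0 / (x - 2) ** 2

for x in [2.1, 2.01, 2.001, 2.0001]:
    print(x, f_prime(x))  # the values run off toward -infinity
```

So near $x=2$ the slopes don't approach any number at all, which seems related to my confusion.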
I felt like I understood this but I woke up this morning with no idea. Last week I was happily finding the volume of cylindrical wedges, now I can't understand limits O_O
Please set me straight.
EDIT: I think my problem is the way I'm thinking about limits. It seems that there are two limits and I'm confusing them: the limit that $f(x)$ approaches and the limit that $x+h$ approaches. In the above example where $f(x)=\frac{1}{x-2}$, $2+h$ approaches $2$, and $f'(2)$ is undefined since the numerator of the difference quotient contains $f(2)$, which is a division by zero.
Is that my answer?
2nd EDIT: This is what I'm really asking: how do I find $\lim_{x\to a}f'(x)$?
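For the concrete example above, writing out what that limit would be (assuming I've applied the quotient rule correctly away from $x=2$):

$$f'(x) = -\frac{1}{(x-2)^2} \quad (x \neq 2), \qquad \lim_{x\to 2} f'(x) = \lim_{x\to 2}\left(-\frac{1}{(x-2)^2}\right) = -\infty.$$

Is that the right way to set it up?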