Is there a name for the class of functions whose derivative is expressed only in terms of the function value? One example is the exponential, another example is
\begin{align} s_1(t) = \frac{1}{1 + e^{-t}} \end{align}
with derivative
\begin{align} s'_1(t) = s_1(t)[1-s_1(t)]. \end{align}
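(A further illustrative example, not from the original question: the hyperbolic tangent belongs to the same class, since its derivative is also a function of the value alone:)
\begin{align} \frac{d}{dt}\tanh(t) = 1 - \tanh^2(t). \end{align}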
Clarification
My question is related to writing about neural networks (NNs). In a neural network you compute the derivative of the output with respect to the input by means of an algorithm called backpropagation, or backprop (which is really nothing but the chain rule applied in a computationally efficient manner).
An important computational optimization in backprop is to store the function value during the forward pass and reuse it to compute the derivative during the backward pass. This is only possible if the derivative depends on the function value alone (and not, say, on the value of the independent variable).
For example, suppose you have a working vector w:
# w is currently storing the value of t, the independent variable
w = [1, 2, 3]
in the first step you calculate the function value (you won't need the value of the independent variable $t$ anymore, so you overwrite the contents in memory)
# w is currently storing the value of s(t) = 1 / [1 + exp(-t)]
w = [0.7310585786300049, 0.8807970779778823, 0.9525741268224334]
in the next step you calculate the derivative value (you won't need the value of $s(t)$ anymore, so you overwrite the contents in memory)
# w is currently storing the value of s'(t) = s(t)[1-s(t)]
w = [0.19661193324148185, 0.10499358540350662, 0.045176659730912]
Notice that if $s'$ depended on the value of $t$ (rather than only on the value of $s(t)$), I would not be able to reuse the memory in w.
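The three steps above can be sketched end-to-end in plain Python (a minimal illustration; the function name `sigmoid` and the explicit loops are my own choices, not part of the question):

```python
import math

def sigmoid(t):
    # Logistic function s(t) = 1 / (1 + exp(-t))
    return 1.0 / (1.0 + math.exp(-t))

# Forward pass: w holds the independent variable t.
w = [1.0, 2.0, 3.0]

# Overwrite each entry with the function value s(t);
# the value of t is no longer needed afterwards.
for i in range(len(w)):
    w[i] = sigmoid(w[i])

# Backward pass: overwrite each entry with
# s'(t) = s(t) * (1 - s(t)), computed from the stored
# function value alone -- no reference to t anywhere.
for i in range(len(w)):
    w[i] = w[i] * (1.0 - w[i])

print(w)  # matches the derivative values shown above
```

The key point is that the backward loop reads only `w[i]` (the stored function value); if the derivative formula also required `t`, the forward pass could not have overwritten it.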
The specific paragraph I'm trying to improve reads as follows:
Pick an activation function whose derivative depends only on the function value, and not on the value of the independent variable. Such activation functions enable reusing the working memory during the backprop stage.
And I'd like to know whether this could be expressed more precisely: pick a ??? activation function.