I am learning Bayesian statistics and am currently working on loss functions. Let $f(\theta | \mathbf{x})$ be a posterior pdf, and let $F(\theta | \mathbf{x})$ be the associated distribution function. I want to differentiate $F(a - D \mid \mathbf{x})$ with respect to $D$, where $D$ is the "decision" to be optimised and $a$ is a constant. Here is where I run into trouble: $\frac{d}{d D}F(a - D \mid \mathbf{x}) = \frac{d}{d D} \left( \int_{-\infty}^{a - D} f(\theta \mid \mathbf{x}) \, d \theta \right) = f(a - D \mid \mathbf{x})$
Is this correct? I think it might be wrong. Should it be $-f(a - D \mid \mathbf{x})$ because I have to apply the chain rule somewhere? Or is something else wrong? I know there are subtleties about differentiation under the integral sign, but my teacher said I don't need to worry about that for now and can just use the fundamental theorem of calculus in the form: $\frac{d}{dy} \int_{-\infty}^y f(t) \, dt = f(y)$
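One thing I tried as a sanity check (not a proof, just a numerical experiment) is to pick a toy posterior, say a standard normal, and compare a finite-difference approximation of $\frac{d}{dD}F(a-D)$ against the candidate answers. The choices of $a$, $D$, and the step size $h$ below are arbitrary:

```python
import math

def F(x):
    # Standard normal CDF, standing in for the posterior distribution function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def f(x):
    # Standard normal pdf, standing in for the posterior pdf
    return math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)

a, D, h = 1.0, 0.3, 1e-6

# Central finite-difference approximation of d/dD F(a - D)
numeric = (F(a - (D + h)) - F(a - (D - h))) / (2 * h)

print(numeric)      # finite-difference derivative
print(f(a - D))     # candidate without the chain-rule sign
print(-f(a - D))    # candidate with the chain-rule sign
```

Comparing the printed values should show which sign the derivative actually has.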
I'm doing self-study (with a bit of teacher guidance in his own time, so I don't like to ask him too much), but I feel a bit out of my depth now, and school breaks up for the holidays next week!