Let $f$ denote the function defined by
$f(x) = w_{pos} \sum_v \left[ \left(\sum_b d_{v,b} x_b - \theta_v \right)_+\right]^2 + w_{neg} \sum_v \left[ \left(\sum_b d_{v,b} x_b - \theta_v \right)_- \right]^2$
I would like to find the gradient of $f$.
Here, $d_{v,b}$ is a large matrix of dimension $v \times b$, $x_b$ is a vector of dimension $b \times 1$, and $\theta$ is a vector of dimension $v \times 1$. The first sum penalizes overshooting the goal $\theta$ and the second penalizes undershooting it: $(\cdot)_+$ keeps only the positive part of the residual (so only positive results are penalized), and $(\cdot)_-$ keeps only the negative part (so only negative results are penalized).
Could someone differentiate this? I believe it has to be done piecewise, since each residual contributes through either the positive-part term or the negative-part term depending on its sign. Also, what would the code look like?
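For what it's worth, here is a sketch of what I have in mind. Writing $r = Dx - \theta$, the function is $f = w_{pos}\lVert r_+\rVert^2 + w_{neg}\lVert r_-\rVert^2$, and if the chain rule applies piecewise then the gradient should be $\nabla f = 2 D^\top (w_{pos}\, r_+ + w_{neg}\, r_-)$, where $r_+ = \max(r, 0)$ and $r_- = \min(r, 0)$ elementwise. A NumPy version (function name and argument order are my own choice):

```python
import numpy as np

def f_and_grad(x, D, theta, w_pos, w_neg):
    """Value and (candidate) gradient of the two-sided penalty.

    r     = D @ x - theta                       (residual, length v)
    f     = w_pos * sum(r_+^2) + w_neg * sum(r_-^2)
    grad  = 2 * D.T @ (w_pos * r_+ + w_neg * r_-)
    """
    r = D @ x - theta
    r_pos = np.maximum(r, 0.0)   # overshoot of the goal theta
    r_neg = np.minimum(r, 0.0)   # undershoot of the goal theta
    f = w_pos * np.sum(r_pos**2) + w_neg * np.sum(r_neg**2)
    grad = 2.0 * D.T @ (w_pos * r_pos + w_neg * r_neg)
    return f, grad
```

A central-difference check on random data agrees with this gradient, which makes me fairly confident in the piecewise formula, but I would appreciate a proper derivation.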