I've got this nasty-looking integral equation involving two minimum terms:
$$a+c\min(b,x)=\int_{-\infty}^{\infty}f(x-t)\left(a+d\delta(t-(x-b))+c\min(b,t)\right)dt$$
where $\delta(\cdot)$ is the Dirac delta function and $a$, $b$, $c$, and $d$ are constants. I am trying to find $f(x)$.
I recognize that this is a convolution of sorts, and that the minimum on the LHS is easy to deal with (at least in theory) by breaking it into two cases. However, I am lost trying to figure out the RHS. Is there a way to solve this with a transform of some kind (Fourier, maybe, given the $f(x-t)$ convolution structure)? Any help would be appreciated!
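For concreteness, the LHS split I had in mind is just the two branches of the minimum:
$$a+c\min(b,x)=\begin{cases}a+cx, & x\le b,\\ a+cb, & x\ge b.\end{cases}$$
Applying the same split to $\min(b,t)$ inside the integral is where I start losing the thread.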