Note that you can easily switch between max and min as $\min(x, y) = - \max(-x, -y)$.
In computer graphics and in optimization, various "smooth" or "soft" maximum and minimum functions appear. In graphics they are used purely for aesthetics, to eliminate ugly discontinuities. In optimization they replace the true objective with a better-behaved related function that optimizers handle more easily.
One such common soft maximum is $\log(\exp(x) + \exp(y))$. Rescaling by $k$ sets a scale for the fuzziness: $\log(\exp(kx) + \exp(ky))/k$. This has the nice property that it approaches the larger of $x$ and $y$ when they differ greatly, but the bad property that it is off by $(\log 2)/k$ when they are equal.
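A minimal sketch of this LogSumExp soft maximum, with the usual subtract-the-max trick for numerical stability (the stabilization is a standard implementation detail, not something from the text):

```python
import math

def smoothmax(x, y, k=1.0):
    """LogSumExp soft maximum: log(exp(k*x) + exp(k*y)) / k.

    Subtracting the larger scaled argument first avoids overflow
    without changing the result.
    """
    m = max(k * x, k * y)
    return (m + math.log(math.exp(k * x - m) + math.exp(k * y - m))) / k

def smoothmin(x, y, k=1.0):
    # Soft minimum via the identity min(x, y) = -max(-x, -y).
    return -smoothmax(-x, -y, k)
```

For well-separated inputs `smoothmax(0.0, 10.0)` is within about $e^{-10}$ of 10, while `smoothmax(5.0, 5.0)` is exactly $5 + \log 2$; increasing `k` shrinks that error to $(\log 2)/k$.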
The maximum is also the $p \to \infty$ limit of the generalized (power) mean; conversely, the $p \to -\infty$ limit is the minimum.
(There is also a "softmax" activation function, which turns a list of numbers into weights over the various choices. It is really a soft selection of the maximum, so it is perhaps misnamed. It is not what you want here, though it is related: using the weights in a weighted sum of the inputs gives something reasonable.)
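The related construction in that parenthetical can be sketched as follows (a hedged illustration; the function names are mine, and the stabilizing subtraction is a standard detail not mentioned in the text):

```python
import math

def softmax_weights(xs, k=1.0):
    """The softmax activation: exponentiate and normalize so the
    weights sum to 1.  Larger inputs receive most of the weight."""
    m = max(xs)  # subtract the max for numerical stability
    es = [math.exp(k * (x - m)) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def soft_selection_max(xs, k=1.0):
    # Weighted sum of the inputs using the softmax weights: a soft
    # *selection* of the maximum, as described in the text.
    ws = softmax_weights(xs, k)
    return sum(w * x for w, x in zip(ws, xs))
```

With well-separated inputs like `[0.0, 10.0]` nearly all the weight lands on the largest value, so the weighted sum is close to the true maximum; with equal inputs the weights are uniform and the result is exact.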