6

Many years ago, while working as a computer programmer, I tracked down a subtle bug in the software that we were using. Management had despaired of finding the bug, but I pursued it in odd moments over a period of a few days, and finally found that, in computing the geometric mean, the program was taking the log of the sum instead of the sum of the logs. When thinking back on that, I always wonder whether there is any situation in which taking the log of a sum would be of interest. The only case I can think of is that it is often convenient to shift the logarithm to the left by one unit, which is done by adding $1$ to the argument; that is, one often wishes to deal with $\log(1 + x)$ instead of $\log(x)$, so that one has the convenient situation $f(0) = 0$. Let's call this the trivial scenario.

So, can anyone think of any non-trivial scenario in which taking the log of a sum is the thing to do?

7 Answers

1

The logarithm is the variance-stabilizing transformation of the chi-squared distribution, and a chi-squared-distributed random variable arises as the sum of squared standard-normally-distributed random variables.

One example of a (scaled) chi-squared-distributed random variable is the power spectral density estimate, obtained as a mean of squared frequency-domain coefficients; the logarithm is routinely applied to such estimates, particularly to express them in dB units.
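A minimal NumPy sketch of this (the signal, segment count, and lack of windowing are illustrative simplifications, not a production PSD estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                                   # assumed sampling rate, Hz
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 50 * t) + rng.standard_normal(t.size)

# Mean of squared DFT magnitudes over segments:
# each frequency bin is a (scaled) chi-squared variable
segments = x.reshape(8, -1)
psd = np.mean(np.abs(np.fft.rfft(segments, axis=1)) ** 2, axis=0)

# Taking the log (here in dB) stabilizes the variance across bins
psd_db = 10 * np.log10(psd)
print(psd_db[:5])
```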

  • 0
    I have up-voted your answer. (I’m not claiming to understand it - I’m just giving you the benefit of the doubt, namely, that your answer is indeed as astute as it seems.) Since you say in your profile that you are a non-native speaker interested in writing and speaking decent English, you might be interested in the learner-friendly front-end to the (free, online) Merriam-Webster dictionary that I am creating, at http://enciklopedia-vortaro-de-la-merk-angla.weebly.com/2016-09-10
9

The logarithm is a concave function. This means that Jensen's inequality can be applied to it, yielding the log sum inequality, a useful lemma in information theory.

Lemma (Log sum inequality) Let $a_1,\dots,a_n$ and $b_1,\dots,b_n$ be nonnegative reals. Then we have $$\sum_{i=1}^n a_i\log{\frac{a_i}{b_i}} \ge \left(\sum_{i=1}^n a_i\right)\log{\frac{\sum_{i=1}^n a_i}{\sum_{i=1}^n b_i}}.$$

On the right-hand side, the logarithm is applied to a ratio of two sums.

Remark By convention, $0\log{0} = 0\log{\frac{0}{0}}=0$ and $a\log{\frac{a}{0}}=\infty$ for $a>0$. All these are justified by continuity.
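A quick numerical check of the lemma (the vectors here are arbitrary illustrative values):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 1.0, 4.0])

lhs = np.sum(a * np.log(a / b))
rhs = np.sum(a) * np.log(np.sum(a) / np.sum(b))
assert lhs >= rhs                 # the log sum inequality
print(lhs, rhs)                   # ~ -0.170 >= ~ -0.925
```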

  • 0
@Jacopo Notarstefano: Thanks for the clarification about the behavior at zero. I'm up-voting your answer now. (2011-09-03)
6

The log of a sum was already a topic in the time of Gauss; I'll see whether I can find an article that I recall from the German math newsgroup, from around 2002. For now, here is a link to tables of such logarithms: Tafeln der Additions- und Subtractions-Logarithmen für sieben Stellen (tables of addition and subtraction logarithms to seven places) by Christoph Zech.
As far as I remember, it was an approach to simplifying computations in astronomy (in which Gauss was involved).

A longer article explaining the rationale and use of the "Gauss'sche Additionslogarithmen" (Gaussian addition logarithms), unfortunately only in German, is at this online archive. The article contains some references which might be helpful even if German is a foreign language to you. Sorry I can't be of more help here...
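For readers unfamiliar with such tables, the underlying identity (my reconstruction for base-10 logarithms, not taken from the linked article) is: for $a \ge b$,
$$\log(a+b) = \log a + A(\log a - \log b), \qquad A(x) = \log\left(1 + 10^{-x}\right),$$
so the tables tabulate the correction $A(x)$, and $\log(a+b)$ is obtained from $\log a$ and $\log b$ by one subtraction, one table lookup, and one addition.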

  • 0
This answer seems closest to what I was looking for, and so I have up-voted it and accepted it. (2011-09-03)
6

One plausible scenario is this:

Say you want a soft maximum function. The usual function $\max(x,y)$ is not "soft" (try visualizing its graph: it has a crease along the line $x=y$, where it fails to be differentiable).

So we define the "soft maximum function" $\mathrm{smax}(x,y)=\log(e^x+e^y)$. The idea is that if $x \gg y$, then $\log(e^x+e^y) \approx x$.
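Here is a small Python sketch of this function (the name smax and the numerically stabilized form, which factors out the larger argument, are my additions, not from the post):

```python
import math

def smax(x: float, y: float) -> float:
    """Soft maximum: log(e^x + e^y), factored so the exponent is never positive."""
    m, n = max(x, y), min(x, y)
    return m + math.log1p(math.exp(n - m))

for x, y in [(10.0, 0.0), (1.0, 1.0), (0.0, 10.0)]:
    print(x, y, smax(x, y))   # ~10.00005, ~1.693, ~10.00005: close to max, but smooth
```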

You can read more about this function in this blog post.

  • 0
(The link in my previous comment doesn't work anymore; now it's http://users.mai.liu.se/hanlu09/research/dp_peakon_formula/.) (2017-08-06)
3

I don't know if this is anything like what you had in mind, but when you integrate rational functions (by partial fractions) you may get terms like $\ln|x+a|$ and $\ln((x+a)^2+b^2)$ in the result.
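For instance, integrating a rational function with an irreducible quadratic factor in the denominator (a standard worked example): $$\int \frac{2x+2}{x^2+2x+5}\,\mathrm dx = \ln\bigl((x+1)^2+4\bigr) + C,$$ a logarithm of a sum that cannot be split into simpler log terms over the reals.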

  • 0
Thanks. I remember now seeing that, but I had forgotten it. (2011-09-03)
3

The logarithm of a sum also occurs in information theory, when one wants to compute the differential entropy $ H = - \int p(x) \log p(x) ~ \mathrm d x $ or a related measure for a distribution whose density is defined by a sum, e.g. a Gaussian mixture, $ p(x) = \sum_i \frac{w_i}{\sigma_i}\phi \left ( \frac{x - \mu_i}{\sigma_i} \right), $ with weights $w_i \ge 0$, $\sum_i w_i = 1$, where $\phi$ is the density of the standard normal distribution.
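A Monte Carlo sketch of this computation (the two-component, equal-weight mixture and all parameter values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative equal-weight two-component Gaussian mixture
mus = np.array([-2.0, 3.0])
sigmas = np.array([1.0, 0.5])

def log_density(x):
    # log p(x): the log of a sum of scaled normal densities
    z = (x[:, None] - mus) / sigmas
    comps = np.exp(-0.5 * z**2) / (sigmas * np.sqrt(2 * np.pi))
    return np.log(comps.mean(axis=1))   # equal weights w_i = 1/2

# Monte Carlo estimate of H = -E[log p(X)], sampling X from the mixture
ks = rng.integers(0, 2, size=100_000)
samples = rng.normal(mus[ks], sigmas[ks])
print(-log_density(samples).mean())
```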

2

For $a\ge b$, $\log_2(2^a + 2^b) = a+\log_2(1+2^{b-a})$, where $\log_2$ is the base-$2$ logarithm. Factoring out the larger term keeps the exponent $b-a$ nonpositive, so $2^{b-a}$ lies in $(0,1]$ and nothing overflows.

Likewise, $\log(e^a + e^b) = a+\log(1+e^{b-a})$, where $\log$ is the natural logarithm.

Express the terms in the sum as powers of the base of the logarithm to get the $a,b$ values.

This was developed to perform calculations with very large numbers that exceeded the MaxNumber limit on my Mathematica system, so I had to work with logs.
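A minimal Python sketch of the same identity (the function name and example values are mine, not from the answer):

```python
import math

def log_add(log_x: float, log_y: float) -> float:
    """Return log(x + y) given log(x) and log(y), without ever forming x or y."""
    a, b = max(log_x, log_y), min(log_x, log_y)
    return a + math.log1p(math.exp(b - a))   # exp argument is <= 0, so no overflow

# Add two numbers far beyond double-precision overflow: e^10000 + e^9998
print(log_add(10000.0, 9998.0))   # ~10000.127
```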

  • 0
Nice. (The editor won't let me just say 'Nice.', so, hmmm, let me also say: three cheers for Esperanto!) (2014-08-24)