18

The average value of a function $y=f(x)$, on an interval $[a,b]$, is ${1\over {b-a}}\int_a^b f(t)dt$. This of course relates to the arithmetic average. It is easy to see that a corresponding formula for the geometric average is $\exp\left({1\over {b-a}}\int_a^b \ln(f(t))dt\right)$.
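For concreteness, both formulas are easy to check numerically; a minimal sketch (Python, midpoint Riemann sums; the function names are mine):

```python
from math import exp, log

def arithmetic_average(f, a, b, n=100_000):
    """(1/(b-a)) * integral of f over [a,b], by a midpoint Riemann sum."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx / (b - a)

def geometric_average(f, a, b, n=100_000):
    """exp of the average of log f, per the second formula."""
    dx = (b - a) / n
    return exp(sum(log(f(a + (i + 0.5) * dx)) for i in range(n)) * dx / (b - a))

# f(x) = x on [1,2]: arithmetic average is 3/2, geometric average is 4/e
print(arithmetic_average(lambda x: x, 1, 2))  # ~1.5
print(geometric_average(lambda x: x, 1, 2))   # ~1.4715
```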

There are many other types of averages. In particular the ones motivated by the elementary symmetric polynomials are interesting as they "mix" the function values. My question is: How can we evaluate those averages?

To be specific, consider a real positive continuous function $y=f(x)$ on $[a,b]$. Create a partition of $n$ sub-intervals of width $\Delta x$. Let $Y=(y_1,y_2,\cdots, y_n)$ be the values of the function $f$ at some point in those intervals. Define the elementary symmetric polynomials $e_k=e_k(Y)$, for $1\le k \le n$, through
$$ \prod_{i=1}^n (t+y_i)= t^n+e_1t^{n-1}+\cdots+e_n. $$ Define the average $$ a_k(Y)=\sqrt[k]{\frac{e_k}{\binom{n}{k}}}. $$ Define $A_\alpha(f)$, the $\alpha$-average of $f$ over $[a,b]$, as the limit of $a_k(Y)$ as $n \to \infty$, $\Delta x \to 0$, and $k/n \to \alpha$. Note that $\alpha=0$ corresponds to the arithmetic average and $\alpha=1$ to the geometric average.

What do we know about $A_\alpha$ for $0<\alpha <1$? How can we compute it? For example if $f(x)=x$, $[a,b]=[1,2]$, and $\alpha=1/2$ what is $A_\alpha$?
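For what it is worth, $a_k(Y)$ is easy to evaluate numerically: expand $\prod_i (t+y_i)$ coefficient by coefficient to obtain $e_k$. A small sketch (my own; midpoint samples, double precision):

```python
from math import comb

def symmetric_average(f, a, b, n, k):
    """a_k(Y) for samples of f at the n midpoints of a uniform partition:
    build the coefficients of prod(t + y_i) iteratively, read off e_k,
    then normalize by binom(n, k) and take the k-th root."""
    dx = (b - a) / n
    ys = [f(a + (i + 0.5) * dx) for i in range(n)]
    e = [1.0] + [0.0] * n  # e[j] holds e_j of the samples processed so far
    for y in ys:
        for j in range(n, 0, -1):
            e[j] += y * e[j - 1]
    return (e[k] / comb(n, k)) ** (1.0 / k)

# f(x) = x on [1,2], alpha = 1/2: take k = n/2 and let n grow
for n in (50, 100, 200):
    print(n, symmetric_average(lambda x: x, 1, 2, n, n // 2))
```

As sanity checks: for a constant function the output is that constant for every $k$, and by Maclaurin's inequality the values decrease in $k$.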

Edit 1:

Some related inequalities are Maclaurin's and Newton's.

Edit 2:

I expect the requirement of continuity can be relaxed to piecewise continuity while still giving a unique limit. Finding $A_\alpha$ for the following function, for a given $m>0$, will also be of interest: $$f(x)= \begin{cases} 1 & \text{if } 0 \le x \le 1/2 \\ m & \text{if } 1/2 < x \le 1. \end{cases}$$

  • 0
    Is it obvious that $A_\alpha$ depends only on $\alpha$? Choosing $k=0$ for every $n$, or $k=1$ for every $n$, seems already to yield two different limits.2012-10-08
  • 0
    $k=0$ is there only for uniform definition of $e_k$. I will make the necessary correction. $k=1$ or $k=2$ or any finite number lead to the same $A_0$, i.e. the arithmetic average.2012-10-08
  • 0
    @Maesumi : I wish I could +1 this a thousand times, this is a reaaaaaaaaaaaally cool question.2012-10-10
  • 0
    @Maesumi : It is not clear at all to me how you work with $k=0$, I don't see any natural definition that is supposed to be implicit.2012-10-10
  • 0
    Do you assume you're working with positive functions? It would simplify stuff like taking logs...2012-10-10
  • 1
    @PatrickDaSilva There is no case of $k=0$ in the final edited formulation. It won't make sense in the definition of $a_k(Y)$. As it is $1 \le k \le n$. The limit of $k/n$ can be zero in which case we have the arithmetic average.2012-10-10
  • 0
    @PatrickDaSilva Yes, $f$ is positive. It can be relaxed to non-negative provided we deal with the occasional improper integral, as is evident for the geometric average given the presence of the logarithm.2012-10-10
  • 0
    @Maesumi : I asked about $k=0$ because there could be some kind of $L^p$ norm analog where you could've taken the $\max$ for $k=0$ in some weird way but I couldn't imagine a right way to state it.2012-10-10
  • 1
    Just a suggestion ; I believe a first step to understanding this problem would be to be able to compute $$ \underset{k/n \to \alpha}{\lim_{n \to \infty}} \left( \binom nk \right)^{\frac 1k} $$ which will most probably involve the $\Gamma$ function to a certain extent... this coefficient will appear in your formulas independently of the function $f$ you work with so I think it will be important to understand.2012-10-10
  • 0
    Also an interesting question: is your observation related to the [harmonic average](http://en.wikipedia.org/wiki/Harmonic_mean)?2012-10-10
  • 0
    @KvanTTT For harmonic average you get the reciprocal of the average of the reciprocal of the original function. It does not "mix" the function values the way symmetric average does.2012-10-10
  • 0
    @Maesumi : Using Stirling's formula, one can show that the denominator of $a_K(Y)$ goes to $(1-\alpha)^{(1-\alpha)}/\alpha$. Would you like me to detail the computations in an answer? It's not that long but too long for a comment.2012-10-10
  • 0
    @PatrickDaSilva Yes, every step helps.2012-10-10
  • 0
    Does the limit of the $a_k(Y)$ exist at all? Why do you take the $k$-th root?2012-10-16
  • 1
    @vesszabo I can prove the limit exists only in a few special cases: when $k$ is finite, or when $k=n-s$ with $s$ finite. Numerically the limit seems to exist. We have to take the $k$-th root to produce a finite number.2012-10-16
  • 0
    @Maesumi : I like the Edit 1. This seems of interest. I'm actually amazed by those results and I would love to see more. It's sad that I don't have time to invest on those questions.2012-11-23
  • 0
    @Maesumi : Thinking of $\log$ as $x^0$ (in the sense of the integral of $x^{-1}$), I think the right conjecture would be that the $\alpha$-average of $f$ is really just $(\int f^{\alpha})^{1/\alpha}$, because the limit when $\alpha \searrow 0$ of this is the geometric average. I have no idea yet how to show this though.2012-12-01
  • 0
    @PatrickDaSilva A numerical experiment will help to decide this. But I doubt it.2012-12-02
  • 0
    @Maesumi : I don't. I'm actually thinking about doing that numerical experiment myself, I got really interested in this problem (and in the $f$-averages as well).2012-12-03
  • 0
    @PatrickDaSilva In my definition $\alpha=0$ corresponds to the arithmetic average.2012-12-03
  • 0
    @Maesumi : Yes... I'll have to give it some more thought, but I still think there's something there.2012-12-03

3 Answers

4

As requested in the comments, here is the proof : \begin{gather*} \begin{aligned} \binom nk^{\frac 1k} & = \left( \frac{n!}{k!(n-k)!} \right)^{\frac 1k} \sim \left( \frac{ \sqrt{2 \pi n} (n/e)^n }{ \left( \sqrt{ 2 \pi k } (k/e)^k \right) \left( \sqrt{2 \pi(n-k)} ((n-k)/e)^{n-k} \right)}\right)^{\frac 1k} \\ & = \left( \frac 1{\sqrt{2\pi}} \right)^{\frac 1k} \left( \frac{n}{k(n-k)} \right)^{\frac 1{2k}} \left( \frac{n^n}{k^k(n-k)^{n-k}}\right)^{\frac 1k} \\ & = \left( \frac 1{\sqrt{2 \pi}} \right)^{\frac 1k} \left( \frac{n}{k(n-k)} \right)^{\frac 1{2k}} \left( \frac{ n^{\frac nk}}{k \, n^{((n/k) -1)} (1 - k/n)^{(n/k) - 1} } \right) \\ & = \left( \frac 1{\sqrt{2 \pi}} \right)^{\frac 1k} \left( \frac{n}{k(n-k)} \right)^{\frac 1{2k}} \left( \frac{ 1 }{ \frac kn (1 - k/n)^{(n/k) - 1} } \right) \\ & \longrightarrow \frac 1{\alpha (1-\alpha)^{1/\alpha -1}} = \frac{(1-\alpha)^{1-1/\alpha}}{\alpha} = \left( \frac 1{\alpha^{\alpha} (1-\alpha)^{1-\alpha} } \right)^{\frac 1{\alpha}} . \end{aligned} \end{gather*}

Note that although Stirling's asymptotic is stated without the $\frac 1k$ power, its error bounds can be chosen independently of the power that is taken, so the equivalence survives raising everything to the $\frac 1k$ power.
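The limit is easy to corroborate numerically; a small sketch (my own; the root is taken in log space so the binomial coefficient never has to fit in a float):

```python
from math import comb, exp, log

def binom_root(n, k):
    """binom(n, k)^(1/k), computed as exp(log(binom)/k).
    math.log accepts arbitrarily large integers, so no overflow."""
    return exp(log(comb(n, k)) / k)

def limit_formula(alpha):
    """The claimed limit (1 / (alpha^alpha (1-alpha)^(1-alpha)))^(1/alpha)."""
    return (1.0 / (alpha ** alpha * (1 - alpha) ** (1 - alpha))) ** (1.0 / alpha)

alpha = 0.3
for n in (100, 1000, 10000):
    k = round(alpha * n)
    print(n, binom_root(n, k), limit_formula(alpha))
```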

Hope that helps,

  • 0
    As you can see in the details I was wrong in my previous comment, but I simply forgot that it was $k/n \to \alpha$ and not $n/k \to \alpha$, I confused things while computing. The proof still remains valid obviously.2012-10-11
  • 0
    On the third line there seems to be some algebra issues. It has resulted in $1-1/\alpha$ as a base. But this expression is negative since $0<\alpha<1$.2012-10-12
  • 0
    @Maesumi : Right, I factored $n$ wrong. Let me correct that. It should make more sense now.2012-10-12
  • 0
    Hm. I was trying some things and I just noticed that it's important to see that we can only *reasonably* compute the $\alpha$-average for $\alpha \in \{0,1\}$ and that's precisely where $\binom nk$ blows up, if you look at my asymptotic expression. I believe it's because of that that "some terms get killed" and we can compute more easily.2012-11-25
2

Wow, so much inspiration.

Here's a generalization of your question: let us begin with a monotone function $f : ]0,\infty[ \to ]\inf_{x > 0} f(x),\sup_{x > 0} f(x)[$. Let $g : [0,1] \to ]0,\infty[$. Define the $f$-average of $g$ by $$ A_f(g) = f^{-1} \left( \int_0^1 f(g(t)) dt \right). $$ Note that when $f(x) = x$ you get the arithmetic average and when $f(x) = \log(x)$ you get the geometric average. These functions are increasing. However, you can also take $f(x) = 1/x$ and get the harmonic mean. This function is decreasing; perhaps that is why it behaves differently from the other averages.

This $f$-average is not completely randomly defined ; it has some properties. In a similar way that arithmetic and geometric averages are defined for finite sets instead of functions, we can define the $f$-mean of $n$ numbers by $$ M_f(x_1, \dots, x_n) = f^{-1} \left( \frac 1n \sum_{i=1}^n f(x_i) \right). $$ Again, taking $f$ as stated before gives us arithmetic, geometric and harmonic means. Now we can see that the $f$-average of the $f$-mean of $g_1(t), \dots, g_n(t)$ is the $f$-mean of the $f$-averages : $$ A_f( M_f(g_1,\dots,g_n)) = f^{-1} \left( \int_0^1 f(M_f(g_1,\dots,g_n)(t)) \, dt \right) \\ = f^{-1} \left( \int_0^1 \frac 1n \sum_{i=1}^n f(g_i(t)) \, dt \right) \\ = f^{-1} \left( \frac 1n \sum_{i=1}^n \int_0^1 f(g_i(t)) \, dt \right) \\ = f^{-1} \left( \frac 1n \sum_{i=1}^n f(A_f(g_i)) \right) \\ = M_f(A_f(g_1), \dots, A_f(g_n)). $$
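A numerical illustration of this identity (my own sketch; $f = \log$ here, and $g_1, g_2$ are arbitrary choices):

```python
from math import exp, log

def f_mean(f, finv, xs):
    """M_f(x_1,...,x_n) = f^{-1}( (1/n) sum f(x_i) )."""
    return finv(sum(f(x) for x in xs) / len(xs))

def f_average(f, finv, g, n=100_000):
    """A_f(g) over [0,1], with a midpoint sum approximating the integral."""
    return finv(sum(f(g((i + 0.5) / n)) for i in range(n)) / n)

f, finv = log, exp            # geometric mean / geometric average
g1 = lambda t: 1.0 + t
g2 = lambda t: 2.0 + t * t

# f-average of the f-mean vs. f-mean of the f-averages
lhs = f_average(f, finv, lambda t: f_mean(f, finv, [g1(t), g2(t)]))
rhs = f_mean(f, finv, [f_average(f, finv, g1), f_average(f, finv, g2)])
print(lhs, rhs)  # the two sides agree
```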

We can also obtain a generalized QM-AM-GM-HM inequality :

Theorem. Suppose $f,g$ are twice differentiable, $f' \neq 0 \neq g'$, $f'' \le g''$ and $h(x) = g'(x) / f'(x)$ is injective. Then for all $x_1, \dots, x_n \in ]0,\infty[$, we have $M_f(x_1,\dots,x_n) \le M_g(x_1,\dots,x_n)$.

The proof is similar to the one with QM-AM-GM-HM except that the facts used for $f$ and $g$ are quite clear now. Let us minimize $M_g(x_1,\dots,x_n)$ subject to $M_f(x_1,\dots,x_n) = C$. It follows from the method of Lagrange multipliers that there exists a $\lambda$ such that $$ \frac{g'(x_i)}{g'(M_g(x_1,\dots,x_n))} = \frac{\lambda f'(x_i)}{f'(M_f(x_1,\dots,x_n))} \quad \Longrightarrow \quad \frac{g'(x_i)}{f'(x_i)} = \frac{\lambda g'(M_g)}{f'(M_f)} = \frac{g'(x_j)}{f'(x_j)}, \quad i \neq j. $$ Since we assumed $h(x) = g'(x)/f'(x)$ to be injective, $x_1 = x_2 = \dots = x_n \overset{def}= x$, so it follows that $(x,\dots,x)$ is a minimum and $$ M_g(x_1,\dots,x_n) \ge M_g(x,\dots,x) = x = M_f(x,\dots,x) = C = M_f(x_1,\dots,x_n). $$ Note that letting $f \in \{x^2, x, \log x, 1/x\}$, this proves QM-AM-GM-HM.
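Taking $f \in \{1/x, \log x, x, x^2\}$ in $M_f$, the resulting chain is easy to verify numerically (my own sketch):

```python
from math import exp, log, sqrt

def f_mean(f, finv, xs):
    """M_f(x_1,...,x_n) = f^{-1}( (1/n) sum f(x_i) )."""
    return finv(sum(f(x) for x in xs) / len(xs))

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
hm = f_mean(lambda x: 1 / x, lambda y: 1 / y, xs)  # harmonic mean
gm = f_mean(log, exp, xs)                          # geometric mean
am = f_mean(lambda x: x, lambda y: y, xs)          # arithmetic mean
qm = f_mean(lambda x: x * x, sqrt, xs)             # quadratic mean
print(hm, gm, am, qm)  # increasing: HM <= GM <= AM <= QM
```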

What would really help for computations is finding the functions $f_{\alpha}$ corresponding to the $A_{\alpha}$-averages, if such functions exist. This would definitely be the way to go to compute these averages. If those functions happened to be nicely differentiable, we could also use this theorem to show that $A_{\alpha}(f)$ is a decreasing function of $\alpha$.

1

I'm adding another answer because it is significantly different from my previous one.

I agree that understanding the function you suggested is a first step towards understanding the full problem; if we can understand a simple piecewise-constant function, then we can approximate general functions by piecewise-constant ones (in a similar fashion to the construction of the Lebesgue integral) and perhaps use a continuity theorem to obtain a formula for the general $A_{\alpha}(f)$-average.

I am taking $f(x) = \mathbb I_{[0,1/2]} + m \mathbb I_{]1/2,1]}$ as you suggested. Let $S_n$ denote the group of all permutations of $\{1,\dots, n\}$. Since $n/2$ will appear at some point, and I want to preserve combinatorial intuition rather than write $\Gamma$-functions, I will suppose $n$ is even.

$$ a_k(Y(f)) = \frac 1{\binom nk^{\frac 1k}} \left( \sum_{1 \le i_1 < \dots < i_k \le n} \prod_{j=1}^k f(x_{i_j}) \right)^{1/k} \\ = \frac 1{\binom nk^{\frac 1k}} \left( \frac 1{k!(n-k)!} \sum_{\sigma \in S_n} \prod_{j=1}^k f(x_{\sigma(j)}) \right)^{1/k} \\ = \left( \frac 1{n!} \sum_{\sigma \in S_n} m^{\#\{ 1 \le j \le k \, | \, \sigma(j) > n/2 \}} \right)^{1/k}. $$ Let $\psi_{k,n}(\sigma) = \#\{ 1 \le j \le k \, | \, \sigma(j) > n/2 \}$ and $\phi_{k,n}(\ell) = \# \{ \sigma \in S_n \, | \, \psi_{k,n} (\sigma) = \ell \}$ for $0 \le \ell \le k$.

By considering the probability space $(S_n, \mathcal P(S_n), \mu)$ where $\mu(\sigma) = \frac 1{n!}$, the above sum can be seen as the $1/k$'th power of the moment generating function of $\psi_{k,n}(\sigma)$ evaluated at $t = \log m$. This is probably of interest but I don't see if it helps right now.

By regrouping the terms, we get $$ a_k(Y(f)) = \left( \sum_{\ell = \max \{0, k - n/2 \}}^{\min\{k,n/2\}} \frac{\phi_{k,n}(\ell)}{n!} m^{\ell} \right)^{1/k}. $$ Now we need to understand the asymptotic behavior of $\phi_{k,n}(\ell)$. Asking $\psi_{k,n}(\sigma) = \ell$ is essentially the same as sampling $n$ balls labelled from $1$ to $n$, putting them in an order (i.e. constructing $\sigma$) and asking that among the first $k$ balls sampled, precisely $\ell$ of them have an index $> n/2$. Therefore, if we consider $\psi_{k,n}(\sigma)$ as a random variable of the random permutation $\sigma$, $\psi_{k,n}$ follows a hypergeometric law : $$ \frac{\phi_{k,n}(\ell)}{n!} = \mathbb P(\psi_{k,n}(\sigma) = \ell) = \frac{\binom{n/2}{\ell} \binom{n/2}{k-\ell}}{\binom nk}, \quad \max\{0, k-n/2\} \le \ell \le \min \{ k, n/2 \}. $$ One must not see this probability as a "random" thing. Since we are considering all permutations $\sigma$, this probability is really the proportion of permutations that satisfy $\psi_{k,n}(\sigma) = \ell$, and thus it can be used to study asymptotic behavior. We obtain $$ a_k(Y(f)) = \left( \sum_{\ell = \max \{0, k - n/2 \}}^{\min\{k,n/2\}} \mathbb P(\psi_{k,n}(\sigma) = \ell) m^{\ell} \right)^{1/k} \\ = \left( \sum_{\ell = \max \{0, k - n/2 \}}^{\min\{k,n/2\}} \frac{\binom{n/2}{\ell} \binom{n/2}{k-\ell}}{\binom nk} m^{\ell} \right)^{1/k} \\ = \left(\sum_{\ell = \max \{0, k - n/2 \}}^{\min\{k,n/2\}} \frac{ \binom{n-k}{n/2 - \ell} \binom k{\ell} }{ \binom n{n/2} } m^{\ell} \right)^{1/k}. \\ $$ Using Stirling's formula, one computes that

$$ \binom{n}{n/2}^{1/k} \sim \left(\frac{2^{n+1}}{\sqrt{2\pi n}}\right)^{\frac 1{\alpha n}} \longrightarrow 2^{1/\alpha}, $$ so that $$ a_k(Y(f)) \sim 2^{-1/\alpha} \left(\sum_{\ell = \max \{0, k - n/2 \}}^{\min\{k,n/2\}} \binom{n-k}{n/2 - \ell} \binom k{\ell} m^{\ell} \right)^{1/k}. \\ $$ Understanding this last limit would be the key to the problem, but unfortunately I haven't made much progress there. What's cool now though is that we can approximate this limit and plot the graph. Here it is :

[Plot of $a_k(Y(f))$ against $k/n$ for $m = 500$, $n = 200$.]

In the plot, $m = 500$, $n=200$, and $1 \le k \le 200$; each point shows the approximation $a_k(Y(f))$ of $A_{\alpha}(f)$ at $\alpha = k/n$, computed with the last formula. The plot suggests that $A_{\alpha}(f)$, viewed as a function of $\alpha$, has an inflection point at $\alpha = 1/2$.
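The plotted values can be reproduced with a short computation (my own sketch, not the original code): the hypergeometric sum is done in exact integer arithmetic, and the $k$-th root is taken in log space since the terms overflow a double for $n = 200$, $m = 500$.

```python
from math import comb, exp, log

def a_k_step(n, k, m):
    """a_k(Y(f)) for the step function taking the value 1 on half the
    samples and m on the other half, via the hypergeometric-sum formula
    (n assumed even)."""
    h = n // 2
    lo, hi = max(0, k - h), min(k, h)
    s = sum(comb(h, l) * comb(h, k - l) * m ** l for l in range(lo, hi + 1))
    return exp((log(s) - log(comb(n, k))) / k)

n, m = 200, 500
for k in (1, 50, 100, 150, 200):
    print(k / n, a_k_step(n, k, m))  # k=1 gives (1+m)/2, k=n gives sqrt(m)
```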