
How can I show that $\log t$ (for $t>0$) is not a polynomial? I've tried rewriting the logarithm in its exponential form, but without success. Can you help me?

  • I think it is not even a rational function.
  • A polynomial is a polynomial for all $t$. So the very fact that $\log$ isn't defined for $t\le 0$ means it is not a polynomial.
  • I don't know, Kaz: $g(t)=\exp(\log(t))$ could be thought of as a polynomial function even though it is only defined when $t>0$.
  • You can differentiate $\log$ an arbitrary number of times and never get a function that is identically zero.
  • @tomcuchta: Yes, that is in [beauby's answer](http://math.stackexchange.com/a/245183).

9 Answers

24

If the logarithm were a polynomial, we could write $$\log(x) = a_n x^n +a_{n-1}x^{n-1}+\cdots +a_0,$$ for constants $a_0,a_1,\ldots,a_n$, where the formula holds for all $x>0$. Assuming this, we will reach a contradiction. Using $\log(x^2)=2\log(x)$, we have $$a_n x^{2n}+a_{n-1}x^{2n-2}+\cdots + a_0 = 2a_n x^n+2a_{n-1}x^{n-1}+\cdots+2a_0.$$ Since equal polynomial functions have equal coefficients, this implies that $\log(x)=0$ for all $x>0$. That isn't true, and thus the assumption that $\log$ is a polynomial leads to a contradiction.
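To spell out the coefficient comparison: if $n\ge 1$, the left-hand side contains the term $a_nx^{2n}$, while the highest power appearing on the right-hand side is $x^n$; since $2n>n$, comparing coefficients gives $a_n=0$. Repeating the argument with the new leading term yields $$a_n=a_{n-1}=\cdots=a_1=0,$$ and comparing constant terms gives $a_0=2a_0$, so $a_0=0$ as well. Hence every coefficient vanishes and $\log(x)=0$ for all $x>0$.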

  • I can't figure out how you went from my question to this answer.
  • Gustavo: I was inclined to use a familiar property of logarithms that is foreign to polynomials. Each answer uses such a property. Another: $\log(t)\to\infty$ as $t\to \infty$ but $\dfrac{\log(t)}{t}\to 0$ as $t\to\infty$. No polynomial (or rational function) has this property. (But the answer above doesn't require limits.)
  • Yep, I just can't understand how you set $\log(x)$ equal to a polynomial. Is it a hypothetical claim?
  • @Gustavo: I see, it was a lack of clarity you were referring to. Yes, it is a hypothetical assumption which leads to an absurdity, thereby showing that the assumption is false. This is called "[proof by contradiction](http://en.wikipedia.org/wiki/Proof_by_contradiction)". Only the first sentence hypothesizes what I want to show is impossible; the rest refers to what can be deduced from this hypothesis, including the absurd conclusion that $\log(t)=0$ for all $t>0$.
  • Is it valid for all $t$? The question refers only to $t>0$, but it seems you've proved it for all $t$. Also, I'm relatively acquainted with proof by contradiction; I just have no clue why this hypothetical claim is plausible.
  • @Gustavo: No, I didn't explicitly refer to $t$, but it is still assumed to be positive so that $\log$ is defined. The fact that coefficients of polynomial functions are unique still holds when restricting to positive real numbers. I'll try to make the answer clearer.
  • I've seen your edit; this is the part where it gets weird: you just say that $\log(t)$ equals a polynomial. For me it's strange, because they should have at least some similarity for this claim, and I can't see that similarity; it's as if $\log(t)$ were a variable.
  • Gustavo: $\log(t)$ is a function of $t$. (More precisely, it might be better to say that $\log$ is the function.) This is a standard proof by contradiction. A polynomial function on the positive real numbers has the form $p(t) = a_nt^n +\cdots +a_0$ for some constants $a_i$. We want to show: the function $\log(t)$ is not equal to any polynomial function on the positive reals. Proof by contradiction: suppose $\log(t)$ is equal to a polynomial function on the positive reals. (Explicitly, that means there is a polynomial function $p(t)=a_nt^n+\cdots+a_0$ such that $\log(t)=p(t)$ for all $t>0$.) Next, show that this leads to a contradiction, as the answer above does in sketch form. (Uniqueness of coefficients needs justification; another exercise in that book deals with this.)
  • Great! I got it! Both are functions; how couldn't I see this? Thanks a lot for your answer, comments, and time, and for helping me become less dumb.
8

Do you mean $t \mapsto \log t$? If this is the case, consider the derivatives and the fact that a polynomial's derivatives are eventually identically zero.
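Concretely, the derivatives of $\log$ can be computed once and for all: $$\frac{d^k}{dt^k}\log t=\frac{(-1)^{k-1}(k-1)!}{t^k}\qquad (k\ge 1,\ t>0),$$ which is never identically zero, whereas a polynomial $p$ of degree $n$ satisfies $p^{(k)}\equiv 0$ for every $k>n$.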

  • This is a good answer. Another formulation: a polynomial is eventually equal to its Taylor series. This is never true for $\log$, since its Taylor series is infinite.
  • tehecz: A technicality on your formulation: the logarithm *is* equal to its Taylor series in an interval about each point in its domain, because a Taylor series allows infinitely many terms. However, polynomials are characterized by being equal to a finite Taylor series.
8

You can also use the fact that $\lim_{t \to \infty} \frac{ \log t}{t} = 0 \,.$

For a polynomial of degree at least $2$, the corresponding limit is infinite, while for a polynomial of degree at most $1$, the limit is zero if and only if the polynomial is constant.
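In more detail, writing $p(t)=a_nt^n+\cdots+a_0$ with $a_n\ne 0$, $$\lim_{t\to\infty}\frac{p(t)}{t}=\begin{cases}\pm\infty & \text{if } n\ge 2,\\ a_1\ne 0 & \text{if } n=1,\\ 0 & \text{if } n=0,\end{cases}\qquad\text{while}\qquad \lim_{t\to\infty}\frac{\log t}{t}=0,$$ so a polynomial with this property would have to be constant, and $\log t$ is certainly not constant (it is unbounded).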

  • To be clear, the limit for polynomials is $\pm \infty$.
6

What do you mean? Even if the domain is restricted to $(0,\infty)$, a polynomial function will have a finite right-hand limit as $t\rightarrow0^+$, while the logarithmic function does not.

  • I don't know, I just copied the question from the [book](http://www.amazon.com/Polynomials-Problem-Books-Mathematics-Barbeau/dp/0387406271).
  • Why not also read the answer in the book?
  • Maybe the answer in the book cannot satisfy Gustavo's intellectual needs. At least I feel more content after reading Jonas Meyer's proof.
3

It does not match at $0$: $\lim_{x \to 0^+} \log x = -\infty$, but for any polynomial $\lim_{x \to 0^+} (a_nx^n + a_{n-1}x^{n-1} + \cdots + a_1x + a_0) = a_0$. Hence there is no $f(x) = a_nx^n + a_{n-1}x^{n-1} + \cdots + a_1x + a_0$ such that $\lim_{x \to 0^+} (\log x - f(x)) = 0$, so as we get closer to $0$, $\log x$ diverges from every polynomial.

The limit $\lim_{x \to 0^+} \log x = -\infty$ can be obtained from $\log a = b \Leftrightarrow a = e^b$ and $\lim_{y \to -\infty} e^y = 0$ (assuming the logarithm, as the inverse of exponentiation, is well defined).
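In symbols: substituting $x=e^y$ (so that $y=\log x$), the limit $x\to 0^+$ corresponds to $y\to-\infty$, and therefore $$\lim_{x\to 0^+}\log x=\lim_{y\to-\infty}y=-\infty.$$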

3

The series $\sum \frac{1}{n \log n}$ diverges and the series $\sum \frac{1}{n (\log n)^2}$ converges, and this isn't true with $\log n$ replaced by any polynomial (with finitely many terms removed if the polynomial has integer zeroes) or indeed with any rational function (and most of the other arguments given so far are not enough to prove this). Actually this argument probably shows that $\log n$ is not even algebraic.
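A quick way to check the two series facts is the integral test: $$\int_2^{X}\frac{dx}{x\log x}=\log\log X-\log\log 2\to\infty,\qquad \int_2^{X}\frac{dx}{x(\log x)^2}=\frac{1}{\log 2}-\frac{1}{\log X}\to\frac{1}{\log 2}$$ as $X\to\infty$, so the first series diverges and the second converges. By contrast, if $r$ is a rational function with $r(n)\sim cn^d$ as $n\to\infty$ (where $d$ is the degree of the numerator minus the degree of the denominator), then $\sum\frac{1}{n\,r(n)}$ and $\sum\frac{1}{n\,r(n)^2}$ behave like $\sum\frac{1}{n^{1+d}}$ and $\sum\frac{1}{n^{1+2d}}$, which both converge when $d\ge 1$ and both diverge when $d\le 0$; so no rational function can separate the two series the way $\log n$ does.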

2

The function $\log(t)$ can be expressed via its Taylor series about $t=1$ as the infinite summation $$ \log(t) = (t-1)-\frac{1}{2}(t-1)^2+\frac{1}{3}(t-1)^3-\frac{1}{4}(t-1)^4+\cdots, $$ which involves nonzero terms in $(t-1)^n$ for arbitrarily large $n$.
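To spell this out: the coefficient of $(t-1)^n$ in this expansion is $\frac{(-1)^{n-1}}{n}$, which is nonzero for every $n\ge 1$, while a polynomial of degree $d$ has all Taylor coefficients beyond degree $d$ equal to zero.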

  • That series is only valid when $0 < t \le 2$, but the point is still valid because, for example, the coefficient of $(t-1)^n$ is $\frac{1}{n!}\log^{(n)}(1)$, and therefore this shows that the derivatives of $\log$ aren't eventually $0$. (It is related to beauby's answer, but they can also be thought of independently, and one doesn't need to bring in derivatives to elaborate your answer.)
2

If $\log(t)$ were a polynomial, its derivative $\frac{1}{t}$ would also be one. But this is clearly impossible: by comparing degrees, the only polynomials whose reciprocals are again polynomials are the nonzero constants.
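To spell out the degree argument: if $\frac{1}{t}=q(t)$ for all $t>0$ with $q$ a polynomial, then $t\,q(t)-1$ is a polynomial that vanishes at every $t>0$, hence is the zero polynomial; but its constant term is $-1$, a contradiction. Equivalently, for $q\ne 0$ the degrees don't match: $$\deg\big(t\,q(t)\big)=\deg q+1\ge 1,\qquad \deg 1=0.$$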

1

The function $f:t\mapsto \log t$ has a vertical asymptote at $0$ (i.e. $\lim_{t\to0^+}f(t)=-\infty$), so it cannot be extended continuously to $t=0$. A polynomial can, since polynomials are continuous everywhere.