
Take any $C^\infty$ (smooth) function $f: \mathbb R \to \mathbb R$. For an arbitrary function $s:\mathbb R\to \mathbb R$, define $g :\mathbb R\to \mathbb R$ by $g(x)= (s\circ f)(x)$.

Conjecture: For any such $g$, if $g$ is smooth ($g\in C^\infty$), the following must necessarily hold:

$(i)$: Either $s$ is the identity function ($s(x) = x$ for all $x$), or

$(ii)$: There exists no open interval $U\subseteq \mathbb R$ in the domain of $f, g$ on which $f$ and $g$ agree pointwise, i.e.: $$\forall U \text{ open in } \mathbb R:\exists x\in U:f(x)\neq g(x)$$

In plain English: a smooth function cannot be transformed into another smooth function while leaving its values unchanged on any interval; only isolated points may remain unchanged.

Here is an incomplete argument for why it seems to me it must be true:

Assume we have an arbitrary smooth function $f$ and an arbitrary function $s$, and let $g=s\circ f$. Assume that $s$ is not the identity function (contradicting condition $(i)$), and that for some interval $(a,b)$, $f(x)=g(x)$ for all $x\in (a,b)$ (contradicting condition $(ii)$). Take $b$ here to be the largest such value, i.e. the supremum of all $b$ for which this holds (which exists by the Completeness Axiom on $\mathbb R$).

Now denote by $f_n, g_n$ the $n$th derivatives of $f, g$ respectively. Since by assumption $f$ is smooth at $b$, we know that

$$(1):\quad \lim_{\delta \to 0^-}\frac{f_{n-1}(b+\delta)-f_{n-1}(b)}{\delta}=:L_{f_n}^-=L_{f_n}^+:=\lim_{\delta \to 0^+}\frac{f_{n-1}(b+\delta)-f_{n-1}(b)}{\delta}$$ ($L$ will denote the limit with respect to the point $b$.)

$(2):$ Since $f$ and $g$ are identical on $(a,b)$, we also know that $L_{f_n}^-=L_{g_n}^-$, for all $n\in \mathbb N$.

$(3):$ Now assume (in order to derive a contradiction) that $g$ is smooth on $b$, so that $L_{g_n}^-=L_{g_n}^+$ for all $n \in \mathbb N$. Then using $(1,2)$ it also holds that $L_{f_n}^+=L_{g_n}^+$ for all $n \in \mathbb N$.

However, since $b$ is the largest value such that $f(x)=g(x)$ on $(a,b)$, either $f(b)\neq g(b)$ (in which case $g$ is discontinuous at $b$ and hence not smooth, completing the proof for that case), or for some $c>b$ it is the case that $f(x)\neq g(x)$ for all $x\in (b,c)$.

Now here comes a bit of a leap: given that $f(x)\neq g(x)$ for all $x\in (b,c)$, we also know that there is an interval $(b,\beta _1)$, where $\beta_1\leq c$, in which $f_1(x)\neq g_1(x)$ for all $x$. Similarly, given an interval $(b, \beta_i)$ in which $f_i(x)\neq g_i(x)$ for all $x$, there is an interval $(b, \beta_{i+1})$, where $\beta_{i+1}\leq \beta_i$, in which $f_{i+1}(x)\neq g_{i+1}(x)$ for all $x$.

Again a leap: hence we know that for any $n\in \mathbb N$, there is a $\beta > b$ such that $f_{n}(x)\neq g_{n}(x)$ for all $x\in (b, \beta)$. Hence there exists an $n\in \mathbb N$ such that $L_{g_n}^+ \neq L_{f_n}^+$. This contradicts $(3)$; therefore $g$ is not smooth.
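As a numerical illustration of where this last step can break down (a sketch, not part of the original argument): take the flat function $f(x)=e^{-1/x}$ for $x>0$ and $f(x)=0$ otherwise. It is nonzero on every interval $(0,c)$, yet $f(x)/x^n \to 0$ as $x\to 0^+$ for every $n$, which is what makes all its one-sided derivative limits at $0$ vanish. So having $f_n(x)\neq g_n(x)$ on an interval $(b,\beta)$ does not force $L_{f_n}^+ \neq L_{g_n}^+$.

```python
import math

def f(x):
    # Flat function: 0 for x <= 0, exp(-1/x) for x > 0.
    return 0.0 if x <= 0 else math.exp(-1.0 / x)

# f(x)/x^n -> 0 as x -> 0+ for every n, even though f != 0 on (0, c);
# this is why every one-sided derivative limit of f at 0 is 0.
for n in range(1, 6):
    vals = [f(x) / x**n for x in (0.1, 0.01, 0.001)]
    assert vals[0] > vals[1] > vals[2]  # ratios shrink toward 0
    print(n, vals)
```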

Discussion:

  • Is this conjecture correct?
  • Is the first part of the proof correct?
  • Is there a way to fill in the "leaps" at the end?
  • Are there better ways to prove it (or if the conjecture is false, to restate it into a correct one)?

PS: note that I have no formal maths training, and I came up with this conjecture myself based on intuition, so if this is a stupid conjecture or proof, please understand.

  • *Analytic* functions that agree on a nonempty open set must agree everywhere. But smooth functions needn't; this is precisely what bump functions (a very widely used tool in analysis) are for. (2017-02-08)
  • What is $s(x)$? Also, $f(U)=g(U)$ is not the same as the "i.e., ..." statement. (2017-02-08)
  • @zhw, I see now that I could have formulated it without the $s(x)$. (2017-02-09)
  • Similar to https://math.stackexchange.com/questions/1665302/is-a-smooth-function-characterized-by-its-value-on-any-non-empty-open-interval?rq=1 (2017-12-08)

2 Answers


It is not true. I will let someone else look at the details of your argument, but there is a counterexample that is natural enough: bump functions!

A bump function is a smooth function with compact support: it is $1$ on some compact set, transitions smoothly from $1$ to $0$ on a bounded set, and is $0$ off of that set. https://en.wikipedia.org/wiki/Bump_function

What this means is that we can take any smooth function and multiply it by a bump function. On the compact set it will be unchanged, but eventually the result will be $0$. So we have two smooth functions that agree on some interval, but not everywhere.

Edit: In this question here we find an example with a ray rather than compact support. It is a theorem of topology that we can cook up these functions in a large variety of ways.

  • What if we rule out trivial joining? I.e. we don't allow joining at $0$? (2018-03-24)
  • @samthebest What does joining mean? (2018-03-25)
  • I mean agree on some segment. Bump functions only agree at $0$; is it possible to make them agree on something non-trivial? (2018-03-25)
  • @samthebest Perhaps 'bump function' means something different to me than to you. I'm thinking of a function which is $1$ on a compact set, $0$ on the complement of a larger compact set, and decays from $1$ to $0$ smoothly in the middle. (2018-03-25)

Counterexample: Define

$$f(x) = \begin{cases} 0 & x\le 0\\e^{-1/x} & x>0\end{cases}$$

Then $f\in C^\infty(\mathbb R).$ With $t(x) = x^2,$ we have $f$ and $ t\circ f$ equal to $0$ on $(-\infty,0].$
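A quick numeric sanity check of this counterexample (Python, standard library only):

```python
import math

def f(x):
    # The counterexample: 0 for x <= 0, exp(-1/x) for x > 0.
    return 0.0 if x <= 0 else math.exp(-1.0 / x)

def t(x):
    return x * x

# f and t∘f agree on (-infty, 0], where both are identically 0 ...
for x in (-2.0, -0.5, 0.0):
    assert t(f(x)) == f(x) == 0.0

# ... but differ at every x > 0, since 0 < f(x) < 1 implies f(x)^2 < f(x),
# so t is not the identity and yet t∘f is smooth and agrees with f on an interval.
for x in (0.5, 1.0, 3.0):
    assert t(f(x)) < f(x)
```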

  • What if we rule out trivial joining? I.e. we don't allow joining at $0$? (2018-03-24)