
Given:

$$f(x) = (1-x)^N$$

I can only scale $x$ (by $a$) and $f(x)$ (by $b$), and the exponent of $g$ must be fixed at $2$; in this case I get

$$g(x) = b \, (1 - x/a)^2$$

Here $N$ is known with $N > 0$, and $x \in [0, 1]$; $a$ and $b$ are unknowns with $a \in (0, 1]$ and $b \in (0, 1]$.

My conclusion is that I need to find a pair of $a$ and $b$ such that $S$ tends to $0$, where

$$S = \int_0^a \bigl(f(x) - g(x)\bigr)\,dx + \int_a^1 f(x)\,dx$$
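For concreteness, this objective can be evaluated numerically. Below is a quick sketch (the `simpson` helper and its step count are my own choices, not part of the question) that integrates $f - g$ on $[0, a]$ and $f$ on $[a, 1]$:

```python
def simpson(h, lo, hi, n=200):
    """Composite Simpson's rule for h on [lo, hi] (n forced even)."""
    if n % 2:
        n += 1
    w = (hi - lo) / n
    total = h(lo) + h(hi)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * h(lo + i * w)
    return total * w / 3

def S(a, b, N):
    """The objective: integral of f - g on [0, a] plus integral of f on [a, 1]."""
    f = lambda x: (1 - x) ** N
    g = lambda x: b * (1 - x / a) ** 2
    return simpson(lambda x: f(x) - g(x), 0, a) + simpson(f, a, 1)

print(S(1.0, 1.0, 2))  # near zero: g coincides with f when N = 2, a = b = 1
```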

Trying to solve this, I came to the following conclusion (I might be wrong about this):

$$S = \frac{1}{M}\left(1 - 2(1-a)^M\right) - \frac{a b}{3}, \qquad M = N + 1$$

This works when $N = 2$, $a = 1$, and $b = 1$: $S$ is zero in that case. I'm stuck for $N \ne 2$, where I can't find a way to choose $a$ and $b$ such that $S$ tends to zero. Is this even solvable (or am I taking the wrong approach)? If yes, how?
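One way to explore the $N \ne 2$ case numerically is a brute-force grid search over $(a, b) \in (0, 1]^2$ minimizing $|S|$, with $S$ evaluated by quadrature. This is only a sketch of that idea; the grid resolution and Simpson step count are arbitrary choices of mine:

```python
def simpson(h, lo, hi, n=60):
    """Composite Simpson's rule for h on [lo, hi] (n forced even)."""
    if n % 2:
        n += 1
    w = (hi - lo) / n
    total = h(lo) + h(hi)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * h(lo + i * w)
    return total * w / 3

def S(a, b, N):
    """Integral of f - g on [0, a] plus integral of f on [a, 1]."""
    f = lambda x: (1 - x) ** N
    g = lambda x: b * (1 - x / a) ** 2
    return simpson(lambda x: f(x) - g(x), 0, a) + simpson(f, a, 1)

def best_ab(N, steps=50):
    """Grid-search (0, 1]^2 for the (a, b) pair minimizing |S|."""
    grid = [(k + 1) / steps for k in range(steps)]
    return min(((a, b) for a in grid for b in grid),
               key=lambda ab: abs(S(ab[0], ab[1], N)))

a, b = best_ab(2)
print(a, b, S(a, b, 2))  # recovers a = b = 1 for N = 2
```

For $N = 2$ this recovers $a = b = 1$; for other $N$ it at least shows how small $|S|$ can be made, which may hint at whether an exact solution exists.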
