
This is a practical problem arising in an application.

Consider $ A \sin(\phi + \phi_{0}) = \sum_{k=1}^{N} A_k \sin(\phi + \phi_k), \ \forall \phi \in \mathbb{R}, $ where $\phi$ is a deterministic variable on the real line, $ A_k \sim N (\mu_{A}, \sigma_{A}) $, and

the $\phi_k$, $k = 1, \dots, N$, are identically distributed (here I have further simplified the assumption).

Assume further that $ \mu_{A} > 0$ and $ \sigma_A < \mu_{A}$, and that the random variables $A_k$, $\phi_k$, $k = 1, \dots, N$, are jointly independent.

Roughly speaking, the summation represents a superposition of $N$ first harmonics whose amplitudes and arguments are random. My interest is in how the components influence the overall harmonic, in terms of its distribution.

Since it holds for every $\phi$, it can be simplified into: $ A e^{i \phi_0} = \sum_{k=1}^{N} A_k e^{i \phi_k} $

So, I would like to know how to derive the distributions of $ A $ and $\phi_0$ from the above equation. If the exact distributions cannot be found, an approximate solution would also be appreciated. For the approximation, one may assume $3\sigma_A < \mu_A$, so that the negative amplitude values can be truncated off and only the positive values considered.
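When closed-form distributions are out of reach, the phasor identity $A e^{i\phi_0} = \sum_k A_k e^{i\phi_k}$ can at least be explored numerically. Below is a minimal Monte Carlo sketch; the parameter values ($N = 10$, $\mu_A = 1$, $\sigma_A = 0.2$, and normally distributed phases with $\sigma_\phi = 0.3$) are hypothetical choices for illustration, not values from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not given in the question): amplitudes
# A_k ~ N(mu_A, sigma_A) with 3*sigma_A < mu_A, and phases phi_k
# drawn iid from a narrow normal as one possible common distribution.
N, mu_A, sigma_A, sigma_phi = 10, 1.0, 0.2, 0.3
trials = 100_000

A_k = rng.normal(mu_A, sigma_A, size=(trials, N))
phi_k = rng.normal(0.0, sigma_phi, size=(trials, N))

# Sum the phasors A_k * exp(i*phi_k); the modulus and argument of the
# resultant are the desired A and phi_0 for each trial.
z = np.sum(A_k * np.exp(1j * phi_k), axis=1)
A, phi0 = np.abs(z), np.angle(z)

print(A.mean(), A.std(), phi0.mean(), phi0.std())
```

Histograms of `A` and `phi0` then give an empirical picture of the density shapes, which is what the question is ultimately after.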

  • @DidierPiau: I'm sorry for the modification; I made that change because it would further simplify the problem. I believe the previous assumption is closer to reality, and it would be preferable to work with it. My interest lies more in the shape of the density function than in just the mean and variance. 2011-07-14

2 Answers

1

I don't think the phrase "sum of harmonics" is quite accurate. In fact, writing $\omega t$ for $\phi$, the given relation is $$A\sin(\omega t + \phi_0) = \sum_{k = 1}^N A_k\sin(\omega t + \phi_k),$$ which says that the sum of $N$ sinusoidal signals of the same radian frequency $\omega$ radians/second but different amplitudes and phases is a sinusoid at that same radian frequency: there are no harmonics involved. So $A$ and $\phi_0$ are random variables related to the given $A_k$'s and $\phi_k$'s through $$A\cos(\phi_0) = \sum_{k=1}^N A_k\cos(\phi_k), \qquad A\sin(\phi_0) = \sum_{k=1}^N A_k\sin(\phi_k).$$ In some applications that I have encountered, the $A_k$'s have Rayleigh distributions and the $\phi_k$'s are uniformly distributed on $[0, 2\pi)$, instead of merely having identical (not necessarily uniform) distributions. Then $A_k\cos(\phi_k)$ and $A_k\sin(\phi_k)$ are iid Gaussian random variables, their sums are independent Gaussian random variables, and so $A$ and $\phi_0$ are respectively Rayleigh and $U[0,2\pi)$ random variables.

But here the $A_k$'s are Gaussian random variables, so the simpler result described above does not apply. However, if $\mu_A \gg 3\sigma_A$, the Gaussian density of $A_k$ might be modeled as roughly resembling a Rayleigh density, and so something might still be usable.
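The Rayleigh/uniform special case described above is easy to verify numerically: with $A_k$ Rayleigh(scale $s$) and $\phi_k \sim U[0, 2\pi)$, the in-phase and quadrature components are iid $N(0, s^2)$, so the resultant $A$ should be Rayleigh with scale $s\sqrt{N}$ (mean $s\sqrt{N\pi/2}$) and $\phi_0$ uniform. A small simulation sketch, with $N = 20$ and $s = 1$ chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)
N, trials, s = 20, 200_000, 1.0

# A_k ~ Rayleigh(scale=s), phi_k ~ U[0, 2*pi): then A_k*cos(phi_k)
# and A_k*sin(phi_k) are iid N(0, s^2), the summed components are
# N(0, N*s^2), and A is Rayleigh with scale s*sqrt(N).
A_k = rng.rayleigh(s, size=(trials, N))
phi_k = rng.uniform(0.0, 2 * np.pi, size=(trials, N))

z = np.sum(A_k * np.exp(1j * phi_k), axis=1)
A, phi0 = np.abs(z), np.angle(z)

# Rayleigh(scale) has mean scale*sqrt(pi/2); phi0 should look uniform.
print(A.mean(), s * np.sqrt(N * np.pi / 2))  # the two should be close
print(phi0.mean())                           # should be near 0
```

The same script, with `rng.rayleigh` swapped for a truncated Gaussian amplitude draw, would let one test the answer's suggestion that a Gaussian with $\mu_A \gg 3\sigma_A$ behaves "roughly Rayleigh-like" in this construction.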

0

As a practical matter, we face this with the various waves on the right having different, but similar, frequencies: all the frequencies are within $5$–$10\%$ of each other and the $A_k$ are all equal. In that case the waves "walk past" each other randomly. We ignore the relative phases, treat the signals as random variables with zero mean and known standard deviation (the power of each signal), and take the variance of the sum to be the sum of the variances.

I think you can do the same here. The variance of one unit sinusoid is $\frac{1}{2}$. For the sum of two of the same frequency, $(\sin \phi_1+\sin \phi_2)^2=\sin ^2 \phi_1+\sin ^2 \phi_2 + 2\sin \phi_1 \sin \phi_2$, and averaging over the relative phase $\phi_2-\phi_1$ makes the cross term vanish, leaving $\frac{1}{2}+\frac{1}{2}=1$. So if the phases are random, I would just say the variances add.
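The variance-addition claim above is quick to check numerically. The sketch below averages the power of two equal-amplitude, same-frequency sinusoids over a uniformly random relative phase (the uniform-phase assumption is mine, as one way to make "random phases" concrete):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

t = rng.uniform(0.0, 2 * np.pi, n)      # sample points of the argument
phi = rng.uniform(0.0, 2 * np.pi, n)    # random relative phase

s1 = np.sin(t)            # variance (mean power) 1/2
s2 = np.sin(t + phi)      # same frequency, shifted by a random phase

# With a uniform relative phase the cross term 2*sin(t)*sin(t+phi)
# averages to zero, so the mean power of the sum is 1/2 + 1/2 = 1.
power = np.mean((s1 + s2) ** 2)
print(power)
```

The printed value sits near $1$, consistent with the variances simply adding when the relative phase is uniform.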

  • I was just stating that in our application the $A_k$ are all equal; your mileage may vary. I suspect you can blindly proceed to add the variances, but my calculation does not prove that. 2011-07-14