
In a sequence of previous questions I have investigated matrix representations of polynomials and of functions thereof, $$P(x),\quad \log(P(x)),\quad \exp(P(x)),\quad \sqrt[k]{P(x)},$$ using Toeplitz matrices and the standard Taylor expansions of these functions.

Now to my question. Suppose a function in an exponential family, $P_1(x)e^{P_2(x)}$, is represented in the way described above. What is the smallest matrix size (in other words, how many Taylor terms do I need to store) that still lets me go backwards, i.e. split $P_1$ and $P_2$ out of the matrix representing $P_1(x)e^{P_2(x)}$? And what would be a good approach for doing so?
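For concreteness, here is a minimal Python sketch of the representation I mean, assuming the usual convention from the earlier questions: a truncated power series becomes a lower-triangular Toeplitz matrix (the helper `series_matrix` and the coefficients are made up for illustration), and the representation of $P_1(x)e^{P_2(x)}$ is then a matrix product involving the matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

def series_matrix(coeffs, n):
    """n x n lower-triangular Toeplitz matrix of a truncated power
    series: entry (i, j) holds coefficient number i - j."""
    M = np.zeros((n, n))
    for k, c in enumerate(coeffs[:n]):
        M += c * np.eye(n, k=-k)
    return M

n = 8
P1 = [1.0, 2.0, 3.0]    # P1(x) = 1 + 2x + 3x^2   (made-up example)
P2 = [0.0, 0.5, -1.0]   # P2(x) = x/2 - x^2       (P2(0) = 0 for convenience)

# Matrix representation of P1(x) * exp(P2(x)); its first column holds
# the Taylor coefficients of the product.
M = series_matrix(P1, n) @ expm(series_matrix(P2, n))
print(M[:, 0])
```

The first column of the product matrix is the coefficient sequence of $P_1(x)e^{P_2(x)}$, and the whole matrix is again lower-triangular Toeplitz.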

1 Answer

We can consider $$\log\left[P_1(x)e^{P_2(x)}\right] = \log[P_1(x)] + \log\left[e^{P_2(x)}\right] = \underset{\text{spread over all diagonals}}{\underbrace{\log[P_1(x)]}} + \underset{\text{band-diagonal}}{\underbrace{P_2(x)}}$$

We can use the fact that the only contributions ending up in the corner of the matrix come from the first term: a polynomial of degree $d$ is confined to at most $d+1$ diagonals, whereas $\log[P_1(x)]$ in general has nonzero Taylor coefficients at every order.
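To illustrate this splitting (a Python sketch under the same assumptions as in the question; `series_matrix` is the hypothetical Toeplitz helper, and the example coefficients are made up with $P_1(0)=1$, $P_2(0)=0$ so that the matrices are unit lower triangular and the matrix logarithm is unambiguous): because all these matrices commute, the log of the product splits additively, and every entry below the band of $P_2$ agrees with $\log[P_1(x)]$ alone.

```python
import numpy as np
from scipy.linalg import expm, logm

def series_matrix(coeffs, n):
    """n x n lower-triangular Toeplitz matrix of a truncated power series."""
    M = np.zeros((n, n))
    for k, c in enumerate(coeffs[:n]):
        M += c * np.eye(n, k=-k)
    return M

n = 8
P1 = [1.0, 2.0, 3.0]    # P1(0) = 1 and P2(0) = 0 keep the matrices
P2 = [0.0, 0.5, -1.0]   # unit lower triangular, so logm is well defined

M = series_matrix(P1, n) @ expm(series_matrix(P2, n))
L = np.real(logm(M))    # equals log(P1) + P2 as matrices

# Entries strictly below the band of P2 (diagonal offset > deg P2)
# coincide with those of log(P1) alone.
corner = np.tril_indices(n, k=-len(P2))
L1 = np.real(logm(series_matrix(P1, n)))
print(np.allclose(L[corner], L1[corner]))
```

In other words, subtracting $\log[P_1]$ from the log of the product leaves exactly the band-diagonal matrix of $P_2$.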

Then we can proceed with, for example, a line search in which a gradient estimate supplies the search direction. The quantity to minimize is the sum of absolute values $|\log(R_1)-\log(P_1e^{P_2})|$ taken over the corner entries only, i.e. the elements identified above. Starting with the coefficients of $R_1$ drawn from some random distribution, this seems to work. It is definitely not the best method, just the first idea I found that works.

Please let me know if you find a more elegant approach.