
$A^n = \underbrace{AA\cdots A}_{n\text{ factors}}$

Here I am simply taking the $n$-th power of a matrix $A$.
Is there a general formula for the $ij$-th component of this $n$-th matrix power?

$ (A^n)_{ij} = A_{i?}+...+A_{?j}?$

  • 2
Have you tried for $n=2$? This is a sum, not a product, of entries; at the end you will have $n-1$ sums. The best is probably to try first for $n=2,3,4,$ etc. (2017-02-22)
  • 0
@Surb Yes, I believe you have come up with what I need, but I am poor at indexing and expressing. Could you please elaborate for a dummy? (2017-02-22)
  • 0
Well, if you feel "poor at indexing and expressing", this is probably a very good starting point. (2017-02-23)
  • 1
@Surb Hmm, I have recently been playing too much at a much higher level of abstraction, and no one told me this the way you did. Though I don't have the time at the moment, I choose to follow your advice. (2017-02-23)

2 Answers


Suppose $A$ has size $m\times m$. Without further structure, the direct formula is:
$$ (A^n)_{i,j} = \sum_{l_1=1}^m\sum_{l_2=1}^m\cdots\sum_{l_{n-1}=1}^m A_{i,l_1}A_{l_1,l_2}A_{l_2,l_3}\cdots A_{l_{n-2},l_{n-1}}A_{l_{n-1},j} = \sum_{l_1,\ldots,l_{n-1}=1}^m A_{i,l_1}A_{l_1,l_2}\cdots A_{l_{n-2},l_{n-1}}A_{l_{n-1},j}.$$
This is easily proved by induction on $n\geq 1$. For $n=1$ it is clear. Suppose it holds for $n$; then
\begin{align*} (A^{n+1})_{i,j} &= \sum_{l_n=1}^m (A^n)_{i,l_n}A_{l_n,j}\\ &= \sum_{l_n=1}^m \bigg(\sum_{l_1,\ldots,l_{n-1}=1}^m A_{i,l_1}A_{l_1,l_2}\cdots A_{l_{n-2},l_{n-1}}A_{l_{n-1},l_n}\bigg)A_{l_n,j}\\ &= \sum_{l_1,\ldots,l_{n}=1}^m A_{i,l_1}A_{l_1,l_2}\cdots A_{l_{n-1},l_{n}}A_{l_{n},j}. \end{align*}
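
As a sanity check, the multi-sum can be evaluated directly and compared against repeated matrix multiplication. A NumPy sketch (the helper `power_entry` is illustrative, not a library function):

```python
import itertools
import numpy as np

def power_entry(A, n, i, j):
    """(A^n)_{i,j} via the multi-sum: sum over l_1,...,l_{n-1} of
    A[i,l_1] * A[l_1,l_2] * ... * A[l_{n-1},j]."""
    m = A.shape[0]
    total = 0.0
    for ls in itertools.product(range(m), repeat=n - 1):
        path = (i,) + ls + (j,)          # index chain i -> l_1 -> ... -> l_{n-1} -> j
        prod = 1.0
        for a, b in zip(path, path[1:]):
            prod *= A[a, b]              # multiply consecutive entries along the chain
        total += prod
    return total

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
for n in (1, 2, 3, 4):
    An = np.linalg.matrix_power(A, n)
    assert abs(power_entry(A, n, 0, 2) - An[0, 2]) < 1e-9
```

Note the cost: the brute-force sum has $m^{n-1}$ terms per entry, so it is only a verification device; repeated multiplication is how one actually computes $A^n$.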

  • 0
Thank you @Surb. I only accept results when I encounter new material if a brief verification can be offered or done by myself, which means I am learning only by cases... I don't know exactly what I am missing, and I always wonder where I can find good exercises for these fundamental manipulations. God, I am even working in matrix-vector notation for multivariate calculus without these basics. This is problematic. Even though I am always chased by time, I need to keep filling the gaps. Could you recommend such learning material? (2017-02-23)
  • 0
Material about multivariate calculus or about linear algebra? (2017-02-23)
  • 0
**[What I do]** Finding the gradient of a cost function (machine learning), where it is critical to convert **(1) sigma sums into matrix-vector notation** back and forth. **[What I feel]** I am mostly confused when I encounter **(2) complicated sigma sums.** **[Question]** Is there any way I could improve my skills at (1) and (2) with some material? Unfortunately I never took a dedicated linear algebra course, and entered the machine learning course directly this semester. I put a lot of effort into solving the problems given there and improved a little at (1) and (2). (2017-02-24)
  • 0
But I doubt I can improve at (1) and (2) from the problems given in machine learning, as opposed to my current strategy. Basically, **manipulations to establish some equality** are the most important. You could check my work from today applying your answer to the question [here](https://drive.google.com/open?id=0B22R7P2v6T5dYUdMRENjaXJkNTA). I only need some hands-on practice. Thank you. (2017-02-24)
  • 0
Maybe just **on-sight** practice will be the best choice for me. I was wondering whether you have some other point of view or advice. (2017-02-24)
  • 0
Well, the best is to practice, practice, and practice again. If you have an exam soon, practice for it by doing the exercises until you master them perfectly. You can also browse the questions here and try to solve the problems: just look in the tags of interest, select questions which suit you, and read the different solutions! For more rigorous material you can check Gilbert Strang's book on linear algebra, and for analysis a classic is Rudin. You might also want to check the book on convex optimization by Boyd and Vandenberghe. (2017-02-24)
  • 1
It seems it is inevitable to check *Convex Optimization* by Boyd and Vandenberghe whether I like it or not, because everyone mentions the references you gave. (I can take the course on this next semester, but I am worried about the heavy math in this field.) What I can say is that it was really helpful to talk with you. Thank you Surb :). (2017-02-24)

Consider the diagonalization $A=PVP^{-1}$, where $V$ is the diagonal matrix of eigenvalues of $A$ and the columns of $P$ are the corresponding eigenvectors. Then $A^n=PV^nP^{-1}$, and since $V^n$ just raises each eigenvalue to the $n$-th power, you can read off the $ij$-th component of $A^n$.
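
A minimal NumPy sketch of this, assuming $A$ is in fact diagonalizable (the symmetric example below guarantees it):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric, hence diagonalizable

vals, P = np.linalg.eig(A)        # A = P V P^{-1} with V = diag(vals)
n = 5
# A^n = P V^n P^{-1}: only the eigenvalues get raised to the n-th power
An = P @ np.diag(vals**n) @ np.linalg.inv(P)
assert np.allclose(An, np.linalg.matrix_power(A, n))
```

This gives each entry of $A^n$ as a fixed linear combination of $\lambda_k^n$, which is why diagonalization yields closed forms (e.g. for linear recurrences) when the eigenvalues are known.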

  • 1
Well, not every matrix is diagonalizable. (2017-02-22)
  • 0
In general, we'd need the Jordan decomposition. (2017-02-22)
  • 1
Even if the matrix is diagonalisable, there is in general no closed-form formula for the eigenvalues. (2017-02-23)